MATH1030: Linear algebra I

Linear algebra notes.

Use the sidebar to move chapter by chapter, or jump directly into a section below.

9 chapters
Each section stays readable on the page and exports as a static study copy when you need an offline version.

Course contents

37 sections

Chapter 1

Systems of equations

Learn to read a linear system in terms of its full solution set.

1.1 Equations and solution sets

Read a linear system as a collection of conditions and describe its full solution set carefully.

1.2 Reading theorems and proof language

Learn how to read definitions, theorem statements, equivalences, uniqueness claims, and counterexamples before the course becomes proof-heavy.

Chapter 2

Matrices and elimination

Build matrix intuition and use row reduction with purpose.

2.1 Matrix basics

Build matrix intuition before you row-reduce: size, entries, rows, columns, and arithmetic meaning.

2.2 Augmented matrices and row operations

Translate a system into an augmented matrix and understand what each row operation preserves.

2.3 Gaussian elimination and RREF

See Gaussian elimination as a sequence of purposeful moves, not just memorized mechanics.
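As a preview of the mechanics this section teaches, here is a minimal pure-Python sketch of row reduction to RREF; the function name `rref` and the worked system are illustrative examples, not course code.

```python
from fractions import Fraction

def rref(matrix):
    """Row-reduce a matrix to reduced row-echelon form (RREF).

    Uses exact Fraction arithmetic so small examples stay clean.
    """
    rows = [[Fraction(x) for x in row] for row in matrix]
    m, n = len(rows), len(rows[0])
    pivot_row = 0
    for col in range(n):
        # Find a row at or below pivot_row with a nonzero entry in this column.
        pr = next((r for r in range(pivot_row, m) if rows[r][col] != 0), None)
        if pr is None:
            continue
        rows[pivot_row], rows[pr] = rows[pr], rows[pivot_row]       # swap rows
        pivot = rows[pivot_row][col]
        rows[pivot_row] = [x / pivot for x in rows[pivot_row]]      # scale pivot to 1
        for r in range(m):
            if r != pivot_row and rows[r][col] != 0:                # clear the column
                factor = rows[r][col]
                rows[r] = [a - factor * b
                           for a, b in zip(rows[r], rows[pivot_row])]
        pivot_row += 1
        if pivot_row == m:
            break
    return rows

# Augmented matrix for x + 2y = 5, 3x + 4y = 6: unique solution x = -4, y = 9/2.
print(rref([[1, 2, 5], [3, 4, 6]]))
```

Each pass performs exactly the three purposeful moves the section names: swap a row into pivot position, scale the pivot to 1, and clear the rest of the column.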

2.4 Solution-set types

Classify whether a system has one solution, infinitely many solutions, or no solution by reading its reduced form.

2.5 Existence of row-echelon forms

Read the optional induction proof that every matrix is row-equivalent to an REF and then to an RREF.

Chapter 3

Matrix algebra

Matrix multiplication, transpose, and structural matrix notation.

3.1 Matrix multiplication and identity matrices

Learn when matrix products are defined, how the row-by-column rule works, and why the identity matrix matters for solving linear systems.
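The row-by-column rule and the role of the identity can be sketched in a few lines of Python; `matmul` and the sample matrices below are illustrative, not part of the course materials.

```python
def matmul(A, B):
    """Multiply matrices via the row-by-column rule.

    Entry (i, j) of the product is the dot product of row i of A with
    column j of B; defined only when A has as many columns as B has rows.
    """
    assert len(A[0]) == len(B), "inner dimensions must match"
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]

I2 = [[1, 0], [0, 1]]      # the 2x2 identity matrix
A = [[1, 2], [3, 4]]
print(matmul(A, I2))       # multiplying by the identity leaves A unchanged
```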

3.2 Transpose and special matrices

Use transpose, symmetry, commuting products, and block notation to read matrix structure rather than treating formulas as isolated tricks.

3.3 Row-operation matrices

Represent elementary row operations as left multiplication by elementary matrices, then use reverse row operations to understand invertibility.

3.1 Matrix addition, subtraction, and scalar multiplication

Learn which matrix operations are entrywise, why matching sizes matter, and how the zero matrix behaves like the additive identity.

3.2 Matrix multiplication and linear systems

Understand matrix multiplication as a row-by-column rule and as a compact way to encode several linear combinations or systems at once.

3.3 Transposes, symmetric matrices, and skew-symmetric matrices

Transpose swaps rows and columns, while symmetry records what stays unchanged across the main diagonal.

3.4 Special matrices

Meet diagonal, triangular, identity, and elementary matrices, and see why their special shapes make later arguments shorter.

3.5 Block matrices

Partition a large matrix into smaller pieces so addition and multiplication can be carried out block by block without losing meaning.

Chapter 4

Solution structure

Homogeneous systems, null spaces, and the shape of full solution sets.

4.1 Homogeneous systems and null space

Study homogeneous systems carefully, then use null spaces to describe every solution as a structured set rather than a loose list of examples.

4.2 Set language and solution sets

Use set notation, membership, solution sets, null spaces, spans, and set equality carefully in linear algebra arguments.

Chapter 5

Invertibility

Understand when a matrix can be undone and why that matters.

5.1 Invertible matrices

Connect inverse matrices, row reduction, and the practical meaning of nonsingularity.

5.2 RREF uniqueness and well-defined rank

Prove uniqueness of reduced row-echelon form and use it to make rank independent of the chosen row-reduction path.

Chapter 6

Vector spaces

Move from matrix procedures to the structure of spaces, span, independence, and basis.

6.1 Vector spaces

Start from familiar examples and learn what the vector-space axioms are trying to protect.

6.2 Subspaces

Use the subspace test to separate genuine linear structure from lookalikes that fail closure or miss the zero vector.

6.3 Linear combinations and span

Treat linear combinations as controlled building instructions, then see span as every vector you can build that way.

6.4 Linear dependence and independence

Read dependence as redundancy, and independence as the point where every coefficient truly matters.

6.5 Basis and dimension

See why a basis is the smallest complete coordinate system for a space, and why dimension counts how many directions are really needed.

6.6 Column space, row space, and rank

Use row reduction and basis ideas together to read column space, row space, and rank without confusing what row operations actually preserve.

6.7 Matrix subspaces, basis, and dimension

Extend span, basis, and dimension from column-vector subspaces to matrix subspaces such as all matrices, upper triangular matrices, and skew-symmetric matrices.

6.8 Basis extension and change of basis

Use basis existence, the replacement theorem, and change-of-basis matrices to compare coordinate systems rigorously.

Chapter 7

Determinants

Determinants, cofactor formulas, and the structural algebra that connects row operations, transpose, and invertibility.

7.1 Determinants and cofactor expansion

Define determinants carefully through minors and cofactors, then learn how cofactor expansion turns one scalar into a precise summary of square-matrix structure.
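Cofactor expansion along the first row translates directly into a short recursive function; this pure-Python sketch (function name `det` and the sample matrices are illustrative) shows the minor/cofactor structure the section defines.

```python
def det(A):
    """Determinant of a square matrix by cofactor expansion along row 0."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        cofactor = (-1) ** j * det(minor)   # sign pattern (-1)^(0 + j)
        total += A[0][j] * cofactor
    return total

print(det([[1, 2], [3, 4]]))                      # ad - bc = -2
print(det([[2, 0, 0], [0, 3, 0], [0, 0, 4]]))     # diagonal: product of entries = 24
```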

7.2 Row operations, products, and invertibility

Track exactly how row operations change determinants, then connect that behavior to multiplicativity, inverse matrices, and invertibility tests.

7.3 Transpose, column operations, and Cramer's rule

Use transpose and column operations to read determinants from a second angle, then finish the chapter with adjoints, inverse formulas, and Cramer's rule.

Chapter 8

Eigenvalues and diagonalization

Eigenvalues, eigenspaces, similarity, and diagonalization as the next structural layer after determinants.

8.1 Eigenvalues, eigenvectors, and eigenspaces

Define eigenvalues through the equation Av=λv, then recast the same idea as a null-space and determinant question so the structure becomes computable.
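For a 2x2 matrix the determinant recasting becomes fully computable: det(A - λI) = λ² - trace(A)·λ + det(A). A small Python sketch (the function `eigenvalues_2x2` and the example matrix are illustrative, and complex eigenvalues are deliberately left out):

```python
import math

def eigenvalues_2x2(A):
    """Real eigenvalues of a 2x2 matrix from det(A - lambda*I) = 0.

    The characteristic polynomial lambda^2 - trace(A)*lambda + det(A)
    is solved with the quadratic formula.
    """
    (a, b), (c, d) = A
    trace, det = a + d, a * d - b * c
    disc = trace ** 2 - 4 * det
    if disc < 0:
        return []          # complex eigenvalues; not handled in this sketch
    root = math.sqrt(disc)
    return sorted([(trace - root) / 2, (trace + root) / 2])

A = [[2, 1], [1, 2]]
print(eigenvalues_2x2(A))                # [1.0, 3.0]
# Check Av = lambda*v for v = (1, 1) at lambda = 3:
print([2 * 1 + 1 * 1, 1 * 1 + 2 * 1])   # [3, 3], i.e. 3 * v
```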

8.2 Diagonalization and similarity

Treat diagonalization as a basis change built from eigenvectors, then use similarity to explain when a matrix can be simplified without changing its essential eigenvalue data.

8.3 Characteristic polynomials and diagonalization tests

Use characteristic polynomials, algebraic and geometric multiplicity, and the distinct-eigenvalue test to decide when eigenvalue data is enough for diagonalization.

Chapter 9

Inner products and orthogonality

Inner products, orthogonality, orthonormal bases, and Gram-Schmidt as the geometric layer after eigenvalues.

9.1 Inner products, norms, and angles

Define the standard inner product and norm on R^m, then connect those formulas to length, angle, and the first structural inequalities.
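The three formulas this section builds on fit in a few lines; this pure-Python sketch (helper names `dot`, `norm`, and `angle` are illustrative) connects the inner product to length and angle exactly as described.

```python
import math

def dot(u, v):
    """Standard inner product on R^m."""
    return sum(x * y for x, y in zip(u, v))

def norm(u):
    """Length induced by the inner product: ||u|| = sqrt(<u, u>)."""
    return math.sqrt(dot(u, u))

def angle(u, v):
    """Angle between u and v via cos(theta) = <u, v> / (||u|| ||v||)."""
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = (1, 0), (1, 1)
print(dot(u, v), norm(v), math.degrees(angle(u, v)))   # 1, sqrt(2), 45 degrees
```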

9.2 Orthogonal sets and orthonormal bases

Use orthogonality to build orthogonal and orthonormal bases, then read coefficients without solving a linear system every time.

9.3 Gram-Schmidt orthogonalization

Apply Gram-Schmidt to turn a basis into an orthogonal or orthonormal basis while preserving the same span.
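The subtract-the-projections loop at the heart of Gram-Schmidt can be sketched directly; `gram_schmidt` and the two-vector basis below are illustrative examples (orthogonalization only, with normalization left as the final optional step the section discusses).

```python
def gram_schmidt(vectors):
    """Classical Gram-Schmidt: basis -> orthogonal basis with the same span.

    Each vector has its projections onto the previously produced
    orthogonal vectors subtracted off; normalize afterward if an
    orthonormal basis is wanted.
    """
    ortho = []
    for v in vectors:
        w = list(v)
        for u in ortho:
            # Projection coefficient <v, u> / <u, u>.
            coeff = sum(a * b for a, b in zip(v, u)) / sum(a * a for a in u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        ortho.append(w)
    return ortho

basis = [(1, 1, 0), (1, 0, 1)]
print(gram_schmidt(basis))   # second vector is now orthogonal to the first
```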

9.4 Cauchy-Schwarz and triangle inequalities

Study Cauchy-Schwarz and triangle inequalities as the two core estimates that control length, angle, and equality cases in inner-product spaces.