OpenQuant
Section 4 of 6
Linear Algebra: Matrix Algebra

Matrix Basics

Fundamental Knowledge

Let $A$ and $B$ be square $n \times n$ matrices, and let $x, y$ be vectors in $\mathbb{R}^n$ with angle $\theta$ between them. Then all of the following hold:

$$\cos(\theta) = \frac{x^\intercal y}{\|x\| \|y\|} \quad (AB)^\intercal = B^\intercal A^\intercal \quad (AB)^{-1} = B^{-1} A^{-1} \quad A^{-1}A = AA^{-1} = I \quad \text{rank}(A) + \text{null}(A) = n$$

$$Av = \lambda v \implies (A - \lambda I)v = 0 \implies \det(A - \lambda I) = 0 \quad \det(A) = \frac{1}{\det(A^{-1})} \quad \det(A) = \det(A^\intercal)$$

$$\det(AB) = \det(A)\det(B) \quad \det(cA) = c^n\det(A) \quad \det(A) = \prod_{i=1}^n \lambda_i \quad \text{trace}(A) = \sum_{i=1}^n A_{ii} = \sum_{i=1}^n \lambda_i$$
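These identities are easy to sanity-check numerically. A minimal sketch with NumPy, using arbitrary random $3 \times 3$ matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))

# (AB)^T = B^T A^T
assert np.allclose((A @ B).T, B.T @ A.T)

# (AB)^{-1} = B^{-1} A^{-1}
assert np.allclose(np.linalg.inv(A @ B), np.linalg.inv(B) @ np.linalg.inv(A))

# det(AB) = det(A) det(B)  and  det(cA) = c^n det(A)  with n = 3
assert np.isclose(np.linalg.det(A @ B), np.linalg.det(A) * np.linalg.det(B))
assert np.isclose(np.linalg.det(2 * A), 2**3 * np.linalg.det(A))

# det(A) = product of eigenvalues;  trace(A) = sum of eigenvalues
eigvals = np.linalg.eigvals(A)
assert np.isclose(np.linalg.det(A), eigvals.prod().real)
assert np.isclose(np.trace(A), eigvals.sum().real)
```

The `.real` calls discard the negligible imaginary parts that arise because a real matrix can have complex-conjugate eigenvalue pairs; their product and sum are real.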

Nonsingular Matrices

A nonsingular matrix is invertible. $A$ ($n \times n$) is nonsingular if and only if any (and therefore all) of the following hold:

  1. Columns of $A$ span $\mathbb{R}^n$, or equivalently, $\text{rank}(A) = \dim(\text{range}(A)) = n$
  2. $A^\intercal$ is nonsingular
  3. $\det(A) \neq 0$
  4. $Ax = 0$ has only the trivial solution $x = 0$; $\dim(\text{nul}(A)) = 0$

Note that if A=[abcd]A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, then A1=1det(A)[dbca]A^{-1} = \frac{1}{\det(A)} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}. Larger inverses may be found via Gauss-Jordan Elimination: [AI]elementary row operations[IA1][A \mid I] \xrightarrow{\text{elementary row operations}} [I \mid A^{-1}]

2D Rotation Matrices

2D rotation matrices by $\theta$ radians counter-clockwise about the origin are matrices of the form $R_\theta = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$.
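As a quick illustration, rotating the unit vector $(1, 0)$ by $90°$ counter-clockwise should give $(0, 1)$:

```python
import numpy as np

def rotation_2d(theta: float) -> np.ndarray:
    """Counter-clockwise rotation by theta radians about the origin."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

R = rotation_2d(np.pi / 2)
assert np.allclose(R @ np.array([1.0, 0.0]), [0.0, 1.0])

# Composing rotations adds angles: R_a R_b = R_{a+b}
a, b = 0.3, 1.1
assert np.allclose(rotation_2d(a) @ rotation_2d(b), rotation_2d(a + b))
```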

Orthogonal Matrices

Orthogonal matrices (the real case of unitary matrices) are square with orthonormal row and column vectors. They are nonsingular and satisfy $Q^\intercal = Q^{-1}$, which implies $\det(Q) = \pm 1$. Orthogonal matrices with $\det(Q) = 1$ are rotations; those with $\det(Q) = -1$ involve a reflection.
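A convenient way to produce an orthogonal matrix numerically is the QR decomposition of a random matrix; the properties above can then be checked directly:

```python
import numpy as np

rng = np.random.default_rng(1)
# The Q factor of a QR decomposition is orthogonal
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))

assert np.allclose(Q.T @ Q, np.eye(4))          # orthonormal columns
assert np.allclose(Q.T, np.linalg.inv(Q))        # Q^T = Q^{-1}
assert np.isclose(abs(np.linalg.det(Q)), 1.0)    # det(Q) = +/- 1
```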

Idempotent Matrices

Idempotent matrices are square matrices satisfying $A^2 = A$; applying the linear transformation $A$ twice has the same effect as applying it once. Projection matrices are idempotent.
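A standard example is the orthogonal projection onto the column space of a matrix $X$, $P = X(X^\intercal X)^{-1}X^\intercal$ (the "hat matrix" of least squares). A minimal sketch verifying idempotence:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((5, 2))  # arbitrary full-column-rank matrix

# Projection onto the column space of X
P = X @ np.linalg.inv(X.T @ X) @ X.T

assert np.allclose(P @ P, P)   # idempotent: projecting twice changes nothing
assert np.allclose(P.T, P)     # orthogonal projections are also symmetric
```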

Positive Semi-definite Matrices

Covariance and correlation matrices are always positive semi-definite, and positive definite if there is no perfect linear dependence among the random variables. Each of the following conditions is necessary and sufficient for $A$ to be positive semi-definite/definite:

| Positive Semi-Definite | Positive Definite |
| --- | --- |
| $z^\intercal A z \ge 0$ for all column vectors $z$ | $z^\intercal A z > 0$ for all nonzero column vectors $z$ |
| All eigenvalues are nonnegative | All eigenvalues are positive |
| All principal minors are nonnegative | All leading principal minors are positive |

Note that if $A$ has a negative diagonal element, then $A$ cannot be positive semi-definite.
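The eigenvalue and quadratic-form criteria are easy to check on a sample covariance matrix:

```python
import numpy as np

rng = np.random.default_rng(3)
# Sample covariance of 3 random variables over 1000 observations
data = rng.standard_normal((1000, 3))
cov = np.cov(data, rowvar=False)

# All eigenvalues of a covariance matrix are nonnegative
# (eigvalsh is the symmetric-matrix eigenvalue routine)
eigvals = np.linalg.eigvalsh(cov)
assert np.all(eigvals >= -1e-10)   # tolerance for floating-point error

# z^T A z >= 0 for an arbitrary vector z
z = rng.standard_normal(3)
assert z @ cov @ z >= -1e-10
```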
