# Geometry Formulary

## Inverse of a Matrix

$A^{-1} = \frac{1}{\det(A)} Adj(A)$

## Adjugate Matrix

The adjugate of a matrix $A$ is the `transpose` of the `cofactor matrix`:\
$Adj(A) = C^{T}$

### $(i,j)$-minor (AKA $M_{ij}$)

$M_{ij}$ := determinant of the matrix $B$ obtained by removing the ***$i$-th row*** and the ***$j$-th column*** from matrix $A$

### Cofactors

$C$ is the matrix of `cofactors` of a matrix $A$, where each element $c_{ij}$ is defined as:\
$c_{ij} = \left(-1\right)^{i + j} M_{ij}$

## Eigenvalues

Starting from the definition of `eigenvectors`:\
$A\vec{v} = \lambda\vec{v}$

As we can see, the vector $\vec{v}$ is unaffected by this matrix multiplication apart from a scaling factor $\lambda$, called the `eigenvalue`.

By rewriting this formula we get:\
$\left(A - \lambda I\right)\vec{v} = 0$

This is solved for:\
$\det(A - \lambda I) = 0$

> [!NOTE]
> If the determinant is 0, then $(A - \lambda I)$ is not invertible, so the
> equation admits solutions other than the trivial one $\vec{v} = 0$ (which
> can't be taken into account anyway, since $\vec{v}$ is not $0$ by definition)

## Cayley-Hamilton

Every square matrix over a `commutative ring` satisfies its own characteristic equation $\det(\lambda I - A) = 0$

> [!TIP]
> In other words, once the characteristic equation has been found, we can
> substitute the ***unknown*** variable $\lambda$ with the matrix itself
> (***known***), raised to the corresponding power
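
As a quick numerical check of the minor/cofactor/adjugate definitions above, here is a minimal sketch in Python. It assumes NumPy is available; the helper names `minor`, `cofactor_matrix`, and `adjugate` are illustrative, not from any library, and the matrix is chosen only for the example.

```python
import numpy as np

def minor(A, i, j):
    """M_ij: determinant of A with row i and column j removed (0-based indices)."""
    B = np.delete(np.delete(A, i, axis=0), j, axis=1)
    return np.linalg.det(B)

def cofactor_matrix(A):
    """C with entries c_ij = (-1)^(i+j) * M_ij."""
    n = A.shape[0]
    return np.array([[(-1) ** (i + j) * minor(A, i, j) for j in range(n)]
                     for i in range(n)])

def adjugate(A):
    """Adj(A) = C^T, the transpose of the cofactor matrix."""
    return cofactor_matrix(A).T

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])

# A^{-1} = Adj(A) / det(A); compare against NumPy's built-in inverse.
A_inv = adjugate(A) / np.linalg.det(A)
print(np.allclose(A_inv, np.linalg.inv(A)))  # True
```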
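
The eigenvalue condition $\det(A - \lambda I) = 0$ can be checked the same way: build the characteristic polynomial, find its roots, and compare them with a standard eigenvalue routine. A minimal sketch, again assuming NumPy, with a $2 \times 2$ matrix picked only for illustration:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# highest degree first. For this A: lambda^2 - 4*lambda + 3 -> [1, -4, 3].
coeffs = np.poly(A)

# Its roots are the eigenvalues...
lam = np.roots(coeffs)

# ...and they match NumPy's eigenvalue routine (up to ordering).
print(np.sort(lam), np.sort(np.linalg.eigvals(A)))  # both [1. 3.]
```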
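
Finally, the Cayley-Hamilton statement can be verified numerically by substituting $A$ for $\lambda$ in its own characteristic polynomial and checking that the result is the zero matrix. A minimal sketch under the same NumPy assumption:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Coefficients of det(lambda*I - A): for this A, lambda^2 - 5*lambda - 2.
c = np.poly(A)

# p(A) = A^2 - 5*A - 2*I, with the constant term multiplied by the identity.
I = np.eye(2)
p_of_A = c[0] * (A @ A) + c[1] * A + c[2] * I

print(np.allclose(p_of_A, np.zeros((2, 2))))  # True
```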