using Symbolics, LinearAlgebra, PlutoUI, ForwardDiff, Zygote
TableOfContents(title="Differentiating the Determinant and the Inverse", indent=true, depth=4, aside=true)
md"""
# Math: The gradient of the determinant
"""
md"""
Theorem:
``\nabla(\det A) = \text{cofactor}(A) = (\det A)A^{-T} = \text{adj}(A^T)`` $(br)
``d(\det A) = \text{tr}( \det(A)A^{-1}dA) = \text{tr(adj}(A) dA) = \text{tr(cofactor}(A)^T dA)`` $(br)
Note: the adjugate is defined as ``(\det A)A^{-1}``; it is the transpose of the cofactor matrix.
"""
adj (generic function with 1 method)
adj(A) = det(A) * inv(A) # adjugate function
cofactor (generic function with 1 method)
cofactor(A) = adj(A)' # cofactor function: the transpose of the adjugate
md"""
**Formulas at a glance:** $(br)
``A^{-1}`` = adj(``A``)/det(``A``) = cofactor(``A``)``^T``/det(``A``) $(br)
adj(``A``) = det(``A``)``A^{-1}`` = cofactor(``A``)``^T`` $(br)
cofactor(``A``) = det(``A``)``A^{-T}`` = adj(``A``)``^{T}``
"""
md"""
The adjugate is the transpose of the cofactor matrix.
You may remember that the cofactor matrix has as its ``(i,j)`` entry ``(-1)^{i+j}`` times the determinant of the submatrix obtained by deleting row ``i`` and column ``j``.
"""
@variables a b c d
M = [a c; b d] # a symbolic 2×2 matrix
simplify.(cofactor(M))
simplify.(adj( M))
simplify.(inv(M))
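md"""
We can also verify the theorem itself symbolically: the gradient of ``\det M`` with respect to the entries of ``M`` should reproduce `cofactor(M)`. (This assumes the installed Symbolics version provides `Symbolics.gradient`.)
"""
simplify.(reshape(Symbolics.gradient(det(M), vec(M)), 2, 2))   # should match simplify.(cofactor(M)) above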
md"""
## Numerical Demonstration
"""
5×5 Matrix{Float64}:
6.23738e-6 9.80317e-6 9.49748e-6 2.16775e-6 7.13224e-6
4.29949e-7 1.04401e-6 9.43453e-6 3.49108e-6 3.30058e-6
2.43275e-6 6.93099e-6 9.77647e-6 4.73139e-6 1.52485e-6
4.08123e-6 5.79363e-6 5.63512e-6 9.05209e-6 5.3378e-6
3.20542e-7 9.13855e-6 1.20064e-6 2.504e-6 3.50599e-6
A = rand(5, 5); dA = rand(5, 5) * 0.00001   # a random 5×5 matrix and a small perturbation
-1.88723e-6
-1.88722e-6
det(A + dA) - det(A), tr(adj(A) * dA)   # first-order change in det vs. tr(adj(A) dA): nearly equal
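md"""
The same agreement can be checked entry by entry with one-sided finite differences, which should approximate `cofactor(A)` (here `ε` is just a small illustrative step size):
"""
let ε = 1e-6
    fd = zeros(5, 5)
    for i in 1:5, j in 1:5
        E = zeros(5, 5)
        E[i, j] = ε                            # perturb only the (i,j) entry
        fd[i, j] = (det(A + E) - det(A)) / ε
    end
    isapprox(fd, cofactor(A); rtol = 1e-4)     # should be true, up to O(ε) finite-difference error
end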
md"""
## Forward-Mode Autodiff (ForwardDiff)
"""
5×5 Matrix{Float64}:
0.227566 -0.231878 0.00127616 0.0537115 -0.113963
-0.441303 0.453903 0.109316 -0.188827 0.132947
0.0485198 -0.0217404 -0.0335765 0.068198 -0.0835962
-0.00220219 -0.0833639 0.0166832 0.0266121 -0.0155698
0.124369 -0.124389 -0.105723 0.00347312 0.0503901
ForwardDiff.gradient(det, A)   # ∇(det) evaluated at A via forward-mode AD
5×5 Matrix{Float64}:
0.227566 -0.231878 0.00127616 0.0537115 -0.113963
-0.441303 0.453903 0.109316 -0.188827 0.132947
0.0485198 -0.0217404 -0.0335765 0.068198 -0.0835962
-0.00220219 -0.0833639 0.0166832 0.0266121 -0.0155698
0.124369 -0.124389 -0.105723 0.00347312 0.0503901
adj(A')   # = cofactor(A); matches the ForwardDiff gradient above
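md"""
Up to floating-point roundoff the two matrices above are identical; the one-line comparison below makes that explicit:
"""
ForwardDiff.gradient(det, A) ≈ cofactor(A)   # should be true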
md"""
## Reverse-Mode Autodiff (Zygote)
"""
md"""
Zygote does reverse-mode automatic differentiation.
"""