Math: The gradient of the determinant
Theorem: ∇(det(A)) = det(A) A⁻ᵀ = adj(A)ᵀ = the cofactor matrix of A, or in differential form, d(det(A)) = tr(adj(A) dA).
Note: the adjugate is defined for invertible A as adj(A) = det(A) A⁻¹; it satisfies A · adj(A) = det(A) I.
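The code cells that produced the two method definitions below are missing from this export; a minimal reconstruction consistent with the "formulas at a glance" (an assumption — it requires invertible A):

```julia
using LinearAlgebra

adj(A) = det(A) * inv(A)         # adjugate: adj(A) = det(A) A⁻¹
cofactor(A) = det(A) * inv(A)'   # cofactor matrix = adj(A)ᵀ = det(A) A⁻ᵀ
```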
adj (generic function with 1 method)
cofactor (generic function with 1 method)
Formulas at a glance:
adj(A) = det(A) A⁻¹
cofactor(A) = det(A) A⁻ᵀ
The adjugate is the transpose of the cofactor matrix. You may remember that the cofactor matrix has as (i,j) entry the cofactor Cᵢⱼ = (−1)ⁱ⁺ʲ Mᵢⱼ, where the minor Mᵢⱼ is the determinant of A with row i and column j deleted.
Numerical Demonstration
5×5 Matrix{Float64}:
6.23738e-6 9.80317e-6 9.49748e-6 2.16775e-6 7.13224e-6
4.29949e-7 1.04401e-6 9.43453e-6 3.49108e-6 3.30058e-6
2.43275e-6 6.93099e-6 9.77647e-6 4.73139e-6 1.52485e-6
4.08123e-6 5.79363e-6 5.63512e-6 9.05209e-6 5.3378e-6
3.20542e-7 9.13855e-6 1.20064e-6 2.504e-6 3.50599e-6
-1.88723e-6
-1.88722e-6
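The matrix above looks like a small random perturbation dA, and the two nearly equal scalars are consistent with comparing the finite difference det(A + dA) − det(A) against the linearization tr(adj(A) dA). A sketch of such a check (the variable names and the 1e-5 scale are assumptions, and fresh random data means the exact values will differ):

```julia
using LinearAlgebra

A  = rand(5, 5)
dA = rand(5, 5) * 1e-5          # small perturbation (scale guessed from the output above)

adjA = det(A) * inv(A)          # adjugate; A is invertible with probability 1

fd  = det(A + dA) - det(A)      # finite difference
lin = tr(adjA * dA)             # first-order prediction from the theorem

isapprox(fd, lin; rtol=1e-3)    # agree up to O(‖dA‖²)
```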
Forward Mode Autodiff (ForwardDiff.jl)
5×5 Matrix{Float64}:
0.227566 -0.231878 0.00127616 0.0537115 -0.113963
-0.441303 0.453903 0.109316 -0.188827 0.132947
0.0485198 -0.0217404 -0.0335765 0.068198 -0.0835962
-0.00220219 -0.0833639 0.0166832 0.0266121 -0.0155698
0.124369 -0.124389 -0.105723 0.00347312 0.0503901
5×5 Matrix{Float64}:
0.227566 -0.231878 0.00127616 0.0537115 -0.113963
-0.441303 0.453903 0.109316 -0.188827 0.132947
0.0485198 -0.0217404 -0.0335765 0.068198 -0.0835962
-0.00220219 -0.0833639 0.0166832 0.0266121 -0.0155698
0.124369 -0.124389 -0.105723 0.00347312 0.0503901
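The two matrices above agree: forward-mode AD applied to det reproduces det(A) A⁻ᵀ entry for entry. A sketch of such a comparison (fresh random A, so the entries will differ from those shown):

```julia
using LinearAlgebra, ForwardDiff

A = rand(5, 5)

g = ForwardDiff.gradient(det, A)   # ∇det(A), returned as a 5×5 matrix
g ≈ det(A) * inv(A)'               # Jacobi's formula
```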
Reverse Mode Autodiff
Zygote does reverse-mode AD.
5×5 Matrix{Float64}:
  0.227566    -0.231878   0.00127616  0.0537115   -0.113963
 -0.441303     0.453903   0.109316   -0.188827     0.132947
  0.0485198   -0.0217404 -0.0335765   0.068198    -0.0835962
 -0.00220219  -0.0833639  0.0166832   0.0266121   -0.0155698
  0.124369    -0.124389  -0.105723    0.00347312   0.0503901
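The output again matches det(A) A⁻ᵀ. A sketch of the reverse-mode computation (fresh random A, so the entries will differ):

```julia
using LinearAlgebra, Zygote

A = rand(5, 5)

g, = Zygote.gradient(det, A)   # reverse-mode gradient of the scalar det(A)
g ≈ det(A) * inv(A)'           # Jacobi's formula again
```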
Symbolic Demonstration
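The symbolic cell itself is missing from this export; a minimal sketch of what such a demonstration could look like, using Symbolics.jl (an assumption — the original may have used a different package), for the 2×2 case:

```julia
using Symbolics

@variables a b c d
A = [a b; c d]
detA = a*d - b*c                              # det of the 2×2 written out

# differentiate the scalar with respect to every entry
G = [Symbolics.derivative(detA, x) for x in A]

# G is the cofactor matrix [d -c; -b a], i.e. det(A) A⁻ᵀ
isequal(G, [d -c; -b a])
```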
Direct Proof
A direct proof, where you just differentiate the scalar det(A) with respect to every input, can be obtained as a simple consequence of the cofactor expansion, a.k.a. the Laplace expansion, of the determinant along the i-th row:

det(A) = Σⱼ aᵢⱼ Cᵢⱼ
Recall that the determinant is a linear (affine) function of any element with slope equal to the cofactor.
We then readily obtain

∂det(A)/∂aᵢⱼ = Cᵢⱼ,  i.e.  ∇det(A) = C = adj(A)ᵀ = det(A) A⁻ᵀ.
Example: for A = [a b; c d], det(A) = ad − bc, so the matrix of partial derivatives is [d −c; −b a], which is exactly the cofactor matrix det(A) A⁻ᵀ.
Fancy Proof
Figure out linearization near the identity I
det(I + dA) ≈ 1 + tr(dA) (think of the n! terms in the determinant and drop the higher-order ones)
Then for invertible A, factor it out: det(A + dA) = det(A) det(I + A⁻¹ dA) ≈ det(A)(1 + tr(A⁻¹ dA)), so d(det(A)) = det(A) tr(A⁻¹ dA) = tr(adj(A) dA).
Application to derivative of the characteristic polynomial
Direct derivative (freshman calculus): write p(x) = det(xI − A) = ∏ᵢ (x − λᵢ) and apply the product rule:

p′(x) = Σᵢ ∏ⱼ≠ᵢ (x − λⱼ)
Perfectly good simple proof, but if you want to show off...
With our new technology: d(det(xI − A)) = tr(adj(xI − A) d(xI − A)), and d(xI − A)/dx = I, so

p′(x) = tr(adj(xI − A)).
Note:
Check:
-0.649343
-0.649343
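The two equal numbers above are consistent with evaluating p′(x) both ways. A sketch of such a check (a symmetric A to keep the eigenvalues real, and an arbitrary evaluation point x — both are assumptions):

```julia
using LinearAlgebra

A = Symmetric(rand(5, 5))       # symmetric ⇒ real eigenvalues
x = 0.3                         # arbitrary evaluation point

λ = eigvals(A)
freshman = sum(prod(x - λ[j] for j in 1:5 if j != i) for i in 1:5)

B = x * I - A
fancy = tr(det(B) * inv(B))     # tr(adj(xI − A))

isapprox(freshman, fancy; rtol=1e-6)
```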
Application: d(log(det(A)))
The logarithmic derivative shows up a lot in applied mathematics. Since d(det(A)) = det(A) tr(A⁻¹ dA), we get

d(log(det(A))) = tr(A⁻¹ dA).

For example, it is the key term in Newton's method for a root of the characteristic polynomial: p′(x)/p(x) = d(log(det(xI − A)))/dx = tr((xI − A)⁻¹).
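A quick numerical illustration of d(log(det(A))) = tr(A⁻¹ dA) (a sketch with assumed random data; the shift by 5I keeps det(A) positive so the log is real):

```julia
using LinearAlgebra

A  = rand(5, 5) + 5I            # diagonally dominant ⇒ det(A) > 0
dA = rand(5, 5) * 1e-6

fd  = log(det(A + dA)) - log(det(A))
lin = tr(A \ dA)                # tr(A⁻¹ dA) without forming the inverse
isapprox(fd, lin; rtol=1e-3)
```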
Math: The Jacobian of the Inverse
Differentiating A A⁻¹ = I gives dA · A⁻¹ + A · d(A⁻¹) = 0, hence d(A⁻¹) = −A⁻¹ dA A⁻¹.
true
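The `true` above is consistent with a finite-difference check of d(A⁻¹) = −A⁻¹ dA A⁻¹; a sketch under assumed random data:

```julia
using LinearAlgebra

A  = rand(5, 5) + 5I            # shift keeps A well-conditioned
dA = rand(5, 5) * 1e-8

fd  = inv(A + dA) - inv(A)      # finite difference of the inverse
lin = -inv(A) * dA * inv(A)     # first-order prediction
isapprox(fd, lin; rtol=1e-4)
```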