# SciMLJacobianOperators.jl

SciMLJacobianOperators provides a convenient way to compute Jacobian-Vector Products (JVPs)
and Vector-Jacobian Products (VJPs) using
[SciMLOperators.jl](https://github.com/SciML/SciMLOperators.jl) and
[DifferentiationInterface.jl](https://github.com/gdalle/DifferentiationInterface.jl).

Currently we have interfaces for:

  - `NonlinearProblem`
  - `NonlinearLeastSquaresProblem`

All autodiff backends supported by DifferentiationInterface.jl can be used.

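As a refresher on what these operators compute: the JVP is the directional derivative `J(u) * v`, which forward-mode AD can evaluate without ever materializing the Jacobian. A minimal sketch of that idea using plain ForwardDiff (the function `f` here is just an illustrative example, the same one used below):

```julia
using ForwardDiff

f(u, p) = u .* u .- p  # example function; its Jacobian is Diagonal(2 .* u)
u0 = ones(4)
p = 2.0
v = ones(4) .* 2

# JVP as a directional derivative: d/dt f(u0 + t * v, p) at t = 0 equals J(u0) * v
jvp = ForwardDiff.derivative(t -> f(u0 .+ t .* v, p), 0.0)
# 4-element Vector{Float64}: [4.0, 4.0, 4.0, 4.0]
```

`JacobianOperator` packages this pattern (and the reverse-mode analogue for VJPs) behind the SciMLOperators interface.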
## Example

```julia
using SciMLJacobianOperators, NonlinearSolve, Enzyme, ForwardDiff

# Define the problem
f(u, p) = u .* u .- p
u0 = ones(4)
p = 2.0
prob = NonlinearProblem(f, u0, p)
fu0 = f(u0, p)
v = ones(4) .* 2

# Construct the operator
jac_op = JacobianOperator(
    prob, fu0, u0;
    jvp_autodiff = AutoForwardDiff(),
    vjp_autodiff = AutoEnzyme(; mode = Enzyme.Reverse)
)
sjac_op = StatefulJacobianOperator(jac_op, u0, p)

sjac_op * v # Computes the JVP
# 4-element Vector{Float64}:
#  4.0
#  4.0
#  4.0
#  4.0

sjac_op' * v # Computes the VJP
# 4-element Vector{Float64}:
#  4.0
#  4.0
#  4.0
#  4.0

# What if we multiply the VJP and JVP?
snormal_form = sjac_op' * sjac_op

snormal_form * v # Computes JᵀJ * v
# 4-element Vector{Float64}:
#  8.0
#  8.0
#  8.0
#  8.0
```
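
Since `f(u, p) = u .* u .- p` has the diagonal Jacobian `J(u) = Diagonal(2 .* u)`, the outputs above can be checked analytically. A quick sanity check using only the standard library:

```julia
using LinearAlgebra

u0 = ones(4)
v = ones(4) .* 2

# For f(u, p) = u .* u .- p, the Jacobian is Diagonal(2 .* u); at u0 = ones(4), J = 2I
J = Diagonal(2 .* u0)

J * v        # JVP:  matches sjac_op * v        => [4.0, 4.0, 4.0, 4.0]
J' * v       # VJP:  matches sjac_op' * v       => [4.0, 4.0, 4.0, 4.0]
(J' * J) * v # JᵀJv: matches snormal_form * v   => [8.0, 8.0, 8.0, 8.0]
```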