From e0f156c695332dd72d4876b2ab0089598bad74bd Mon Sep 17 00:00:00 2001
From: Guillaume Dalle <22795598+gdalle@users.noreply.github.com>
Date: Tue, 25 Jun 2024 12:08:06 +0200
Subject: [PATCH] Cleanup before release (#149)

---
 CITATION.bib             | 2 +-
 README.md                | 4 ++--
 examples/0_intro.jl      | 6 +++---
 examples/1_basic.jl      | 2 +-
 examples/2_advanced.jl   | 3 +--
 examples/3_tricks.jl     | 2 +-
 src/implicit_function.jl | 8 ++++----
 7 files changed, 13 insertions(+), 14 deletions(-)

diff --git a/CITATION.bib b/CITATION.bib
index f578e61..c37fad3 100644
--- a/CITATION.bib
+++ b/CITATION.bib
@@ -4,7 +4,7 @@ @misc{ImplicitDifferentiation.jl
   url = {https://github.com/gdalle/ImplicitDifferentiation.jl},
   version = {v0.6.0},
   year = {2024},
-  month = {4}
+  month = {6}
 }
 
 @phdthesis{dalle:tel-04053322,
diff --git a/README.md b/README.md
index dc14675..2e77ca2 100644
--- a/README.md
+++ b/README.md
@@ -34,13 +34,13 @@ If you want a deeper dive into the theory, you can refer to the paper [_Efficien
 To install the stable version, open a Julia REPL and run:
 
 ```julia
-julia> using Pkg; Pkg.add("ImplicitDifferentiation")
+using Pkg; Pkg.add("ImplicitDifferentiation")
 ```
 
 For the latest version, run this instead:
 
 ```julia
-julia> using Pkg; Pkg.add(url="https://github.com/JuliaDecisionFocusedLearning/ImplicitDifferentiation.jl")
+using Pkg; Pkg.add(url="https://github.com/JuliaDecisionFocusedLearning/ImplicitDifferentiation.jl")
 ```
 
 Please read the [documentation](https://JuliaDecisionFocusedLearning.github.io/ImplicitDifferentiation.jl/stable/), especially the examples and FAQ.
diff --git a/examples/0_intro.jl b/examples/0_intro.jl
index 051f381..cbcacbc 100644
--- a/examples/0_intro.jl
+++ b/examples/0_intro.jl
@@ -81,9 +81,9 @@ We represent it using a type called [`ImplicitFunction`](@ref), which you will s
 =#
 
 #=
-First we define a forward mapping corresponding to the function we consider.
+First we define a `forward` mapping corresponding to the function we consider.
 It returns the actual output $y(x)$ of the function, and can be thought of as a black box solver.
-Importantly, this Julia callable _doesn't need to be differentiable by automatic differentiation packages but the underlying function still needs to be mathematically differentiable_.
+Importantly, this Julia callable doesn't need to be differentiable by automatic differentiation packages but the underlying function still needs to be mathematically differentiable.
 =#
 
 forward(x) = badsqrt(x);
@@ -91,7 +91,7 @@
 #=
 Then we define `conditions` $c(x, y) = 0$ that the output $y(x)$ is supposed to satisfy.
 These conditions must be array-valued, with the same size as $y$.
-Unlike the forward mapping, _the conditions need to be differentiable by automatic differentiation packages_ with respect to both $x$ and $y$.
+Unlike the forward mapping, the conditions need to be differentiable by automatic differentiation packages with respect to both $x$ and $y$.
 Here the conditions are very obvious: the square of the square root should be equal to the original value.
 =#
 
diff --git a/examples/1_basic.jl b/examples/1_basic.jl
index 10c62af..17e8c3e 100644
--- a/examples/1_basic.jl
+++ b/examples/1_basic.jl
@@ -86,7 +86,7 @@ ForwardDiff.jacobian(_x -> implicit_optim(_x; method=LBFGS()), x)
 @test ForwardDiff.jacobian(_x -> implicit_optim(_x; method=LBFGS()), x) ≈ J #src
 
 #=
-In this instance, we could use ForwardDiff.jl directly on the solver, but it returns the wrong result (not sure why).
+In this instance, we could use ForwardDiff.jl directly on the solver:
 =#
 
 ForwardDiff.jacobian(_x -> forward_optim(_x; method=LBFGS()), x)
diff --git a/examples/2_advanced.jl b/examples/2_advanced.jl
index 2309c27..abd0653 100644
--- a/examples/2_advanced.jl
+++ b/examples/2_advanced.jl
@@ -1,8 +1,7 @@
 # # Advanced use cases
 
 #=
-We dive into more advanced applications of implicit differentiation:
-- constrained optimization problems
+We dive into more advanced applications of implicit differentiation.
 =#
 
 using ForwardDiff
diff --git a/examples/3_tricks.jl b/examples/3_tricks.jl
index cc1c524..572b13b 100644
--- a/examples/3_tricks.jl
+++ b/examples/3_tricks.jl
@@ -28,7 +28,7 @@ function conditions_components_aux(a, b, m, d, e)
     return c_d, c_e
 end;
 
-# You can use `ComponentVector` as an intermediate storage.
+# You can use `ComponentVector` from [ComponentArrays.jl](https://github.com/jonniedie/ComponentArrays.jl) as an intermediate storage.
 
 function forward_components(x::ComponentVector)
     d, e = forward_components_aux(x.a, x.b, x.m)
diff --git a/src/implicit_function.jl b/src/implicit_function.jl
index 263bd61..51393cc 100644
--- a/src/implicit_function.jl
+++ b/src/implicit_function.jl
@@ -57,8 +57,8 @@ The value of `lazy` must be chosen together with the `linear_solver`, see below.
 - `forward`: a callable computing `y(x)`, does not need to be compatible with automatic differentiation
 - `conditions`: a callable computing `c(x, y)`, must be compatible with automatic differentiation
 - `linear_solver`: a callable to solve the linear system
-- `conditions_x_backend`: defines how the conditions will be differentiated with respect to the first argument `x`
-- `conditions_y_backend`: defines how the conditions will be differentiated with respect to the second argument `y`
+- `conditions_x_backend`: how the conditions will be differentiated w.r.t. the first argument `x`
+- `conditions_y_backend`: how the conditions will be differentiated w.r.t. the second argument `y`
 
 # Function signatures
 
@@ -79,7 +79,7 @@ The byproduct `z` and the other positional arguments `args...` beyond `x` are co
 
 The provided `linear_solver` objects needs to be callable, with two methods:
 - `(A, b::AbstractVector) -> s::AbstractVector` such that `A * s = b`
-- `(A, B::AbstractVector) -> S::AbstractMatrix` such that `A * S = B`
+- `(A, B::AbstractMatrix) -> S::AbstractMatrix` such that `A * S = B`
 
 It can be either a direct solver (like `\\`) or an iterative one (like [`KrylovLinearSolver`](@ref)).
 Typically, direct solvers work best with dense Jacobians (`lazy = false`) while iterative solvers work best with operators (`lazy = true`).
@@ -105,7 +105,7 @@ end
         forward,
         conditions;
         linear_solver=lazy ? KrylovLinearSolver() : \\,
         conditions_x_backend=nothing,
-        conditions_x_backend=nothing,
+        conditions_y_backend=nothing,
     )
 
 Constructor for an [`ImplicitFunction`](@ref) which picks the `linear_solver` automatically based on the `lazy` parameter.
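
As context for the changes above, here is a minimal sketch of the workflow that the touched docstrings and `examples/0_intro.jl` describe, with plain `sqrt` standing in for the example's `badsqrt` helper so the snippet is self-contained:

```julia
using ForwardDiff
using ImplicitDifferentiation

# Forward mapping y(x): a black-box solver, which does not need to be
# differentiable by automatic differentiation packages.
forward(x) = sqrt.(x)

# Conditions c(x, y) = 0, array-valued with the same size as y: the square
# of the square root should recover the original value. Unlike `forward`,
# the conditions must be differentiable by automatic differentiation packages.
conditions(x, y) = y .^ 2 .- x

# Wrap both callables; derivatives of `implicit` come from the implicit
# function theorem applied to the conditions, not from differentiating `forward`.
implicit = ImplicitFunction(forward, conditions)

x = rand(3)
implicit(x) ≈ sqrt.(x)             # the wrapper reproduces the solver output
ForwardDiff.jacobian(implicit, x)  # Jacobian of y(x), obtained implicitly
```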