Update the documentation
Change Documenter version and add push_preview
abelsiqueira committed Mar 14, 2021
1 parent d622258 commit b377aed
Showing 9 changed files with 56 additions and 553 deletions.
8 changes: 6 additions & 2 deletions .github/workflows/TagBot.yml
@@ -1,11 +1,15 @@
name: TagBot
on:
  schedule:
    - cron: 0 * * * *
  issue_comment:
    types:
      - created
  workflow_dispatch:
jobs:
  TagBot:
    if: github.event_name == 'workflow_dispatch' || github.actor == 'JuliaTagBot'
    runs-on: ubuntu-latest
    steps:
      - uses: JuliaRegistries/TagBot@v1
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          ssh: ${{ secrets.DOCUMENTER_KEY }}
2 changes: 1 addition & 1 deletion docs/Project.toml
@@ -2,4 +2,4 @@
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"

[compat]
Documenter = "~0.25"
Documenter = "~0.26"
6 changes: 4 additions & 2 deletions docs/make.jl
@@ -11,10 +11,12 @@ makedocs(
"Models" => "models.md",
"Guidelines" => "guidelines.md",
"Tools" => "tools.md",
"Tutorial" => "tutorial.md",
"API" => "api.md",
"Reference" => "reference.md"
]
)

deploydocs(repo = "github.com/JuliaSmoothOptimizers/NLPModels.jl.git")
deploydocs(
  repo = "github.com/JuliaSmoothOptimizers/NLPModels.jl.git",
  push_preview = true
)
9 changes: 0 additions & 9 deletions docs/src/api.md
@@ -126,15 +126,6 @@ hess_op_residual
hess_op_residual!
```

## Derivative Checker

```@docs
gradient_check
jacobian_check
hessian_check
hessian_check_from_grad
```

## Internal

```@docs
42 changes: 2 additions & 40 deletions docs/src/guidelines.md
@@ -132,43 +132,5 @@ Furthermore, the `show` method has to be updated with the correct direction of `

## [Advanced tests](@id advanced-tests)

To test your model, in addition to writing specific test functions, it is also advised to write consistency checks.
If your model can implement general problems, you can use the 6 problems in our `test/problems` folder, implemented both as `ADNLPModel`s and as explicitly defined models.
These can be used to verify that the implementation of your model is correct through the `consistent_nlps` function.
The simplest way to use these would be something like
```julia
for problem in ["BROWNDEN", "HS5", "HS6", "HS10", "HS11", "HS14"]
  @printf("Checking problem %-20s", problem)
  nlp_ad = eval(Meta.parse(lowercase(problem) * "_autodiff"))() # e.g. hs5_autodiff()
  nlp_man = eval(Meta.parse(problem))() # e.g. HS5()
  nlp_your = ...
  nlps = [nlp_ad, nlp_man, nlp_your]
  consistent_nlps(nlps)
end
```

Models with specific purposes can make use of the consistency checks by defining equivalent problems with `ADNLPModel` and testing them.
For instance, the following model is a regularization model defined by an existing model `inner`, a regularization parameter `ρ`, and a fixed point `z`:
```julia
mutable struct RegNLP <: AbstractNLPModel
  meta :: NLPModelMeta
  inner :: AbstractNLPModel
  ρ
  z
end
```
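
As an illustration, a minimal sketch of two of these unconstrained functions for `RegNLP`, assuming `inner` already follows the NLPModels API (the remaining functions follow the same pattern):
```julia
using LinearAlgebra, NLPModels

# Objective of the regularized model: f(x) + ρ/2 ‖x - z‖²
NLPModels.obj(nlp :: RegNLP, x :: AbstractVector) = obj(nlp.inner, x) + nlp.ρ * norm(x - nlp.z)^2 / 2

# Gradient, in place: ∇f(x) + ρ (x - z)
function NLPModels.grad!(nlp :: RegNLP, x :: AbstractVector, g :: AbstractVector)
  grad!(nlp.inner, x, g)
  g .+= nlp.ρ .* (x .- nlp.z)
  return g
end
```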
Assuming that all unconstrained functions are defined, the following tests will make sure that `RegNLP` is consistent with a specific `ADNLPModel`.
```julia
include(joinpath(dirname(pathof(NLPModels)), "..", "test", "consistency.jl"))

f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2; 1.0])
ρ = rand()
z = rand(2)
rnlp = RegNLP(nlp, ρ, z)
manual = ADNLPModel(x -> f(x) + ρ * norm(x - z)^2 / 2, [-1.2; 1.0])

consistent_nlps([rnlp, manual])
```
The complete example is available in the repository [RegularizationModel.jl](https://github.com/JuliaSmoothOptimizers/RegularizationModel.jl).

We have created the package [NLPModelsTest.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsTest.jl) which defines test functions and problems.
To make sure that your model is robust, we recommend using that package.
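For instance, a sketch of a consistency test with NLPModelsTest.jl, assuming it exposes reference problems such as `HS5` and a `consistent_nlps` function analogous to the one in the old `test/consistency.jl` file (`MyHS5` is a hypothetical implementation of your own):
```julia
using NLPModels, NLPModelsTest

nlp_ref = NLPModelsTest.HS5()  # reference implementation shipped with NLPModelsTest
nlp_yours = MyHS5()            # hypothetical: your own implementation of the same problem
consistent_nlps([nlp_ref, nlp_yours])
```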
47 changes: 5 additions & 42 deletions docs/src/index.md
@@ -66,51 +66,14 @@ Install NLPModels.jl with the following command.
```julia
pkg> add NLPModels
```
This will enable a simple model and a model with automatic differentiation using
`ForwardDiff`. For models using JuMP see
[NLPModelsJuMP.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl).

This will enable the use of the API and the tools described here, and it allows the creation of a manually written model.
Look into [Models](@ref) for more information on that subject and for a list of packages implementing ready-to-use models.
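
As a rough sketch of such a manually written model, assuming only the `NLPModelMeta`, `Counters`, and `increment!` utilities from NLPModels (`MyQuadratic` is an illustrative name, not an existing type):
```julia
using NLPModels

mutable struct MyQuadratic <: AbstractNLPModel
  meta :: NLPModelMeta
  counters :: Counters
end

MyQuadratic(n :: Int) = MyQuadratic(NLPModelMeta(n, x0 = zeros(n)), Counters())

function NLPModels.obj(nlp :: MyQuadratic, x :: AbstractVector)
  increment!(nlp, :neval_obj)  # keep the evaluation counters up to date
  return sum(x .^ 2) / 2
end
```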

## Usage

See the [Models](@ref), the [Tools](@ref tools-section), the [Tutorial](@ref), or the [API](@ref).

## Internal Interfaces

- [`ADNLPModel`](@ref): Uses
[`ForwardDiff`](https://github.com/JuliaDiff/ForwardDiff.jl) to compute the
derivatives. It has a very simple interface, though it isn't very efficient
for larger problems.
- [`SlackModel`](@ref): Creates an equality constrained problem with bounds
on the variables using an existing NLPModel.
- [`LBFGSModel`](@ref): Creates a model using a LBFGS approximation to
the Hessian using an existing NLPModel.
- [`LSR1Model`](@ref): Creates a model using a LSR1 approximation to
the Hessian using an existing NLPModel.
- [`ADNLSModel`](@ref): Similar to `ADNLPModel`, but for nonlinear
least squares.
- [`FeasibilityResidual`](@ref): Creates a nonlinear least squares
model from an equality constrained problem in which the residual
function is the constraints function.
- [`LLSModel`](@ref): Creates a linear least squares model.
- [`SlackNLSModel`](@ref): Creates an equality constrained nonlinear least squares
problem with bounds on the variables using an existing NLSModel.
- [`FeasibilityFormNLS`](@ref): Creates residual variables and constraints, so that the residual
is linear.

## External Interfaces

- `AmplModel`: Defined in
[`AmplNLReader.jl`](https://github.com/JuliaSmoothOptimizers/AmplNLReader.jl)
for problems modeled using [AMPL](https://ampl.com)
- `CUTEstModel`: Defined in
[`CUTEst.jl`](https://github.com/JuliaSmoothOptimizers/CUTEst.jl) for
problems from [CUTEst](https://github.com/ralna/CUTEst/wiki).
- [`MathOptNLPModel`](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl) and [`MathOptNLSModel`](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl)
for problems modeled using [JuMP.jl](https://github.com/jump-dev/JuMP.jl) and [MathOptInterface.jl](https://github.com/jump-dev/MathOptInterface.jl).

If you want your interface here, open a PR.

If you want to create your own interface, check these [Guidelines](@ref).
See the [Models](@ref), the [Tools](@ref tools-section), or the [API](@ref).


## Attributes

139 changes: 24 additions & 115 deletions docs/src/models.md
@@ -1,117 +1,26 @@
# Models

The following general models are implemented in this package:
- [ADNLPModel](@ref)
- [Derived Models](@ref)
- [SlackModel](@ref)
- [LBFGSModel](@ref)
- [LSR1Model](@ref)

In addition, the following nonlinear least squares models are
implemented in this package:
- [ADNLSModel](@ref)
- [FeasibilityResidual](@ref)
- [LLSModel](@ref)
- [SlackNLSModel](@ref)
- [FeasibilityFormNLS](@ref)

There are other external models implemented. In particular,
- [AmplModel](https://github.com/JuliaSmoothOptimizers/AmplNLReader.jl)
- [CUTEstModel](https://github.com/JuliaSmoothOptimizers/CUTEst.jl)
- [MathOptNLPModel](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl) and [MathOptNLSModel](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl)
using `JuMP/MOI`.

There are currently two models implemented in this package, besides the
external ones.

# NLPModels

## ADNLPModel

```@docs
NLPModels.ADNLPModel
```

### Example

```@example
using NLPModels
f(x) = sum(x.^4)
x = [1.0; 0.5; 0.25; 0.125]
nlp = ADNLPModel(f, x)
grad(nlp, x)
```

## Derived Models

The following models are created from any given model, making some
modification to that model.

### SlackModel

```@docs
NLPModels.SlackModel
```

### Example

```@example
using NLPModels
f(x) = x[1]^2 + 4x[2]^2
c(x) = [x[1]*x[2] - 1]
x = [2.0; 2.0]
nlp = ADNLPModel(f, x, c, [0.0], [0.0])
nlp_slack = SlackModel(nlp)
nlp_slack.meta.lvar
```

### LBFGSModel

```@docs
NLPModels.LBFGSModel
```

### LSR1Model

```@docs
NLPModels.LSR1Model
```

# NLSModels

## ADNLSModel

```@docs
NLPModels.ADNLSModel
```

```@example
using NLPModels
F(x) = [x[1] - 1; 10*(x[2] - x[1]^2)]
nlp = ADNLSModel(F, [-1.2; 1.0], 2)
residual(nlp, nlp.meta.x0)
```

## FeasibilityResidual

```@docs
NLPModels.FeasibilityResidual
```

## LLSModel

```@docs
NLPModels.LLSModel
```

## SlackNLSModel

```@docs
NLPModels.SlackNLSModel
```

## FeasibilityFormNLS

```@docs
NLPModels.FeasibilityFormNLS
```
The following is a list of packages that implement the NLPModels API.

If you want your package listed here, open a Pull Request.

If you want to create your own interface, check these [Guidelines](@ref).
## Packages

- [NLPModelsModifiers.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsModifiers.jl):
Models that modify existing models.
For instance, creating slack variables, moving constraints into the objective function, or using quasi-Newton LBFGS approximations to the Hessian.
- [ADNLPModels.jl](https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl):
Models with automatic differentiation. It has a very simple interface, although it isn't very efficient for larger problems.
- [CUTEst.jl](https://github.com/JuliaSmoothOptimizers/CUTEst.jl):
For problems from [CUTEst](https://github.com/ralna/CUTEst/wiki).
- [AmplNLReader.jl](https://github.com/JuliaSmoothOptimizers/AmplNLReader.jl):
For problems modeled using [AMPL](https://ampl.com).
- [NLPModelsJuMP.jl](https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl):
For problems modeled using [JuMP.jl](https://github.com/jump-dev/JuMP.jl).
- [QuadraticModels.jl](https://github.com/JuliaSmoothOptimizers/QuadraticModels.jl):
For problems with quadratic and linear structure.
- [LLSModels.jl](https://github.com/JuliaSmoothOptimizers/LLSModels.jl):
Creates a linear least squares model.
- [PDENLPModels.jl](https://github.com/JuliaSmoothOptimizers/PDENLPModels.jl):
For PDE problems.
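
As a hedged sketch of how these packages compose through the NLPModels API, assuming ADNLPModels.jl and NLPModelsModifiers.jl are installed:
```julia
using ADNLPModels, NLPModelsModifiers, NLPModels

f(x) = x[1]^2 + 4 * x[2]^2
c(x) = [x[1] * x[2] - 1]
nlp = ADNLPModel(f, [2.0; 2.0], c, [0.0], [Inf])  # base model with one inequality constraint
snlp = SlackModel(nlp)   # modifier: a slack variable turns the inequality into an equality
qnlp = LBFGSModel(nlp)   # modifier: LBFGS quasi-Newton approximation of the Hessian
obj(snlp, snlp.meta.x0)  # the same API applies to all of them
```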
26 changes: 14 additions & 12 deletions docs/src/tools.md
@@ -7,12 +7,13 @@ number of times that function was called is stored inside the
`NLPModel`. For instance

```@example
using NLPModels, LinearAlgebra
nlp = ADNLPModel(x -> dot(x, x), zeros(2))
for i = 1:100
obj(nlp, rand(2))
end
neval_obj(nlp)
# TODO: Reenable this example
# using NLPModels, ADNLPModels, LinearAlgebra
# nlp = ADNLPModel(x -> dot(x, x), zeros(2))
# for i = 1:100
# obj(nlp, rand(2))
# end
# neval_obj(nlp)
```
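
Until the example above is re-enabled, here is a minimal sketch of the same idea, assuming ADNLPModels.jl (which now provides `ADNLPModel`) is installed:
```julia
using NLPModels, ADNLPModels, LinearAlgebra

nlp = ADNLPModel(x -> dot(x, x), zeros(2))
for i = 1:100
  obj(nlp, rand(2))  # each call increments the objective counter
end
neval_obj(nlp)       # expected to return 100
```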

Some counters are available for all models, some are specific. In
@@ -44,23 +45,24 @@ To get the sum of all counters called for a problem, use
[`sum_counters`](@ref).

```@example
using NLPModels, LinearAlgebra
nlp = ADNLPModel(x -> dot(x, x), zeros(2))
obj(nlp, rand(2))
grad(nlp, rand(2))
sum_counters(nlp)
# TODO: Reenable this example
# using NLPModels, LinearAlgebra
# nlp = ADNLPModel(x -> dot(x, x), zeros(2))
# obj(nlp, rand(2))
# grad(nlp, rand(2))
# sum_counters(nlp)
```
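
A corresponding sketch for `sum_counters`, under the same assumption that ADNLPModels.jl is installed:
```julia
using NLPModels, ADNLPModels, LinearAlgebra

nlp = ADNLPModel(x -> dot(x, x), zeros(2))
obj(nlp, rand(2))
grad(nlp, rand(2))
sum_counters(nlp)  # expected to return 2: one objective and one gradient evaluation
```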

## Querying problem type

There are some variables for querying the problem type:

- [`has_bounds`](@ref): True when not all variables are free.
- [`bound_constrained`](@ref): True for problems with bounded variables
and no other constraints.
- [`equality_constrained`](@ref): True when problem is constrained only
by equalities.
- [`has_equalities`](@ref): True when problem has at least one equality constraint.
- [`has_bounds`](@ref): True when not all variables are free.
- [`inequality_constrained`](@ref): True when problem is constrained
by inequalities.
- [`has_inequalities`](@ref): True when problem has at least one inequality constraint that isn't a bound.