
Consider using multi-objective from JuMP for NLS #165

Open
abelsiqueira opened this issue Oct 5, 2023 · 4 comments

Comments

@abelsiqueira (Member)

See package MultiObjectiveAlgorithms.jl

@amontoison (Member)

We discussed this during JuMP-dev 2024 with @blegat.

@blegat (Contributor) commented Jul 22, 2024

We converged on the following suggestion.
Make the F argument here optional

function MathOptNLSModel(cmodel::JuMP.Model, F; hessian::Bool = true, name::String = "Generic")

and the MOI wrapper won't set this F.
If the argument F is not given, it can be recovered from the objective when the objective is a ScalarNonlinearFunction whose root node is + and each child node is ^ with 2 as the second argument.
If the objective is not nonlinear, or is nonlinear but not of that form, a clear error message will explain the issue and recommend using JuMP.@force_nonlinear.
See examples below:

julia> @objective(model, Min, (x^2 + 1)^2 + (x^3 + 1)^2)
((x² + 1) ^ 2.0) + (((x ^ 3) + 1.0) ^ 2.0)

julia> @objective(model, Min, (x + 1)^2 + (x + 1)^2) # Not what we want
2 x² + 4 x + 2

julia> @objective(model, Min, @force_nonlinear((x + 1)^2 + (x + 1)^2)) # `@force_nonlinear` saves the day
((x + 1) ^ 2) + ((x + 1) ^ 2)

julia> y = [x + 1, x - 1]
2-element Vector{AffExpr}:
 x + 1
 x - 1

julia> @objective(model, Min, sum(y[i]^2 for i in eachindex(y))) # Not what we want
2 x² + 0 x + 2

julia> @objective(model, Min, sum(@force_nonlinear(y[i]^2) for i in eachindex(y))) # `@force_nonlinear` saves the day
((x + 1) ^ 2) + ((x - 1) ^ 2)
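The detection described above can be sketched on a plain Julia `Expr`, used here as a stand-in for the MOI.ScalarNonlinearFunction tree (`recover_residuals` is a hypothetical helper for illustration, not an existing NLPModelsJuMP function):

```julia
# Hypothetical sketch: accept a root `+` call whose children are all
# `^` calls with exponent 2, and return the residuals Fᵢ(x).
function recover_residuals(ex::Expr)
    ex.head == :call && ex.args[1] == :+ ||
        error("objective is not a sum of squares; try JuMP.@force_nonlinear")
    map(ex.args[2:end]) do term
        term isa Expr && term.head == :call &&
            term.args[1] == :^ && term.args[3] == 2 ||
            error("term is not a square; try JuMP.@force_nonlinear")
        term.args[2]  # the residual Fᵢ(x) inside the square
    end
end

recover_residuals(:((x^2 + 1)^2 + (x^3 + 1)^2))  # → [:(x^2 + 1), :(x^3 + 1)]
```

On an actual MOI.ScalarNonlinearFunction the walk would be analogous, inspecting its `head` and `args` fields instead of those of `Expr`.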

What do you think?

@blegat (Contributor) commented Jul 22, 2024

The issue with multiple objectives is that it wouldn't be solver-independent: if you want to compare with a solver that isn't least-squares, you will need to use the sum of squares, and if you want to use a least-squares solver you need multiple objectives.
With the suggestion from the comment above, the same model can be used for both.

@amontoison (Member)

I like your idea @blegat; it will be easy to recover the term F(x) from ||F(x)||_2.
It's what we want for RipQP.jl and CaNNOLeS.jl.

For example, one special application of RipQP is constrained linear least-squares problems, where we want to recover the terms A, B, b, c to exploit the structure internally and solve a more relevant optimization problem:

[image: constrained linear least-squares problem formulation in terms of A, b, B, c]

If MOI pre-digests the objective, we will only be able to recover AᵀA and Aᵀb, which is not what we want, and AᵀA could be dense if even one column of the sparse matrix A is dense!
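As a toy illustration of this fill-in concern (assumed data, not RipQP code): forming the normal-equations matrix AᵀA destroys sparsity. Here a single dense row of A makes AᵀA completely dense; a dense column similarly fills an entire row and column of AᵀA.

```julia
using SparseArrays, LinearAlgebra

n = 100
A = sparse(1.0I, n, n)   # sparse identity: n stored nonzeros
A[1, :] .= 1.0           # a single dense row
AtA = A' * A             # normal-equations matrix

println(nnz(A))          # 2n - 1 = 199 nonzeros in A
println(nnz(AtA))        # n^2 = 10000: AᵀA is completely dense
```

Keeping access to F (and hence A and b) avoids ever materializing AᵀA.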
