Quadratic constraints #102
Conversation
Force-pushed from 536e263 to 4d4f2bd
Codecov Report

@@           Coverage Diff            @@
##             main     #102    +/-  ##
=======================================
+ Coverage   95.00%   95.07%   +0.07%
=======================================
  Files           3        3
  Lines         580      690    +110
=======================================
+ Hits          551      656    +105
- Misses         29       34      +5

Continue to review the full report at Codecov.
Hi @amontoison, I made some progress with this one. A few things we can discuss:
Right now my tests are very basic, just checking that every API function returns something.
We can also use the following script to test problems with quadratic constraints from OptimizationProblems.jl
@amontoison I am still checking against the other problems in OptimizationProblems.jl, but it looks good to me.
I am done testing with OptimizationProblems.jl; I added 45 problems with quadratic constraints (https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/tree/add-quadratic) and found no errors! So, it's good for me. Up to you.
Major modifications of NLPModels.hess_coord! are required.
for j = 1:length(qcon.vec)
  vals[k + j] = qcon.b[qcon.vec[j]]
end
nnzj = length(qcon.hessian.vals)
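The excerpt above copies the constant part bᵢ of the gradient ∇cᵢ(x) = Qᵢx + bᵢ into the Jacobian value buffer through an index vector. A minimal standalone sketch of that pattern (the `QconSketch` type, `copy_b!` name, and offset `k` are hypothetical, chosen to mirror the field names in the excerpt):

```julia
# Sketch of the excerpt's pattern: `vec` lists which entries of `b` are
# structurally nonzero; they are copied into `vals` after an offset k.
struct QconSketch
  vec::Vector{Int}      # indices of the structurally nonzero entries of b
  b::Vector{Float64}    # linear coefficients of the quadratic constraint
end

function copy_b!(vals::Vector{Float64}, qcon::QconSketch, k::Int)
  for j = 1:length(qcon.vec)
    vals[k + j] = qcon.b[qcon.vec[j]]
  end
  return vals
end

qcon = QconSketch([1, 3], [2.0, 0.0, 7.0])
vals = zeros(3)
copy_b!(vals, qcon, 1)   # vals is now [0.0, 2.0, 7.0]
```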
nnzj should be the number of nonzeros of Qᵢx + bᵢ (length(set)), not of Qᵢ.
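To illustrate the point of that comment with a small worked example (all names here are illustrative, not from the PR): for cᵢ(x) = ½xᵀQᵢx + bᵢᵀx the Jacobian row is ∇cᵢ(x) = Qᵢx + bᵢ, so its structural nonzeros are the variables touched by Qᵢ or bᵢ, which can differ from the number of entries stored in Qᵢ:

```julia
using SparseArrays

# Lower triangle of Qᵢ with 3 stored entries, touching variables {1, 2}.
Q = sparse([1, 2, 2], [1, 1, 2], [2.0, 1.0, 4.0], 3, 3)
# bᵢ touches variable 2 only.
b = sparsevec([2], [5.0], 3)

rows, cols, _ = findnz(Q)
set = union(rows, cols, findnz(b)[1])  # variables appearing in Qᵢx + bᵢ

nnz(Q)        # 3 stored entries in Qᵢ
length(set)   # 2 — the correct nnzj for this constraint's Jacobian row
```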
for i = 1:(nlp.quadcon.nquad)
  qcon = nlp.quadcon[i]
  nnzh = length(qcon.hessian.vals)
  vals[(k + 1):(k + nnzh)] .= qcon.hessian.vals .* y[nlp.meta.nlin + i]
We want to avoid that with the hessian_quad function! If we have 10 quadratic constraints Qᵢ with the same structure, you will store 10 times the required number of nonzeros for nothing.
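One way to realize the saving described here (a hypothetical sketch, not necessarily what hessian_quad does in the PR): when all Qᵢ share one sparsity pattern, store the (rows, cols) structure once and accumulate Σᵢ yᵢQᵢ into a single value buffer instead of concatenating one copy per constraint:

```julia
# Accumulate Σᵢ yᵢ Qᵢ for constraints that share one sparsity pattern:
# `Qvals[i]` holds the stored values of Qᵢ in that common pattern, so the
# weighted sum needs only one `vals` buffer of the same length.
function fold_quad_hess!(vals::Vector{Float64}, Qvals::Vector{Vector{Float64}}, y::AbstractVector)
  fill!(vals, 0.0)
  for (i, qv) in enumerate(Qvals)
    vals .+= y[i] .* qv   # same pattern ⇒ elementwise accumulation
  end
  return vals
end

Qvals = [[1.0, 2.0], [3.0, 4.0]]   # two constraints, one shared pattern
y = [2.0, 1.0]
vals = zeros(2)
fold_quad_hess!(vals, Qvals, y)    # vals is now [5.0, 8.0]
```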
@@ -354,6 +394,7 @@ function NLPModels.hess_coord!(
  vals[(nlp.obj.nnzh + 1):(nlp.meta.nnzh)] .= 0.0
end
if nlp.obj.type == "NONLINEAR"
  vals .= 0.0
MOI.eval_hessian_lagrangian doesn't overwrite vals?
If vals .= 0.0 is required, we should also add it in the other NLPModels.hess_coord! function.
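The concern behind this exchange is easy to reproduce in isolation (the `safe_eval!` helper below is purely illustrative, not from the PR or from MOI): if an evaluator only writes the entries it owns, stale values from a previous call survive in `vals` unless the buffer is cleared first.

```julia
# If the evaluator fills only the slots in `idx`, leftover values from an
# earlier call would leak into `vals` without the initial zeroing.
function safe_eval!(vals::Vector{Float64}, idx::Vector{Int}, new::Vector{Float64})
  vals .= 0.0        # the zeroing under discussion
  vals[idx] .= new   # evaluator writes only its own entries
  return vals
end

vals = [9.0, 9.0, 9.0]           # stale values from a previous call
safe_eval!(vals, [2], [5.0])     # vals is now [0.0, 5.0, 0.0]
```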
Force-pushed from 8cbdbb6 to e141a9c
Superseded by #181
@tmigot