Issue with new JuMP operator for problem HS87 #195
We need to get the list of …

Thanks @blegat, so to clarify: we should add something like this in the appropriate place, is that correct?
I'm wondering if we don't have a bug in MOI:

```julia
using JuMP, MathOptInterface

function hs87(args...; kwargs...)
    nlp = Model()
    x0 = [390, 1000, 419.5, 340.5, 198.175, 0.5]
    lvar = [0, 0, 340, 340, -1000, 0]
    uvar = [400, 1000, 420, 420, 10000, 0.5236]
    @variable(nlp, lvar[i] <= x[i = 1:6] <= uvar[i], start = x0[i])
    a = 131078 // 1000
    b = 148577 // 100000
    ci = 90798 // 100000
    d = cos(147588 // 100000)
    e = sin(147588 // 100000)
    @constraint(nlp, 300 - x[1] - 1 / a * x[3] * x[4] * cos(b - x[6]) + ci / a * d * x[3] == 0)
    @constraint(nlp, -x[2] - 1 / a * x[3] * x[4] * cos(b + x[6]) + ci / a * d * x[4]^2 == 0)
    @constraint(nlp, -x[5] - 1 / a * x[3] * x[4] * cos(b + x[6]) + ci / a * e * x[4]^2 == 0)
    @constraint(nlp, 200 - 1 / a * x[3] * x[4] * sin(b - x[6]) + ci / a * e * x[3]^2 == 0)
    function f1(t)
        return if 0 <= t <= 300
            30 * t
        elseif 300 <= t <= 400
            31 * t
        else
            eltype(x)(Inf)
        end
    end
    function f2(t)
        return if 0 <= t <= 100
            28 * t
        elseif 100 <= t <= 200
            29 * t
        elseif 200 <= t <= 1000
            30 * t
        else
            eltype(t)(Inf)
        end
    end
    @operator(nlp, op_f1, 1, f1)
    @expression(nlp, op_f1)
    @operator(nlp, op_f2, 1, f2)
    @expression(nlp, op_f2)
    @objective(nlp, Min, op_f1(x[1]) + op_f2(x[2]))
    return nlp
end

nlp = hs87()
moi_backend = backend(nlp)
MOI.get(moi_backend, MOI.ListOfSupportedNonlinearOperators())
```

which errors with:

```
ERROR: MathOptInterface.GetAttributeNotAllowed{MathOptInterface.ListOfSupportedNonlinearOperators}:
Getting attribute MathOptInterface.ListOfSupportedNonlinearOperators() cannot be performed:
Cannot query MathOptInterface.ListOfSupportedNonlinearOperators() from `Utilities.CachingOptimizer`
because no optimizer is attached (the state is `NO_OPTIMIZER`). You may want to use a
`CachingOptimizer` in `AUTOMATIC` mode or you may need to call `reset_optimizer` before doing
this operation if the `CachingOptimizer` is in `MANUAL` mode.
```

```julia
julia> moi_backend.mode
AUTOMATIC::CachingOptimizerMode = 1
```
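As a side note, the state the error message refers to can be inspected directly on the backend; this is a sketch using MOI's `Utilities` submodule (the `hs87` model from the snippet above is assumed):

```julia
using JuMP  # `MOI` is re-exported by JuMP

nlp = hs87()                       # model from the snippet above; no solver was passed to Model()
moi_backend = backend(nlp)
MOI.Utilities.state(moi_backend)   # expected to be NO_OPTIMIZER here
MOI.Utilities.mode(moi_backend)    # AUTOMATIC
```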
The sentence …

@odow May I ask if I'm doing something wrong in the code snippet above? #195 (comment)
We can't get the list of supported operators because you haven't selected a solver.

We could perhaps return something meaningful for this case, where you have a `CachingOptimizer` with no optimizer attached.
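For reference, a minimal sketch of the workaround (Ipopt is an assumption here; any installed nonlinear solver would do): once an optimizer is attached, the `CachingOptimizer` leaves the `NO_OPTIMIZER` state and the attribute query is forwarded to the solver.

```julia
using JuMP, Ipopt  # assumes Ipopt is installed; `MOI` is re-exported by JuMP

model = Model(Ipopt.Optimizer)
# Attach the solver now so the CachingOptimizer leaves NO_OPTIMIZER
# (calling optimize! would also attach it in AUTOMATIC mode).
MOI.Utilities.attach_optimizer(backend(model))
ops = MOI.get(backend(model), MOI.ListOfSupportedNonlinearOperators())
```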
Do you have a trick for us so that we can extract the operators and check whether they are supported? Benoît implemented an optimizer, but the JuMP models in OptimizationProblems.jl don't have an optimizer attached.
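One possible trick (a sketch, untested beyond the idea): operators registered with `@operator` are stored in the cache as `MOI.UserDefinedFunction` model attributes, so they can be listed without any optimizer attached.

```julia
using JuMP  # `MOI` is re-exported by JuMP

# `nlp = hs87()` from the snippet above is assumed.
attrs = MOI.get(backend(nlp), MOI.ListOfModelAttributesSet())
user_operators = filter(attr -> attr isa MOI.UserDefinedFunction, attrs)
# Each entry carries the operator's name and arity,
# e.g. MOI.UserDefinedFunction(:op_f1, 1).
```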
That code is wrong. It returns a … You need to implement support for … It might be faster if I just take a look and make a PR 😄
This should be enough to point you in the right direction: #197 (comment) I didn't test anything other than the hs87 example.
Thanks a lot @odow!!! 😃 |
I got the following issue when updating OptimizationProblems.jl to more recent versions of JuMP:

which returns the following error:

Any idea?