Bug with right-hand side #194

Closed
tmigot opened this issue Aug 21, 2024 · 1 comment

tmigot commented Aug 21, 2024

I am trying to update OptimizationProblems.jl with the new JuMP interface, and I ran into the following issue for some problems:

using ADNLPModels, JuMP, NLPModels, NLPModelsJuMP

default_nvar = 100
T = Float64

function ju_hs100(args...; kwargs...)
  nlp = Model()
  x0 = [1, 2, 0, 4, 0, 1, 1]
  @variable(nlp, x[i = 1:7], start = x0[i])

  @constraint(nlp, 282 - 7 * x[1] - 3 * x[2] - 10 * x[3]^2 - x[4] + x[5] ≥ 0)
  @constraint(nlp, 196 - 23 * x[1] - x[2]^2 - 6 * x[6]^2 + 8 * x[7] ≥ 0)
  @constraint(nlp, 127 - 2 * x[1]^2 - 3 * x[2]^4 - x[3] - 4 * x[4]^2 - 5 * x[5] ≥ 0)
  @constraint(nlp, -4 * x[1]^2 - x[2]^2 + 3 * x[1] * x[2] - 2 * x[3]^2 - 5 * x[6] + 11 * x[7] ≥ 0)

  @objective(
    nlp,
    Min,
    (x[1] - 10)^2 +
    5 * (x[2] - 12)^2 +
    x[3]^4 +
    3 * (x[4] - 11)^2 +
    10 * x[5]^6 +
    7 * x[6]^2 +
    x[7]^4 - 4 * x[6] * x[7] - 10 * x[6] - 8 * x[7]
  )

  return nlp
end

function ad_hs100(; n::Int = default_nvar, type::Type{T} = Float64, kwargs...) where {T}
  function f(x)
    return (x[1] - 10)^2 +
           5 * (x[2] - 12)^2 +
           x[3]^4 +
           3 * (x[4] - 11)^2 +
           10 * x[5]^6 +
           7 * x[6]^2 +
           x[7]^4 - 4 * x[6] * x[7] - 10 * x[6] - 8 * x[7]
  end
  x0 = T[1, 2, 0, 4, 0, 1, 1]
  function c!(cx, x)
    cx[1] = 282 - 7 * x[1] - 3 * x[2] - 10 * x[3]^2 - x[4] + x[5]
    cx[2] = 196 - 23 * x[1] - x[2]^2 - 6 * x[6]^2 + 8 * x[7]
    cx[3] = 127 - 2 * x[1]^2 - 3 * x[2]^4 - x[3] - 4 * x[4]^2 - 5 * x[5]
    cx[4] = -4 * x[1]^2 - x[2]^2 + 3 * x[1] * x[2] - 2 * x[3]^2 - 5 * x[6] + 11 * x[7]
    return cx
  end
  lcon = T[-282.0, -196.0, 0.0, 0.0]
  ucon = T(Inf) * ones(T, 4)
  return ADNLPModels.ADNLPModel!(f, x0, c!, lcon, ucon, name = "hs100"; kwargs...)
end

nlp_jump = MathOptNLPModel(ju_hs100())
nlp_ad = ad_hs100()

using Test, LinearAlgebra
@test nlp_jump.meta.lcon == nlp_ad.meta.lcon
@test nlp_jump.meta.ucon == nlp_ad.meta.ucon
x1 = rand(7)
@test norm(cons(nlp_jump, x1) - cons(nlp_ad, x1), Inf) ≤ 1e-10 # fails: the two models disagree

#=
julia> cons(nlp_jump, x1)
4-element Vector{Float64}:
 -11.9220041710068
  -0.969071724901412
   3.424652495769024
 119.9921154237291

julia> cons(nlp_ad, x1)
4-element Vector{Float64}:
 270.0779958289932
 195.03092827509857
 119.9921154237291
   3.4246524957690236
=#
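Comparing the two outputs, the JuMP values seem to equal the ADNLPModel values with rows 3 and 4 swapped and with the constants 282 and 196 moved out of the first two rows. A minimal sketch of that check, with the permutation and shift inferred from the printed values above:

perm = [1, 2, 4, 3]               # inferred: row 3 (the quartic constraint) is moved last
shift = [282.0, 196.0, 0.0, 0.0]  # inferred: affine constants moved to the right-hand side
@test norm(cons(nlp_jump, x1) - (cons(nlp_ad, x1)[perm] - shift), Inf) ≤ 1e-10 # passes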

Any idea, @amontoison?


tmigot commented Aug 21, 2024

Sorry, not a good question. JuMP is permuting the order of the constraints in a way I am not 100% following...
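The grouping appears to come from how the constraints are classified: rows 1, 2, and 4 are quadratic, while row 3 contains x[2]^4 and is genuinely nonlinear, so the quadratic rows seem to be stored first with their affine constants moved to the right-hand side, and the nonlinear row comes last. A quick way to inspect the classification, using JuMP's list_of_constraint_types:

model = ju_hs100()
for (F, S) in list_of_constraint_types(model)
    @show F S num_constraints(model, F, S)
end

This would be consistent with the permutation [1, 2, 4, 3] observed above.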

tmigot closed this as completed Aug 21, 2024