added: derivatives in getinfo dictionary for NonLinMPC and MovingHorizonEstimator
#290
Conversation
Quick question @odow. If I want to retrieve the value of the… Edit: also, for Ipopt, why do both functions return a vector with…?
To be a bit more explicit, here's a simple example, namely the first example of jump-dev/Ipopt.jl#468 (comment):

```julia
using JuMP, Ipopt, MathOptInterface
set = MathOptInterface.VectorNonlinearOracle(;
    dimension = 2,
    l = [-Inf],
    u = [1.0],
    eval_f = (ret, x) -> (ret[1] = x[1]^2 + x[2]^2),
    jacobian_structure = [(1, 1), (1, 2)],
    eval_jacobian = (ret, x) -> ret .= 2.0 .* x,
    hessian_lagrangian_structure = [(1, 1), (2, 2)],
    eval_hessian_lagrangian = (ret, x, u) -> ret .= 2.0 .* u[1],
)
model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x)
@variable(model, y)
@objective(model, Max, x + y)
@constraint(model, c, [x, y] in set)
optimize!(model)
@show value(x), value(y)
@show dual(c)
```

Why does the dual variable vector have two elements instead of one? Isn't it supposed to be the same thing as the Lagrange multiplier (the…)? Thanks.
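As a standalone sanity check (not part of the PR, base Julia only), the oracle's Jacobian callback from the example above can be compared against a finite-difference approximation of f(x) = x₁² + x₂²:

```julia
# Sanity check of the oracle callbacks from the example above (base Julia only).
eval_f = (ret, x) -> (ret[1] = x[1]^2 + x[2]^2)
eval_jacobian = (ret, x) -> ret .= 2.0 .* x   # analytic Jacobian entries: 2x₁, 2x₂

x = [0.6, 0.8]
f0 = zeros(1); eval_f(f0, x)
J = zeros(2); eval_jacobian(J, x)

# Forward finite differences on each coordinate:
h = 1e-6
fd = map(1:2) do i
    xp = copy(x); xp[i] += h
    fp = zeros(1); eval_f(fp, xp)
    (fp[1] - f0[1]) / h
end
@show maximum(abs.(J .- fd))   # small, so the entries match 2x
```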
I asked the question on Discourse, since it may be useful for others. You can answer me here: https://discourse.julialang.org/t/lagrange-multipliers-of-a-vectornonlinearoracle-after-solve/134236
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff           @@
##             main     #290   +/-   ##
=======================================
+ Coverage   98.50%   98.54%   +0.03%
=======================================
  Files          28       28
  Lines        4831     4964     +133
=======================================
+ Hits         4759     4892     +133
  Misses         72       72
```
The sparsity structure is still useful information IMO, so it is worth outputting the matrices anyway, with this warning.
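To make this concrete, here is a toy base-Julia sketch (illustrative only, not the package's code) of how a `(row, col)` sparsity structure plus a value vector, in the style of the oracle API above, materializes a dense matrix; the structure alone already tells you which entries can be nonzero:

```julia
# Toy illustration: materialize a matrix from a sparsity structure + values.
# `structure` mimics the `jacobian_structure` field of the oracle example.
structure = [(1, 1), (1, 2)]
x = [0.5, 0.25]
vals = 2.0 .* x                # what eval_jacobian writes into `ret`
J = zeros(1, 2)
for ((i, j), v) in zip(structure, vals)
    J[i, j] += v               # duplicate (i, j) pairs accumulate
end
@show J                        # [1.0 0.5]
```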
Benchmark Results (Julia v1)

Time benchmarks

Memory benchmarks
There is an impact on…
The `info` dictionary now includes the following fields in both methods:

- `:∇J` or `:nablaJ`: the gradient of the objective function
- `:∇²J` or `:nabla2J`: the Hessian of the objective function
- `:∇g` or `:nablag`: the Jacobian of the inequality constraint
- `:∇²ℓg` or `:nabla2lg`: the Hessian of the inequality Lagrangian
- `:∇geq` or `:nablageq`: the Jacobian of the equality constraint
- `:∇²ℓgeq` or `:nabla2lgeq`: the Hessian of the equality Lagrangian

Note that the Hessians of the Lagrangians are not fully supported yet: their nonzero coefficients are random values for now. See Oscar's issue below to follow the updates about the required feature in MathOptInterface.jl.

I don't use the DifferentiationInterface.jl preparation mechanism, since `getinfo` is only meant for troubleshooting and it's already a relatively expensive function that allocates the arrays at each call.