
Pass attributes through Objective.FunctionConversionBridge #287


Draft: wants to merge 2 commits into master

Conversation

blegat (Member) commented Apr 17, 2025

@andrewrosemberg Can you check if this fixes your issue?

  • Needs tests

Closes #285
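
For context, here is a minimal sketch of the kind of pass-through method this PR adds (the real method lives in src/bridges.jl in the diff; the exact signature and body below are assumptions for illustration): the attribute value is converted to the function type the bridge targets before being forwarded to the inner model.

function MOI.set(
    model::MOI.ModelLike,
    attr::DiffOpt.ObjectiveFunctionAttribute,
    ::MOI.Bridges.Objective.FunctionConversionBridge{T,F},
    value,
) where {T,F}
    # Illustrative sketch: convert `value` to the target function type `F`
    # of the bridge, then pass the attribute through to the inner model.
    return MOI.set(model, attr, convert(F, value))
end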


codecov bot commented Apr 17, 2025

Codecov Report

Attention: Patch coverage is 0% with 4 lines in your changes missing coverage. Please review.

Project coverage is 88.90%. Comparing base (43a8226) to head (c254bc1).

Files with missing lines    Patch %    Lines
src/bridges.jl              0.00%      4 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #287      +/-   ##
==========================================
- Coverage   89.08%   88.90%   -0.19%     
==========================================
  Files          15       15              
  Lines        1969     1973       +4     
==========================================
  Hits         1754     1754              
- Misses        215      219       +4     

☔ View full report in Codecov by Sentry.

andrewrosemberg (Collaborator) commented Apr 17, 2025

I am still getting the same error (I think) in your branch using the example from the issue:

julia> DiffOpt.forward_differentiate!(model)
ERROR: MethodError: no method matching throw_set_error_fallback(::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, ::DiffOpt.ObjectiveFunctionAttribute{…}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{…}, ::MathOptInterface.ScalarAffineFunction{…})

Closest candidates are:
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::Union{MathOptInterface.AbstractModelAttribute, MathOptInterface.AbstractOptimizerAttribute}, ::Any; error_if_supported)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/jGuEH/src/attributes.jl:580
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::MathOptInterface.ConstraintFunction, ::MathOptInterface.ConstraintIndex{F, S}, ::F; error_if_supported) where {F<:MathOptInterface.AbstractFunction, S}
   @ MathOptInterface ~/.julia/packages/MathOptInterface/jGuEH/src/attributes.jl:1896
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::MathOptInterface.ConstraintFunction, ::MathOptInterface.ConstraintIndex, ::MathOptInterface.AbstractFunction; kwargs...)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/jGuEH/src/attributes.jl:1908
  ...

Stacktrace:
 [1] set(::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, ::DiffOpt.ObjectiveFunctionAttribute{…}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{…}, ::MathOptInterface.ScalarAffineFunction{…})
   @ MathOptInterface ~/.julia/packages/MathOptInterface/jGuEH/src/attributes.jl:553
 [2] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, attr::DiffOpt.ObjectiveFunctionAttribute{…}, value::MathOptInterface.ScalarAffineFunction{…})
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/copy_dual.jl:51
 [3] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, attr::DiffOpt.ForwardObjectiveFunction, value::MathOptInterface.ScalarAffineFunction{…})
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/copy_dual.jl:71
 [4] forward_differentiate!(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{…}})
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/moi_wrapper.jl:564
 [5] forward_differentiate!
   @ ~/Workspace/DiffOpt.jl/src/jump_moi_overloads.jl:393 [inlined]
 [6] forward_differentiate!(model::MathOptInterface.Utilities.CachingOptimizer{…})
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/jump_moi_overloads.jl:378
 [7] forward_differentiate!(model::Model)
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/jump_moi_overloads.jl:363
 [8] top-level scope
   @ REPL[48]:1
Some type information was truncated. Use `show(err)` to see complete types.

blegat (Member, Author) commented Apr 18, 2025

Should be fixed now.

andrewrosemberg (Collaborator) commented Apr 18, 2025

First example runs with no errors!
For the second example, I now get (even after calling JuMP.optimize!):

julia> MOI.set.(
           model,
           DiffOpt.ForwardConstraintFunction(),
           c3,
           sum(x) + 1,
       )

julia> DiffOpt.forward_differentiate!(model)
ERROR: Trying to compute the forward differentiation on a model with termination status OPTIMIZE_NOT_CALLED
Stacktrace:
 [1] error(s::String)
   @ Base ./error.jl:35
 [2] forward_differentiate!(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{…}})
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/moi_wrapper.jl:553
 [3] forward_differentiate!
   @ ~/Workspace/DiffOpt.jl/src/jump_moi_overloads.jl:393 [inlined]
 [4] forward_differentiate!(model::MathOptInterface.Utilities.CachingOptimizer{…})
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/jump_moi_overloads.jl:378
 [5] forward_differentiate!(model::Model)
   @ DiffOpt ~/Workspace/DiffOpt.jl/src/jump_moi_overloads.jl:363
 [6] top-level scope
   @ REPL[45]:1
Some type information was truncated. Use `show(err)` to see complete types.

But this could be a different issue!

blegat (Member, Author) commented Apr 18, 2025

I got the same error; this is because

MOI.set.(
           model,
           DiffOpt.ForwardConstraintFunction(),
           c3,
           sum(x) + 1,
       )

is incorrect (the correct call is given in #285 (comment)). First, there is no need to broadcast, because every argument is scalar.
Second, you are setting a ScalarAffineFunction on a MOI.ConstraintIndex{MOI.VectorAffineFunction}. Since this does not match any of our methods, we throw an error; but because MOI.supports returns true, the error is a NotAllowedError rather than an Unsupported error. The cache therefore assumes that resetting the optimizer will resolve it: the attribute is supported, and the operation merely appears disallowed because of some internal state. That is not actually the case; the user should receive an ArgumentError instead. We should probably add a helpful error message in a separate PR.
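
To make the shape of the correct call concrete (the exact expression for this model is in #285 (comment); the placeholder below is an assumption): drop the broadcast and pass a vector-valued perturbation whose output dimension matches c3.

MOI.set(                      # plain call, no broadcasting dot
    model,
    DiffOpt.ForwardConstraintFunction(),
    c3,
    perturbation,             # hypothetical vector of affine expressions,
)                             # one entry per output dimension of c3

Here perturbation stands in for a vector-valued function matching the VectorAffineFunction shape of c3, rather than the scalar sum(x) + 1.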

Development

Successfully merging this pull request may close these issues.

Error DiffOpt.forward_differentiate! from PSD Conic JuMP problems
2 participants