
Fix bug broadcasting constraints with sparse terms #2558

Merged 3 commits into master from od/fix-sparse-broadcast on Sep 22, 2021

Conversation

odow (Member) commented on Apr 15, 2021

src/macros.jl (review thread, resolved)
src/macros.jl (outdated):
When broadcasting `f.(x)` over an `AbstractSparseArray` `x`, Julia first calls
the equivalent of `f(zero(eltype(x)))`. However, if `f` is mutating, this can
have serious consequences! In our case, broadcasting `build_constraint` will add
a new `0 = 0` constraint.
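The failure mode can be sketched in Python (a hypothetical `sparse_map` helper standing in for Julia's broadcast machinery; JuMP's actual code is Julia). The key point is that a sparse map evaluates `f` on the implicit zero to compute a fill value for the non-stored entries, so a side-effecting `f` runs one extra time:

```python
# Hypothetical sketch (not JuMP/Julia code): a naive sparse map, like
# Julia's broadcast over an AbstractSparseArray, evaluates f on the
# structural zero once to compute a "fill value" for non-stored entries.
def sparse_map(f, stored):
    """stored: dict of index -> value. Returns (fill_value, mapped_stored)."""
    fill_value = f(0)  # extra call on the structural zero!
    return fill_value, {i: f(v) for i, v in stored.items()}

calls = []

def add_constraint(term):
    """A mutating f, playing the role of build_constraint."""
    calls.append(term)  # side effect: records the constraint in a "model"
    return f"{term} == 0"

fill, mapped = sparse_map(add_constraint, {0: "x[1]", 2: "x[2]"})
# The side effect ran three times for only two stored entries: a spurious
# "0 == 0" constraint was recorded for the structural zero.
assert calls == [0, "x[1]", "x[2]"]
```

This mirrors the bug: the model ends up with one more constraint than the sparse vector has stored entries.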
A reviewer (Member) commented:

How does `collect` help? It also replaces the structural zero elements with zeros.

odow (Member, Author) replied:

In most cases (all cases?), promotion means the incoming `SparseArray` is actually dense.
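The density point can be illustrated with a small Python sketch (hypothetical names, not JuMP code): once every structural zero is promoted to a genuine `0 == 0` constraint, the broadcast output has a real value at every index, so materializing it as a dense vector, which is what `collect` does, loses nothing:

```python
# Python sketch (hypothetical; JuMP is Julia): after broadcasting a
# constraint builder over a sparse vector, *every* index carries a real
# constraint, so the "sparse" result is dense in all but name.
stored = {0: "x[1]", 2: "x[2]"}  # sparse vector of variables, length 3
length = 3

def build_constraint(term):
    return f"{term} == 0"

# Structural zeros are promoted to genuine "0 == 0" constraints, so the
# dense materialization (what collect produces) has no missing entries:
dense = [build_constraint(stored.get(i, 0)) for i in range(length)]
# dense == ["x[1] == 0", "0 == 0", "x[2] == 0"]
```

Because no position is left empty, returning a `SparseVector` here would store a value at every index anyway, so there is nothing to gain from keeping the sparse wrapper.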

In 0.21.7, you get

julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> d = sparsevec([1, 3], x)
3-element SparseVector{VariableRef, Int64} with 2 stored entries:
  [1]  =  x[1]
  [3]  =  x[2]

julia> @constraint(model, d .== 0)
3-element SparseVector{Any, Int64} with 3 stored entries:
  [1]  =  x[1] = 0.0
  [2]  =  0 = 0.0
  [3]  =  x[2] = 0.0

julia> model
A JuMP Model
Feasibility problem with:
Variables: 2
`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 4 constraints
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
Names registered in the model: x

In this PR, you get

julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.

julia> @variable(model, x[1:2])
2-element Vector{VariableRef}:
 x[1]
 x[2]

julia> d = sparsevec([1, 3], x)
3-element SparseVector{VariableRef, Int64} with 2 stored entries:
  [1]  =  x[1]
  [3]  =  x[2]

julia> @constraint(model, d .== 0)
3-element Vector{ConstraintRef{Model, MathOptInterface.ConstraintIndex{MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.EqualTo{Float64}}, ScalarShape}}:
 x[1] = 0.0
 0 = 0.0
 x[2] = 0.0

julia> model
A JuMP Model
Feasibility problem with:
Variables: 2
`AffExpr`-in-`MathOptInterface.EqualTo{Float64}`: 3 constraints
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
Names registered in the model: x

Is changing the return type of `@constraint` here breaking? The type change happens with dense constraints too.

test/macros.jl (review thread, resolved)
codecov bot commented on Apr 15, 2021

Codecov Report

Merging #2558 (02a3ac4) into master (3ebf94d) will increase coverage by 0.00%.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master    #2558   +/-   ##
=======================================
  Coverage   93.44%   93.45%           
=======================================
  Files          43       43           
  Lines        5404     5406    +2     
=======================================
+ Hits         5050     5052    +2     
  Misses        354      354           
Impacted Files Coverage Δ
src/macros.jl 92.98% <100.00%> (+0.02%) ⬆️

Δ = absolute <relative> (impact), ø = not affected, ? = missing data

src/macros.jl (review thread, outdated, resolved)
odow (Member, Author) commented on Apr 16, 2021

What was the conclusion on changing the return type of `@constraint`?

blegat (Member) commented on Apr 18, 2021

I wouldn't consider it breaking. Having a sparse vector of constraint refs can be considered a bug for two reasons: (1) `zero` is not defined for constraint refs; (2) the result is dense, so it does not make sense to return it sparse.

odow added the breaking label on Apr 21, 2021
odow merged commit 7037d5b into master on Sep 22, 2021
odow deleted the od/fix-sparse-broadcast branch on Sep 22, 2021 at 23:34
Development

Successfully merging this pull request may close these issues.

Better error message for sparse RHS terms in vector constraints