ConditionalJuMP.jl
Better ReLU formulation
As pointed out by @vtjeng, doing
```julia
y = @switch((x <= 0) => 0, (x >= 0) => x)
```
results in a model whose convex relaxation isn't as tight as it should be (specifically, it misses the constraints that y >= 0 and y >= x). Can we do better?
I think the ideal result would be the standard big-M encoding, where z is a binary indicator (z = 1 on the x >= 0 branch) and M is a constant satisfying -M <= x <= M:
```
y >= x
y >= 0
y <= x + M(1 - z)
y <= Mz
```
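For concreteness, here is a minimal sketch of that encoding written directly in JuMP, without ConditionalJuMP's macros. The helper name `relu_bigM!`, the solver `HiGHS`, and the particular value of `M` are illustrative assumptions; `M` must genuinely bound `x` (i.e. `-M <= x <= M`) for the encoding to be exact:

```julia
using JuMP, HiGHS  # HiGHS is an illustrative solver choice; any MILP solver works

# Sketch of the tight big-M ReLU encoding above, assuming a known bound M
# with -M <= x <= M. `relu_bigM!` is a hypothetical helper, not part of
# ConditionalJuMP's API.
function relu_bigM!(model, x; M)
    y = @variable(model, lower_bound = 0)     # y >= 0
    z = @variable(model, binary = true)       # z = 1 on the x >= 0 branch
    @constraint(model, y >= x)                # y >= x
    @constraint(model, y <= x + M * (1 - z))  # z = 1 forces y <= x, so y = x
    @constraint(model, y <= M * z)            # z = 0 forces y <= 0, so y = 0
    return y
end

model = Model(HiGHS.Optimizer)
@variable(model, -1 <= x <= 2)
y = relu_bigM!(model, x; M = 2)   # M = 2 is a valid bound since |x| <= 2
@objective(model, Max, y - 0.5x)
optimize!(model)
```

Note that even when `z` is relaxed to `[0, 1]`, this encoding keeps `y >= x` and `y >= 0` in the LP relaxation, which is exactly the tightness the `@switch` version loses.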