
Better ReLU formulation

rdeits opened this issue on Oct 02, 2017 · 0 comments

As pointed out by @vtjeng, doing

y = @switch((x <= 0) => 0, (x >= 0) => x) 

results in a model whose convex relaxation isn't as tight as it should be (specifically, it misses the constraints that y >= 0 and y >= x). Can we do better?
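For context, here is what a standard big-M lowering of the two implications looks like; this is an assumption made for illustration, not necessarily the exact constraints ConditionalJuMP emits (with z binary, z = 0 for the x <= 0 branch, z = 1 for the x >= 0 branch, and |x| <= M):

x <= Mz
-Mz <= y <= Mz
x >= -M(1 - z)
-M(1 - z) <= y - x <= M(1 - z)

These constraints are exact at z in {0, 1}, but once z is relaxed to [0, 1], the point x = M/2, z = 1/2, y = 0 is feasible even though max(x, 0) = M/2, precisely because nothing like y >= x or y >= 0 survives the relaxation.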

I think the ideal result would be (with z binary, z = 1 when x >= 0, and M a valid bound on |x|):

y >= x
y >= 0
y <= x + M(1 - z)
y <= Mz
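A minimal runnable sketch of this formulation in plain JuMP (note: this bypasses ConditionalJuMP's macros, assumes a known bound M on |x|, and the choice of HiGHS as solver is just illustrative):

using JuMP, HiGHS

M = 10.0                          # assumed a-priori bound: -M <= x <= M
model = Model(HiGHS.Optimizer)    # any MILP solver would do here
@variable(model, -M <= x <= M)
@variable(model, y)
@variable(model, z, Bin)          # z = 1 on the x >= 0 branch

# Tight big-M encoding of y = max(x, 0)
@constraint(model, y >= x)
@constraint(model, y >= 0)
@constraint(model, y <= x + M * (1 - z))
@constraint(model, y <= M * z)

The LP relaxation of this model already satisfies y >= x and y >= 0, which is exactly what the @switch version is missing.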

rdeits · Oct 02 '17 21:10