Inconsistent MWIS results between Integer Programming and GenericTensorNetwork
@GiggleLiu I observed inconsistent maximum weighted independent set (MWIS) solutions between two methods applied to the same graph and weight vector:
- Method 1: Integer Programming (JuMP + GLPK)
- Method 2: `GenericTensorNetwork(IndependentSet(...))`
The two methods produce different sets of optimal configurations, even though both should theoretically enumerate all ground states with maximum weight.
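For reference, the invariant both methods should satisfy (enumerating every maximum-weight independent set) can be checked by brute force on small instances. A minimal sketch in plain Julia, with no packages; the 4-cycle below is just an illustrative toy graph, not the house graph from this issue:

```julia
# Brute-force enumeration of all maximum weighted independent sets
# of a graph given as an edge list. Exponential, for sanity checks only.
function brute_force_mwis(n::Int, es::Vector{Tuple{Int,Int}}, w::Vector{Float64})
    best = -Inf
    sols = Vector{Vector{Int}}()
    for mask in 0:(2^n - 1)
        s = [i for i in 1:n if (mask >> (i - 1)) & 1 == 1]
        # skip subsets that are not independent
        any(((u, v),) -> u in s && v in s, es) && continue
        weight = sum(w[s])  # sum of an empty Float64 vector is 0.0
        if weight > best + 1e-12
            best = weight
            sols = [s]
        elseif abs(weight - best) <= 1e-12
            push!(sols, s)
        end
    end
    return sols, best
end

# 4-cycle 1-2-3-4 with uniform weights: the two MWIS are {1,3} and {2,4}.
sols, best = brute_force_mwis(4, [(1, 2), (2, 3), (3, 4), (4, 1)], fill(1.0, 4))
```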
🧪 Example (Requires code that has not yet been merged)
House Graph Setup
```julia
using UnitDiskMapping, Graphs, GenericTensorNetworks, LinearAlgebra

graph = smallgraph(:house)
triangular_weighted_res = map_graph(TriangularWeighted(), graph; vertex_order=MinhThiTrick())
g, _ = graph_and_weights(triangular_weighted_res.grid_graph)
source_weights = fill(0.1, nv(graph))
mapped_weights10 = map_weights(triangular_weighted_res, source_weights)
pins = [82, 63, 28, 5, 1]  # original node locations
```
⸻
✅ Method 1: Integer Programming
```julia
using JuMP, GLPK

function all_maximum_weighted_independent_sets(g::SimpleGraph, weights::Vector{Float64})
    n = nv(g)
    @assert length(weights) == n
    model = Model(GLPK.Optimizer)
    set_silent(model)
    @variable(model, x[1:n], Bin)
    for e in edges(g)
        @constraint(model, x[src(e)] + x[dst(e)] <= 1)
    end
    @objective(model, Max, sum(weights[i] * x[i] for i in 1:n))
    optimize!(model)
    if termination_status(model) != MOI.OPTIMAL
        error("No optimal solution found")
    end
    max_weight = objective_value(model)
    solutions = Set{Vector{Int}}()
    while true
        if termination_status(model) != MOI.OPTIMAL || abs(objective_value(model) - max_weight) > 1e-6
            break
        end
        current = [i for i in 1:n if value(x[i]) ≥ 0.5]
        push!(solutions, sort(current))
        @constraint(model, sum(x[i] for i in current) <= length(current) - 1)
        optimize!(model)
    end
    return collect(solutions), max_weight
end
```
```julia
res_ip, _ = all_maximum_weighted_independent_sets(g, mapped_weights10)

bit_vectors_ip = []
for c in res_ip
    bit_vector = zeros(Int, nv(graph))
    for (j, pin) in enumerate(pins)
        if pin in c
            bit_vector[j] = 1
        end
    end
    push!(bit_vectors_ip, bit_vector)
end
```
⸻
❓ Method 2: GenericTensorNetwork
```julia
res_gtn = solve(GenericTensorNetwork(IndependentSet(g, mapped_weights10)), ConfigsMax())[].c.data

bit_vectors_gtn = []
for c in res_gtn
    bit_vector = map_config_back(triangular_weighted_res, c)
    push!(bit_vectors_gtn, bit_vector)
end
```
⸻
🧮 Observed Result
- `bit_vectors_ip` and `bit_vectors_gtn` contain different configurations:

```julia
bit_vectors_ip =
[0, 1, 1, 0, 0]
[1, 0, 0, 0, 1]
[1, 0, 0, 1, 0]
[0, 1, 0, 0, 1]

bit_vectors_gtn =
[1, 0, 0, 1, 0]
[0, 1, 1, 0, 0]
```

- Some MWIS solutions returned by the Integer Programming method are missing from the GTN result.
- Both methods return the correct result on the source graph:

```julia
res, max_weight = all_maximum_weighted_independent_sets(graph, source_weights)
solve(GenericTensorNetwork(IndependentSet(graph, source_weights)), ConfigsMax())[]
```
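Since the two solvers may return configurations in different orders, an order-insensitive comparison avoids false alarms when checking agreement. A small helper (plain Julia; the name is mine):

```julia
# Compare two collections of configurations as sets, ignoring ordering.
# `collect.` normalizes each entry (e.g. a bit-vector type) to a plain vector
# before the set comparison.
same_solution_set(a, b) = Set(collect.(a)) == Set(collect.(b))
```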
Since the triangular lattice support has not yet been merged into the repository, similar results can be observed using the existing `Weighted` mode.
- The following code can be run directly.
```julia
using UnitDiskMapping, Graphs, GenericTensorNetworks, LinearAlgebra
using JuMP, GLPK

function all_maximum_weighted_independent_sets(g::SimpleGraph, weights::Vector{Float64})
    n = nv(g)
    @assert length(weights) == n
    model = Model(GLPK.Optimizer)
    set_silent(model)
    @variable(model, x[1:n], Bin)
    for e in edges(g)
        @constraint(model, x[src(e)] + x[dst(e)] <= 1)
    end
    @objective(model, Max, sum(weights[i] * x[i] for i in 1:n))
    optimize!(model)
    if termination_status(model) != MOI.OPTIMAL
        error("No optimal solution found")
    end
    max_weight = objective_value(model)
    solutions = Set{Vector{Int}}()
    while true
        if termination_status(model) != MOI.OPTIMAL || abs(objective_value(model) - max_weight) > 1e-6
            break
        end
        # `current` is the current optimal solution
        current = [i for i in 1:n if value(x[i]) ≥ 0.5]
        push!(solutions, sort(current))
        # add a new constraint to exclude the current solution
        @constraint(model, sum(x[i] for i in current) <= length(current) - 1)
        # continue solving
        optimize!(model)
    end
    return collect(solutions), max_weight
end

graph = smallgraph(:cubical)
weighted_res = map_graph(Weighted(), graph; vertex_order=MinhThiTrick());
source_weights = fill(0.87, nv(graph))
mapped_weights = map_weights(weighted_res, source_weights)

# TEST on MAPPED graph
g, _ = graph_and_weights(weighted_res.grid_graph)
res_gtn = solve(GenericTensorNetwork(IndependentSet(g, mapped_weights)), ConfigsMax())[].c.data
@show length(res_gtn)  # result = 1
res, max_weight = all_maximum_weighted_independent_sets(g, mapped_weights)
@show length(res)  # result = 2

# TEST on SOURCE graph
source_res_gtn = solve(GenericTensorNetwork(IndependentSet(graph, source_weights)), ConfigsMax())[].c.data
@show length(source_res_gtn)  # result = 2
source_res_ip, max_weight = all_maximum_weighted_independent_sets(graph, source_weights)
@show length(source_res_ip)  # result = 2
```
Weird, I get the correct results on my machine:
```julia
julia> res_gtn = solve(GenericTensorNetwork(IndependentSet(g, mapped_weights10)), ConfigsMax())[].c.data
4-element Vector{StaticBitVector{176, 3}}:
 01010010101010101010100110011001100110011001100010011010101010101010101010101001011001100011100110011001100110101010101010101001100110011001100110011010101010101010101010101010
 10110100101010101010110011001100110011010011001101100101010101010001010101010110100011001010110011001100110010101010101011100011001100110011001100110010101010101010101010101010
 10110100101010101010110011001100110011010011001101100101010101010001010101010110100011001010110011001100110001010101010101010110011001100110011001100101010101010101010101010101
 01001101010101010101001100110011001100101100110001100101010101010101010101010100100110011100011001100110011001010101010101010110011001100110011001100101010101010101010101010101

julia> bit_vectors_gtn = []
Any[]

julia> for c in res_gtn
           bit_vector = map_config_back(triangular_weighted_res, c)
           push!(bit_vectors_gtn, bit_vector)
       end

julia> bit_vectors_gtn
4-element Vector{Any}:
 [0, 1, 1, 0, 0]
 [0, 1, 0, 0, 1]
 [1, 0, 0, 0, 1]
 [1, 0, 0, 1, 0]
```
I guess it may be due to floating point numbers: the degeneracy is not counted correctly.
What is the version of your GenericTensorNetworks? I am using 4.0.1
I'm also using 4.0.1
I think losing some degenerate configurations is possible. It is related to the definition of the `CountingTropical` algebra: when the weights are floating point numbers, the counting is no longer reliable.
https://github.com/TensorBFS/TropicalNumbers.jl/blob/16c644484f25e5b38dd6a0ed0bc96f58065c2c40/src/counting_tropical.jl#L42
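This failure mode is easy to reproduce without any packages: two configurations whose weights are mathematically equal can compare as unequal in `Float64`, so a tie-only merge (as in the `CountingTropical` `+` linked above) keeps only one of them, while rational or integer weights tie exactly. A self-contained sketch, where `tropical_add` is my mock of the real operator:

```julia
# Mock of CountingTropical addition: merge counts only on an exact tie,
# otherwise keep the count attached to the strictly larger weight.
tropical_add((n1, c1), (n2, c2)) =
    n1 == n2 ? (n1, c1 + c2) : (n1 > n2 ? (n1, c1) : (n2, c2))

a = 0.1 + 0.2   # weight accumulated term by term: 0.30000000000000004
b = 0.3         # the "same" weight as a single literal
a == b          # false: the tie is lost in Float64

tropical_add((a, 1), (b, 1))                  # count 1: a degenerate config is dropped
tropical_add((1//10 + 2//10, 1), (3//10, 1))  # count 2: rationals tie exactly
```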
I found that the code runs correctly in the terminal, but not when using Shift+Enter (the Julia extension’s inline execution mode)
Can you try this one?

```julia
# fix the ambiguity error (should be a part of TropicalNumbers)
Base.:^(a::Tropical, b::Rational) = Tropical(a.n * b)
Base.:^(a::CountingTropical, b::Rational) = CountingTropical(a.n * b, a.c ^ b)

res_gtn = solve(GenericTensorNetwork(IndependentSet(g, round.(Int, mapped_weights10 .* 10) .// 10)), ConfigsMax(); T=Rational{Int})[].c.data
```
Now it gives me too many solutions.
Well, integer weights are the most reliable:

```julia
julia> res_gtn = solve(GenericTensorNetwork(IndependentSet(g, round.(Int, mapped_weights10 .* 10))), ConfigsMax())[].c.data
4-element Vector{StaticBitVector{176, 3}}:
 01010010101010101010100110011001100110011001100010011010101010101010101010101001011001100011100110011001100110101010101010101001100110011001100110011010101010101010101010101010
 10110100101010101010110011001100110011010011001101100101010101010001010101010110100011001010110011001100110010101010101011100011001100110011001100110010101010101010101010101010
 10110100101010101010110011001100110011010011001101100101010101010001010101010110100011001010110011001100110001010101010101010110011001100110011001100101010101010101010101010101
 01001101010101010101001100110011001100101100110001100101010101010101010101010100100110011100011001100110011001010101010101010110011001100110011001100101010101010101010101010101
```
That fixes it.
We should warn users when the input weights are not integers and the `CountingTropical` type is used.
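A sketch of what such a warning could look like (the names are mine, not actual GenericTensorNetworks API): exact weight types (integers, rationals, integer-valued floats) pass silently, inexact floats trigger a warning.

```julia
# Hypothetical check: warn when inexact (floating point) weights are used
# with a counting algebra, where ties must be exact for degeneracy counting.
is_exact_weight(w) = w isa Integer || w isa Rational || isinteger(w)

function check_counting_weights(weights)
    if !all(is_exact_weight, weights)
        @warn "Non-integer weights with CountingTropical: degeneracy counts may be unreliable. Consider rescaling to integers, e.g. round.(Int, weights .* 10)."
    end
    return weights
end
```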