JIT miscompare crasher 3ad6
The error is that bit 51 is spuriously set in a tuple element of the return value.
Minimized IR:
package sample
file_number 0 "fake_file.x"
top fn __sample__main(x0: bits[59], x1: bits[52]) -> (bits[59], bits[28]) {
x4: bits[59] = literal(value=0, id=42, pos=0,4,21)
sign_ext.6: bits[59] = sign_ext(x1, new_bit_count=59, id=6)
x5: bits[1] = sgt(x4, sign_ext.6, id=7, pos=0,5,22)
concat.11: bits[2] = concat(x5, x5, id=11, pos=0,9,21)
x3: bits[59] = literal(value=576460752303423487, id=92, pos=0,3,16)
x7: bits[3] = concat(concat.11, x5, id=12, pos=0,9,30)
literal.113: bits[53] = literal(value=7, id=113)
x12: bits[53] = dynamic_bit_slice(x3, x7, width=53, id=18, pos=0,12,32)
x11: bits[53] = add(literal.113, x12, id=20, pos=0,13,33)
x15: bits[28] = dynamic_bit_slice(x11, x5, width=28, id=24, pos=0,17,22)
x18: bits[59] = zero_ext(x15, new_bit_count=59, id=33)
ret tuple.78: (bits[59], bits[28]) = tuple(x18, x15, id=78, pos=0,27,2)
}
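Evaluating the minimized IR by hand for the failing arguments shows the expected result. The sketch below is a Python model of the relevant XLS IR semantics (assumptions: dynamic_bit_slice reads zeros past the MSB, add wraps at its bit width, sgt compares as signed):

```python
def sign_ext(v, old_width, new_width):
    # Replicate the old MSB into the new high bits.
    if (v >> (old_width - 1)) & 1:
        v |= (1 << new_width) - (1 << old_width)
    return v

def to_signed(v, width):
    return v - (1 << width) if (v >> (width - 1)) & 1 else v

def dynamic_bit_slice(v, start, width):
    # Bits past the MSB of v read as zero.
    return (v >> start) & ((1 << width) - 1)

x1 = 0xB_E286_8AAF_2AE8                   # bits[52] arg; x0 is unused

x4 = 0                                     # literal 0
se = sign_ext(x1, 52, 59)                  # x1's bit 51 is set, so se is negative
x5 = 1 if to_signed(x4, 59) > to_signed(se, 59) else 0   # sgt -> 1
x7 = (x5 << 2) | (x5 << 1) | x5            # concat(concat(x5, x5), x5) -> 0b111
x3 = 576460752303423487                    # 2**59 - 1
x12 = dynamic_bit_slice(x3, x7, 53)        # 2**52 - 1
x11 = (7 + x12) & ((1 << 53) - 1)          # add wraps at 53 bits -> 2**52 + 6
x15 = dynamic_bit_slice(x11, x5, 28)       # (x11 >> 1) & 0x0FFFFFFF -> 3
x18 = x15                                  # zero_ext(28 -> 59) adds only zero bits
print(hex(x18), hex(x15))                  # -> 0x3 0x3
```

This matches the four agreeing evaluations below: the correct return value is (bits[59]:0x3, bits[28]:0x3), with bit 51 clear.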
Exception:
Result miscompare for sample 3:
args: bits[59]:0x2aa_aaaa_aaaa_aaaa; bits[52]:0xb_e286_8aaf_2ae8
evaluated opt IR (JIT), evaluated opt IR (interpreter), evaluated unopt IR (interpreter), interpreted DSLX =
(bits[59]:0x3, bits[28]:0x3, bits[1]:0x0)
evaluated unopt IR (JIT) =
(bits[59]:0x8_0000_0000_0003, bits[28]:0x3, bits[1]:0x0)
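The two results differ in exactly one bit, which is how bit 51 was identified:

```python
good = 0x3
bad = 0x8_0000_0000_0003
diff = good ^ bad
assert diff == 1 << 51            # exactly one spurious bit
print(diff.bit_length() - 1)      # -> 51
```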
LLVM IR looks good. Final few lines of the optimized function:
...
%14 = and i32 %13, 268435455
%15 = zext i32 %14 to i64
%16 = insertvalue { i64, i32 } zeroinitializer, i64 %15, 0
%17 = insertvalue { i64, i32 } %16, i32 %14, 1
ret { i64, i32 } %17
The bad value is element 0 of the returned tuple. It's not clear how bit 51 gets set given that the value is produced by zext'ing an i32 to an i64; this could be a bug in the LLVM backend. I explicitly checked the values written to the output buffer (and stepped through the assembly) and the bit is set there, so it's not an unpacking problem.
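As a sanity check on why the LLVM IR looks good: whatever %13 holds, masking to 28 bits and zero-extending to i64 cannot produce bit 51, so the spurious bit must be introduced after this point. A minimal model of the %14..%17 sequence (input values here are arbitrary illustrations):

```python
def optimized_tail(v13):
    # Mirrors the final instructions: mask to 28 bits, zext to i64,
    # then build the { i64, i32 } return struct.
    v14 = v13 & 268435455      # and i32 %13, 0x0FFFFFFF
    v15 = v14                  # zext i32 -> i64: high bits are zero
    return (v15, v14)          # insertvalue into { i64, i32 }

# For any 32-bit input, bit 51 of element 0 is necessarily clear.
for v in (0, 3, 0x0FFF_FFFF, 0xFFFF_FFFF):
    e0, _ = optimized_tail(v)
    assert (e0 >> 51) & 1 == 0
```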