MX FP Quantization: About Subnormals
When quantizing MX FP values, the quantization scales for subnormal and normal values should be different. Why does L394 clip to min_exp? My understanding is that it should clip to 1.
Looking forward to your reply.
```python
if exp_bits != 0:
    private_exp = torch.floor(torch.log2(torch.abs(A) + (A == 0).type(A.dtype)))
    # Original: the minimum representable exponent for 8 exp bits is -126
    # min_exp = -(2 ** (exp_bits - 1)) + 2
    # private_exp = private_exp.clip(min=min_exp)
    # My change: the subnormal and normal parts have different scales:
    #   private_exp >= 1 -> normal scale
    #   private_exp < 1  -> subnormal scale
    private_exp = private_exp.clip(min=1.0)
else:
    private_exp = None
```
Source: https://github.com/intel/neural-compressor/blob/master/neural_compressor/torch/algorithms/mx_quant/utils.py#L394
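For concreteness, a minimal sketch of how the two clip choices differ on a sample tensor (assuming `exp_bits = 4` as an illustrative value):

```python
import torch

exp_bits = 4
min_exp = -(2 ** (exp_bits - 1)) + 2    # -6 for 4 exponent bits

A = torch.tensor([3.0, 0.04, 1e-4])
private_exp = torch.floor(torch.log2(torch.abs(A) + (A == 0).type(A.dtype)))
print(private_exp.clip(min=min_exp))    # tensor([ 1., -5., -6.])
print(private_exp.clip(min=1.0))        # tensor([1., 1., 1.])
```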
Hi @Jzz24, sorry for my late reply.
`bias = 2 ** (exp_bits - 1) - 1`, so:

- if `E > 0` (normal), the exponent value is `E - bias = E - (2 ** (exp_bits - 1) - 1) = E + 1 - 2 ** (exp_bits - 1)`; its minimum, at `E = 1`, is `2 - 2 ** (exp_bits - 1)`
- if `E = 0` (subnormal), the exponent value is `1 - bias = 1 - (2 ** (exp_bits - 1) - 1) = 2 - 2 ** (exp_bits - 1)`

So the smallest normal exponent and the subnormal exponent are both `2 - 2 ** (exp_bits - 1)`, which is exactly `min_exp`: subnormal values already share the scale of the smallest normal values, so clipping to `min_exp` handles both cases.
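A quick numeric check of the arithmetic above; `exp_bits = 4` is an illustrative assumption (with 8 exp bits the same formula gives `2 - 2**7 = -126`, matching the comment in the code):

```python
exp_bits = 4
bias = 2 ** (exp_bits - 1) - 1     # 7

# Smallest normal (E = 1): exponent = E - bias = 1 - 7 = -6
# Subnormal (E = 0):       exponent = 1 - bias = 1 - 7 = -6
min_exp = 2 - 2 ** (exp_bits - 1)  # -6
assert 1 - bias == min_exp         # subnormals share the smallest normal scale
```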