
Decimal truncate strategy is incorrect

Open birdstorm opened this issue 7 years ago • 3 comments

Previously, when a decimal type had a precision greater than 38, we kept the scale and truncated only the precision down to 38. This preserves as much of the data's precision as possible. However, it also turns some values into null when truncated_precision - scale is less than the data's precision - scale. When the corresponding column is a primary key or a non-null column, this causes problems. A minimal sketch of the issue is shown below.
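
The following Scala sketch (not TiSpark's actual code; `truncate` is a hypothetical helper) illustrates the strategy described above: capping precision at 38 while keeping the original scale leaves fewer integer digits, so a value that used the full integer range of the source type no longer fits and would surface as null.

```scala
import java.math.BigDecimal

object DecimalTruncateSketch {
  val MaxPrecision = 38 // Spark's DecimalType upper bound

  // Hypothetical helper: map a source (precision, scale) to a Spark-compatible pair
  // using the old strategy: cap precision at 38, keep the scale unchanged.
  def truncate(precision: Int, scale: Int): (Int, Int) =
    (math.min(precision, MaxPrecision), scale)

  def main(args: Array[String]): Unit = {
    // Source column: DECIMAL(45, 10) -> truncated to DECIMAL(38, 10).
    val (p, s) = truncate(45, 10)

    // A value that uses all 35 integer digits allowed by the source type.
    val value = new BigDecimal("9" * 35 + "." + "1" * 10)

    // The truncated type only leaves 38 - 10 = 28 integer digits, so this value
    // no longer fits and would be rendered as null, which breaks
    // primary-key / NOT NULL columns.
    val fits = value.precision - value.scale <= p - s
    println(s"truncated type = DECIMAL($p, $s), value fits: $fits") // value fits: false
  }
}
```
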

birdstorm avatar Oct 30 '18 09:10 birdstorm

Talked with the Spark team; we might consider a PR for a larger decimal range, as long as it stays compatible with older versions of Spark (in both behavior and performance). @birdstorm

ilovesoup avatar Nov 14 '18 14:11 ilovesoup

Is the Spark team still working on this?

shiyuhang0 avatar Apr 02 '22 08:04 shiyuhang0

/lifecycle frozen

shiyuhang0 avatar Apr 27 '22 09:04 shiyuhang0