spark-user mailing list archives

From 王长春 <251922...@qq.com>
Subject [The decimal result is incorrectly enlarged by 100 times]
Date Tue, 20 Oct 2020 15:09:04 GMT
Hi,
I have come across a problem with the correctness of Spark decimals, and I have been researching it for a few days. This problem is very curious.

My Spark version is 2.3.1.

I have a SQL statement like this:

CREATE TABLE table_S STORED AS ORC AS
SELECT a*b*c FROM table_a
UNION ALL
SELECT d FROM table_B
UNION ALL
SELECT e FROM table_C

Columns a, b, and c are all decimal(38,4).
Columns d and e are also decimal(38,4).

The result of this SQL is wrong: it is 100 times greater than the correct value.
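
A factor of exactly 100 looks like a decimal scale that is off by two digits somewhere in the plan. Multiplying three decimal(38,4) columns cannot keep the full result precision, so Spark's DecimalPrecision rules cap the product at precision 38 and shrink the scale (on 2.3.x this typically comes out as decimal(38,6), assuming the default spark.sql.decimalOperations.allowPrecisionLoss=true); the union then has to reconcile decimal(38,6) with decimal(38,4), and a scale difference of 2 is exactly a factor of 100. A minimal spark-shell sketch to see the promoted type (the literal values are made up, only the types matter):

    // Check what type a*b*c is promoted to.
    val df = spark.range(1).selectExpr(
      "cast(1.0 as decimal(38,4)) as a",
      "cast(1.0 as decimal(38,4)) as b",
      "cast(1.0 as decimal(38,4)) as c")

    // The product cannot keep scale 4+4+4 = 12 at precision 38, so the
    // DecimalPrecision rule adjusts it; printSchema shows the final type.
    df.selectExpr("a * b * c AS product").printSchema()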

The weird thing is: if I remove the CREATE TABLE clause, the result is correct.
And if I change the order of the UNION branches, the result is also correct.
E.g.:

CREATE TABLE table_S STORED AS ORC AS
SELECT d FROM table_B
UNION ALL
SELECT a*b*c FROM table_a
UNION ALL
SELECT e FROM table_C
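
If the branch order matters, the coerced output type of the UNION (or where the compensating casts are inserted) may differ between the two orderings. A quick spark-shell sketch to compare them (table names as in the queries above, assumed to exist):

    // Compare schemas and analyzed plans of the two branch orders; look at
    // the decimal types on each branch and where Cast nodes are added.
    val u1 = spark.sql(
      "SELECT a*b*c AS v FROM table_a UNION ALL SELECT d FROM table_B")
    val u2 = spark.sql(
      "SELECT d AS v FROM table_B UNION ALL SELECT a*b*c FROM table_a")
    u1.printSchema(); u1.explain(true)
    u2.printSchema(); u2.explain(true)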


Besides, Spark 2.3.2 gives the correct result in this case, but I checked all the
patches in 2.3.2 and could not find which one fixes this problem.
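
For hunting the fix, a self-contained reproduction that does not depend on the real tables can be run against 2.3.1 and 2.3.2 (or bisected commit by commit). A sketch, assuming a Hive-enabled spark-shell; every name and value below is invented for illustration:

    import spark.implicits._

    // One-row stand-ins with the same decimal(38,4) column types.
    spark.sql("DROP TABLE IF EXISTS repro_out")
    Seq(1).toDF("i").selectExpr(
      "cast(2.0 as decimal(38,4)) as a",
      "cast(3.0 as decimal(38,4)) as b",
      "cast(4.0 as decimal(38,4)) as c",
      "cast(5.0 as decimal(38,4)) as d",
      "cast(6.0 as decimal(38,4)) as e"
    ).createOrReplaceTempView("t")

    // Same shape as the failing query: CTAS into ORC over a UNION ALL
    // whose first branch is the decimal multiplication.
    spark.sql("""
      CREATE TABLE repro_out STORED AS ORC AS
      SELECT a*b*c AS v FROM t
      UNION ALL SELECT d FROM t
      UNION ALL SELECT e FROM t""")

    // Expect 24, 5, and 6; under the bug one value comes back 100x too large.
    spark.table("repro_out").show()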


Can anyone give some help? Has anyone encountered the same problem?



---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org

