spark-dev mailing list archives

From Marco Gaido <>
Subject Decimals
Date Tue, 12 Dec 2017 10:54:03 GMT
Hi all,

I have seen in recent weeks that there are a lot of problems related to decimal
values (SPARK-22036 and SPARK-22755, for instance). Some are related to
historical choices which I don't know about, so please excuse me if I am
saying something dumb:

 - why are we interpreting literal constants in queries as Decimal and not
as Double? I think it is very unlikely that a user will enter a number
beyond Double precision.
 - why are we returning null in case of precision loss? Is this approach
better than just giving a result which might lose some accuracy?
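The second question can be sketched outside Spark with plain java.math.BigDecimal. This is only an illustration of the trade-off, not Spark's actual code: the MAX_PRECISION constant and the two helper methods below are assumptions mirroring Spark SQL's 38-digit DecimalType cap.

```java
import java.math.BigDecimal;
import java.math.MathContext;
import java.math.RoundingMode;
import java.util.Optional;

public class DecimalDemo {
    // Assumption for this sketch: a 38-digit cap, like Spark SQL's
    // maximum DecimalType precision.
    static final int MAX_PRECISION = 38;

    // Policy 1 (current behavior being questioned): return empty
    // (Spark's null) when the exact result overflows the cap.
    static Optional<BigDecimal> strictMultiply(BigDecimal a, BigDecimal b) {
        BigDecimal exact = a.multiply(b);
        return exact.precision() > MAX_PRECISION
                ? Optional.empty()
                : Optional.of(exact);
    }

    // Policy 2 (the alternative suggested above): round to 38
    // significant digits, losing some accuracy instead of returning null.
    static BigDecimal lossyMultiply(BigDecimal a, BigDecimal b) {
        return a.multiply(b, new MathContext(MAX_PRECISION, RoundingMode.HALF_UP));
    }

    public static void main(String[] args) {
        // Two ~29-digit operands: their exact product needs well over
        // 38 significant digits, so the strict policy gives nothing.
        BigDecimal a = new BigDecimal("12345678901234567890.123456789");
        BigDecimal b = new BigDecimal("98765432109876543210.987654321");

        System.out.println("strict: " + strictMultiply(a, b)); // Optional.empty
        System.out.println("lossy:  " + lossyMultiply(a, b));  // rounded value
    }
}
```

Either policy is consistent; the question is which surprise is worse for users, a silent null or a silently rounded value.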
