spark-dev mailing list archives

From Reynold Xin <r...@databricks.com>
Subject Re: Decimals
Date Wed, 13 Dec 2017 08:08:30 GMT
Responses inline

On Tue, Dec 12, 2017 at 2:54 AM, Marco Gaido <marcogaido91@gmail.com> wrote:

> Hi all,
>
> I saw in these weeks that there are a lot of problems related to decimal
> values (SPARK-22036, SPARK-22755, for instance). Some are related to
> historical choices, which I don't know, thus please excuse me if I am
> saying dumb things:
>
>  - why are we interpreting literal constants in queries as Decimal and not
> as Double? I think it is very unlikely that a user can enter a number which
> is beyond Double precision.
>

Probably just to be consistent with some popular databases.
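To see why a decimal interpretation of literals can matter, here is a small conceptual illustration using Python's `decimal` module (not Spark code): a decimal preserves the literal digits exactly, while a binary double cannot represent a value like 0.1 exactly.

```python
from decimal import Decimal

# Parsing "0.1" as a decimal keeps the literal exact; the nearest
# binary double to 0.1 is slightly off, and the error shows up
# after a few arithmetic steps.
as_decimal = Decimal("0.1")
as_double = 0.1

print(as_decimal + as_decimal + as_decimal)  # exactly 0.3
print(as_double + as_double + as_double)     # 0.30000000000000004
```

The same reasoning applies to SQL literals: treating them as decimal keeps `0.1 + 0.1 + 0.1 = 0.3` true, which several databases rely on.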



>  - why are we returning null in case of precision loss? Is this approach
> better than just giving a result which might lose some accuracy?
>

The contract with decimal is that it should never lose precision (it was
created for financial reports, accounting, etc). Returning null at least
tells the user the data type can no longer support the precision required.
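As a hedged sketch of that contract (again using Python's `decimal` module rather than Spark internals), a context can be configured to trap any inexact result, so an operation either returns an exact value or signals that the precision cannot be kept, analogous in spirit to Spark returning null instead of a silently rounded result. The function name `exact_or_none` is hypothetical, purely for illustration:

```python
from decimal import Decimal, localcontext, Inexact

def exact_or_none(a: Decimal, b: Decimal, precision: int):
    """Multiply two decimals; return None if the product cannot be
    represented exactly within `precision` significant digits."""
    with localcontext() as ctx:
        ctx.prec = precision
        ctx.traps[Inexact] = True  # raise instead of rounding silently
        try:
            return a * b
        except Inexact:
            return None

print(exact_or_none(Decimal("1.5"), Decimal("2"), 4))         # Decimal('3.0')
print(exact_or_none(Decimal("1.2345"), Decimal("6.789"), 4))  # None
```

The second call returns None because the exact product (8.3810205) needs eight significant digits, more than the four the context allows.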



>
> Thanks,
> Marco
>
