spark-issues mailing list archives

From "Wenchen Fan (Jira)" <>
Subject [jira] [Assigned] (SPARK-28495) Introduce ANSI store assignment policy for table insertion
Date Tue, 27 Aug 2019 14:22:00 GMT


Wenchen Fan reassigned SPARK-28495:

    Assignee: Gengliang Wang

> Introduce ANSI store assignment policy for table insertion
> ----------------------------------------------------------
>                 Key: SPARK-28495
>                 URL:
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Gengliang Wang
>            Assignee: Gengliang Wang
>            Priority: Major
> In Spark version 2.4 and earlier, when inserting into a table, Spark casts the data
type of the input query to the data type of the target table by coercion. This can be
very confusing, e.g. when users make a mistake and write string values to an int column.
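
A minimal sketch of the legacy coercion behavior (the table and column names are hypothetical):

    -- Spark 2.4 coerces the string to int via an implicit cast;
    -- a non-numeric string silently becomes NULL instead of failing.
    CREATE TABLE t (i INT) USING parquet;
    INSERT INTO t VALUES ('1');      -- stored as 1
    INSERT INTO t VALUES ('hello');  -- stored as NULL, no error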
> In data source V2, by default, only upcasting is allowed when inserting data into a
table. E.g. int -> long and int -> string are allowed, while decimal -> double or
long -> int are not. The rules of UpCast were originally created for Dataset type
coercion; they are quite strict and differ from the behavior of all existing popular
DBMSs. This is a breaking change: existing queries may break after the 3.0 release.
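
A sketch of the strict upcast rules (hypothetical table names; the exact error text may differ):

    -- Allowed: int can be up-cast to long without loss of information.
    CREATE TABLE longs (l BIGINT) USING parquet;
    INSERT INTO longs VALUES (1);

    -- Rejected at analysis time: long -> int may lose information.
    CREATE TABLE ints (i INT) USING parquet;
    INSERT INTO ints SELECT l FROM longs;  -- AnalysisException: cannot up-cast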
> Following the ANSI SQL standard makes Spark consistent with the table insertion
behavior of popular DBMSs like PostgreSQL/Oracle/MySQL.
> For more details, see the discussion on
and .
> This task is to add the ANSI store assignment policy as a new option for the configuration.
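
In released Spark 3.0 this configuration is spark.sql.storeAssignmentPolicy, with values ANSI, LEGACY, and STRICT; a sketch of the ANSI policy's effect, assuming that name:

    -- Assumption: the option is spark.sql.storeAssignmentPolicy, as in Spark 3.0.
    SET spark.sql.storeAssignmentPolicy=ANSI;
    CREATE TABLE t (i INT) USING parquet;
    -- Allowed under ANSI: numeric -> numeric store assignment, e.g. long -> int.
    INSERT INTO t VALUES (CAST(1 AS BIGINT));
    -- Rejected at analysis time: string -> int is not a valid ANSI store assignment.
    INSERT INTO t VALUES ('hello');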

This message was sent by Atlassian Jira
