spark-issues mailing list archives

From "Udbhav Agrawal (Jira)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-29600) array_contains built in function is not backward compatible in 3.0
Date Fri, 25 Oct 2019 06:30:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-29600?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16959467#comment-16959467
] 

Udbhav Agrawal commented on SPARK-29600:
----------------------------------------

I will check this issue


> array_contains built in function is not backward compatible in 3.0
> ------------------------------------------------------------------
>
>                 Key: SPARK-29600
>                 URL: https://issues.apache.org/jira/browse/SPARK-29600
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>
> SELECT array_contains(array(0,0.1,0.2,0.3,0.5,0.02,0.033), .2); throws an exception in Spark 3.0, whereas it works fine in 2.3.2.
> Spark 3.0 output:
> 0: jdbc:hive2://10.18.19.208:23040/default> SELECT array_contains(array(0,0.1,0.2,0.3,0.5,0.02,0.033), .2);
> Error: org.apache.spark.sql.AnalysisException: cannot resolve 'array_contains(array(CAST(0 AS DECIMAL(13,3)), CAST(0.1BD AS DECIMAL(13,3)), CAST(0.2BD AS DECIMAL(13,3)), CAST(0.3BD AS DECIMAL(13,3)), CAST(0.5BD AS DECIMAL(13,3)), CAST(0.02BD AS DECIMAL(13,3)), CAST(0.033BD AS DECIMAL(13,3))), 0.2BD)' due to data type mismatch: Input to function array_contains should have been array followed by a value with same element type, but it's [array<decimal(13,3)>, decimal(1,1)].; line 1 pos 7;
> 'Project [unresolvedalias(array_contains(array(cast(0 as decimal(13,3)), cast(0.1 as decimal(13,3)), cast(0.2 as decimal(13,3)), cast(0.3 as decimal(13,3)), cast(0.5 as decimal(13,3)), cast(0.02 as decimal(13,3)), cast(0.033 as decimal(13,3))), 0.2), None)]
> +- OneRowRelation (state=,code=0)
>  
> Spark 2.3.2 output:
> 0: jdbc:hive2://10.18.18.214:23040/default> SELECT array_contains(array(0,0.1,0.2,0.3,0.5,0.02,0.033), .2);
> +-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--+
> | array_contains(array(CAST(0 AS DECIMAL(13,3)), CAST(0.1 AS DECIMAL(13,3)), CAST(0.2 AS DECIMAL(13,3)), CAST(0.3 AS DECIMAL(13,3)), CAST(0.5 AS DECIMAL(13,3)), CAST(0.02 AS DECIMAL(13,3)), CAST(0.033 AS DECIMAL(13,3))), CAST(0.2 AS DECIMAL(13,3)))  |
> +-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--+
> | true  |
> +-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+--+
> 1 row selected (0.18 seconds)
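The type mismatch in the 3.0 error can be reproduced outside Spark. The sketch below is plain Python, not Spark source; `decimal_literal_type` and `widest_decimal` are hypothetical helpers that approximate how a decimal literal gets its type and why the array settles on decimal(13,3) while the bare literal .2 stays decimal(1,1):

```python
from decimal import Decimal

def decimal_literal_type(text):
    # Roughly how a SQL engine types a decimal literal:
    # scale = digits after the point, precision = significant digits
    # (never less than the scale).
    _, digits, exponent = Decimal(text).as_tuple()
    scale = max(0, -exponent)
    return max(len(digits), scale), scale

def widest_decimal(types):
    # Union of decimal types: widest integer part plus widest scale.
    int_digits = max(p - s for p, s in types)
    scale = max(s for p, s in types)
    return int_digits + scale, scale

# The integer literal 0 is widened to DECIMAL(10,0); the fractional
# literals get types like DECIMAL(1,1) .. DECIMAL(3,3).
elements = [(10, 0)] + [decimal_literal_type(t)
                        for t in ("0.1", "0.2", "0.3", "0.5", "0.02", "0.033")]
print(widest_decimal(elements))      # common element type: (13, 3)
print(decimal_literal_type(".2"))    # the searched value: (1, 1)
```

Judging by the error message, 3.0's array_contains requires the value type to match the element type exactly, so decimal(1,1) against array<decimal(13,3)> fails, whereas the 2.3.2 output shows the analyzer inserting an implicit CAST on the searched value.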
>  
>  
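A plausible workaround (an untested sketch, not confirmed in this thread) is to cast the searched value to the array's element type explicitly, e.g. `SELECT array_contains(array(0,0.1,0.2,0.3,0.5,0.02,0.033), CAST(.2 AS DECIMAL(13,3)));`. The implicit cast that 2.3.2 applied can be mimicked in plain Python:

```python
from decimal import Decimal

def cast_decimal(value, scale):
    # Mimic CAST(x AS DECIMAL(p, s)): rescale to the target scale.
    return Decimal(value).quantize(Decimal(1).scaleb(-scale))

# Cast every element and the searched value to scale 3, as the 2.3.2
# analyzer did implicitly, and membership holds again.
arr = [cast_decimal(x, 3) for x in ("0", "0.1", "0.2", "0.3", "0.5", "0.02", "0.033")]
print(cast_decimal(".2", 3) in arr)  # True
```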



--
This message was sent by Atlassian Jira
(v8.3.4#803005)


