spark-issues mailing list archives

From "Priyanka Garg (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-24703) Unable to multiply calendar interval
Date Thu, 01 Nov 2018 15:48:00 GMT

     [ https://issues.apache.org/jira/browse/SPARK-24703?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Priyanka Garg updated SPARK-24703:
----------------------------------
    Priority: Critical  (was: Major)

> Unable to multiply calendar interval
> ------------------------------------
>
>                 Key: SPARK-24703
>                 URL: https://issues.apache.org/jira/browse/SPARK-24703
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: Priyanka Garg
>            Priority: Critical
>
> When I try to multiply a calendar interval by a long/int, I get the error below. The same syntax is supported in Postgres.
> spark.sql("select  interval '1' day * 3").show()
> org.apache.spark.sql.AnalysisException: cannot resolve '(3 * interval 1 days)' due to data type mismatch: differing types in '(interval 1 days) * 3' (int and calendarinterval).; line 1 pos 7;
> 'Project [unresolvedalias((interval 1 days * 3) , None)]
> +- OneRowRelation
>  
>   at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
>   at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:93)
>   at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$2.applyOrElse(CheckAnalysis.scala:85)
>   at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:289)
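
A minimal sketch, assuming a Spark 2.3.x spark-shell session in which spark is the shell's default SparkSession: it restates the failing expression and shows two workarounds that avoid multiplying a calendar interval by an integer.

// Assumption: Spark 2.3.x spark-shell, "spark" is the default SparkSession.

// Fails during analysis: the analyzer has no multiplication rule for
// (calendarinterval, int), producing the AnalysisException quoted above.
// spark.sql("select interval '1' day * 3").show()

// Workaround 1: write the already-multiplied interval as a literal.
spark.sql("select interval '3' day").show()

// Workaround 2: for day arithmetic, date_add takes an integer number of days,
// so interval multiplication is not needed at all.
spark.sql("select date_add(current_date(), 3)").show()

// For comparison, the equivalent expression does resolve in PostgreSQL:
//   SELECT interval '1 day' * 3;   -- returns '3 days'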



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

