spark-issues mailing list archives

From "Hyukjin Kwon (Jira)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-31873) Spark Sql Function year does not extract year from date/timestamp
Date Sun, 31 May 2020 11:26:00 GMT

    [ https://issues.apache.org/jira/browse/SPARK-31873?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17120509#comment-17120509 ]

Hyukjin Kwon commented on SPARK-31873:
--------------------------------------

This is likely fixed in the master branch by SPARK-26651, which switched the calendar from the
hybrid Julian/Gregorian calendar to the Proleptic Gregorian calendar. That change would be very
difficult to backport, as it introduces many behaviour changes.
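For context, the off-by-one year can be reproduced with plain JDK classes, without Spark. This is a sketch of the underlying calendar mismatch, assuming UTC throughout: `java.util.GregorianCalendar` uses the hybrid Julian/Gregorian calendar (Julian before the 1582 cutover), while `java.time.LocalDate` uses the Proleptic Gregorian calendar. The same instant therefore carries two different date labels in the year 1300, seven days apart, which is why Spark 2.4 can report 1299 for a proleptic-Gregorian 1300-01-03.

```java
import java.time.LocalDate;
import java.time.ZoneOffset;
import java.util.Calendar;
import java.util.GregorianCalendar;
import java.util.TimeZone;

public class CalendarMismatchDemo {
    public static void main(String[] args) {
        // Epoch millis for 1300-01-03T00:00Z in the Proleptic Gregorian calendar (java.time)
        long millis = LocalDate.of(1300, 1, 3)
                .atStartOfDay(ZoneOffset.UTC)
                .toInstant()
                .toEpochMilli();

        // Interpret the very same instant with the hybrid Julian/Gregorian calendar,
        // which java.util.GregorianCalendar uses for dates before 1582-10-15
        GregorianCalendar hybrid = new GregorianCalendar(TimeZone.getTimeZone("UTC"));
        hybrid.setTimeInMillis(millis);

        System.out.println("proleptic Gregorian date: 1300-01-03");
        System.out.println("hybrid-calendar year: " + hybrid.get(Calendar.YEAR));
        System.out.println("hybrid-calendar month/day: "
                + (hybrid.get(Calendar.MONTH) + 1) + "/"
                + hybrid.get(Calendar.DAY_OF_MONTH));
    }
}
```

Run as-is, the hybrid calendar labels the instant 1299-12-27, so a year lookup through the old calendar yields 1299 rather than 1300, matching the behaviour reported in this issue.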

> Spark Sql Function year does not extract year from date/timestamp
> -----------------------------------------------------------------
>
>                 Key: SPARK-31873
>                 URL: https://issues.apache.org/jira/browse/SPARK-31873
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.5
>            Reporter: Deepak Shingavi
>            Priority: Major
>
> There is a Spark SQL function
> org.apache.spark.sql.functions.year which fails in below case
>  
> {code:java}
> // Code to extract the year from a timestamp (Scala, Spark 2.4.5)
> import org.apache.spark.sql.functions.{col, to_timestamp, year}
> import spark.implicits._
>
> val df = Seq(
>   ("1300-01-03 00:00:00")
> ).toDF("date_val")
>   .withColumn("date_val_ts", to_timestamp(col("date_val")))
>   .withColumn("year_val", year(col("date_val_ts")))
> df.show()
> // Output of the above code
> //Output of the above code
> +-------------------+-------------------+--------+
> |           date_val|        date_val_ts|year_val|
> +-------------------+-------------------+--------+
> |1300-01-03 00:00:00|1300-01-03 00:00:00|    1299|
> +-------------------+-------------------+--------+
> {code}
>  
> The above code works correctly for all years after 1300.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

