[ https://issues.apache.org/jira/browse/SPARK-26962 ]
Shiva Sankari Perambalam updated SPARK-26962:
---------------------------------------------
Description:
Using the LEAD window function on a DATETIME column gives inconsistent results in Spark SQL.
{code:java}
Lead(date) over (partition by id, code order by date){code}
where date is a DATETIME column, and id and code are Strings.
{code:java}
val testdf1 = sparkSession.sql(s"""select date, lead(date) over (partition by id, code order by date) as lead_date from foo"""){code}
The lead_date column sometimes contains the same value as date instead of the next date in the partition.
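A minimal, self-contained sketch of a possible reproduction follows; the foo view, its schema, and the sample rows are assumptions inferred from the query above, not part of the original report:
{code:java}
// Hypothetical reproduction sketch: builds a temporary view "foo" with the
// schema implied by the query above (id: String, code: String, date: Timestamp)
// and runs the same LEAD window expression. Names and sample data are illustrative.
import java.sql.Timestamp
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("lead-repro").master("local[*]").getOrCreate()
import spark.implicits._

Seq(
  ("1", "A", Timestamp.valueOf("2019-01-01 00:00:00")),
  ("1", "A", Timestamp.valueOf("2019-01-02 00:00:00")),
  ("1", "A", Timestamp.valueOf("2019-01-03 00:00:00"))
).toDF("id", "code", "date").createOrReplaceTempView("foo")

val testdf1 = spark.sql(
  """select date, lead(date) over (partition by id, code order by date) as lead_date
    |from foo""".stripMargin)

// Expected: each lead_date is the next date within the (id, code) partition
// (null for the last row); the report is that lead_date sometimes equals date.
testdf1.show(false)
{code}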
was:
Using the LEAD window function on a DATETIME column gives inconsistent results in Spark SQL.
{code:java}
Lead(date) over (partition by id, code order by date){code}
where date is a DATETIME column, and id and code are Strings.
{code:java}
val testdf1 = sparkSession.sql(s"""select date, lead(date) over (partition by id, code order by date) as lead_date from <SOME_VIEW>"""){code}
The lead_date column sometimes contains the same value as date instead of the next date in the partition.
> Window Function LEAD in Spark SQL is not fetching consistent results.
> ----------------------------------------------------------------------
>
> Key: SPARK-26962
> URL: https://issues.apache.org/jira/browse/SPARK-26962
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.4.0
> Reporter: Shiva Sankari Perambalam
> Priority: Major
>
> Using the LEAD window function on a DATETIME column gives inconsistent results in Spark SQL.
> {code:java}
> Lead(date) over (partition by id, code order by date){code}
> where date is a DATETIME column, and id and code are Strings.
> {code:java}
> val testdf1 = sparkSession.sql(s"""select date, lead(date) over (partition by id, code order by date) as lead_date from foo"""){code}
> The lead_date column sometimes contains the same value as date instead of the next date in the partition.
>
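For comparison, the same window expression can also be written with the DataFrame API. A minimal sketch under the same assumptions as above (the w and leadDf names are illustrative, and the "foo" view is the one assumed in the reproduction sketch):
{code:java}
// Equivalent DataFrame-API form of the query above, using the same window
// spec (partition by id and code, order by date). Assumes a SparkSession
// `spark` and the temporary view "foo" from the sketch above.
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions.{col, lead}

val w = Window.partitionBy("id", "code").orderBy("date")
val leadDf = spark.table("foo")
  .select(col("date"), lead(col("date"), 1).over(w).as("lead_date"))
leadDf.show(false)
{code}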