spark-user mailing list archives

From anu <anamika.guo...@gmail.com>
Subject Re: SparkSQL Timestamp query failure
Date Mon, 30 Mar 2015 11:29:49 GMT
Hi Alessandro

Could you specify which query you were able to run successfully?

1. sqlContext.sql("SELECT * FROM Logs as l where l.timestamp = '2012-10-08
16:10:36'").collect

OR

2. sqlContext.sql("SELECT * FROM Logs as l where cast(l.timestamp as string)
= '2012-10-08 16:10:36.0'").collect

I am able to run only the second query, i.e. the one with the timestamp cast
to a string. What is the point of even parsing my data into timestamp values
when I can't do >= and <= comparisons on them?

In the above query I am ultimately doing string comparisons, while what I
actually want is comparisons on timestamp values.
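As an aside, my understanding of why the string-cast comparison still gives correct results: timestamps rendered in the zero-padded 'yyyy-MM-dd HH:mm:ss' format sort lexicographically in the same order as they do chronologically. A small plain-Scala sketch (no Spark needed, just java.sql.Timestamp from the JDK) illustrating this:

```scala
import java.sql.Timestamp

// Two sample timestamps in the zero-padded format Spark SQL uses.
val earlier = "2012-10-08 16:10:36"
val later   = "2012-10-09 00:00:00"

// Chronological comparison on real Timestamp values...
val chronological = Timestamp.valueOf(earlier).before(Timestamp.valueOf(later))

// ...agrees with lexicographic comparison on the string form.
val lexicographic = earlier < later

println(s"chronological=$chronological lexicographic=$lexicographic")
// prints "chronological=true lexicographic=true"
```

This only holds because every field is zero-padded to a fixed width; it is a workaround, not a substitute for real timestamp comparison.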

*My Spark version is 1.1.0*

Could somebody please clarify why I am not able to perform queries like

Select * from table1 where endTime >= '2015-01-01 00:00:00' and endTime <=
'2015-01-10 00:00:00'

without getting anything in the output.

Even the following doesn't work:
Select * from table1 where endTime >=  CAST('2015-01-01 00:00:00' as
timestamp) and endTime <= CAST('2015-01-10 00:00:00' as timestamp)

I get this error: *java.lang.RuntimeException: [1.99] failure:
``STRING'' expected but identifier timestamp found*
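In case it helps anyone hitting the same parser error: my understanding (an assumption on my part, not verified against the 1.1.0 source) is that the basic SQLContext parser in 1.1.0 only accepts CAST to STRING, while HiveContext runs queries through the HiveQL parser, which does accept CAST(... AS timestamp). A sketch of that workaround, assuming an existing SparkContext `sc` and the `table1`/`endTime` names from the example above:

```scala
import org.apache.spark.sql.hive.HiveContext

// Sketch only, assuming Spark 1.1.0, an existing SparkContext `sc`,
// and a table `table1` with a timestamp column `endTime` already
// registered. HiveContext parses queries as HiveQL, where
// CAST(... AS timestamp) is legal syntax.
val hiveContext = new HiveContext(sc)

val rows = hiveContext.sql(
  """SELECT * FROM table1
    |WHERE endTime >= CAST('2015-01-01 00:00:00' AS timestamp)
    |  AND endTime <= CAST('2015-01-10 00:00:00' AS timestamp)""".stripMargin
).collect()
```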

Thanks



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/SparkSQL-Timestamp-query-failure-tp19502p22292.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

