spark-issues mailing list archives

From "Apache Spark (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-4120) Join of multiple tables with syntax like SELECT .. FROM T1,T2,T3.. does not work in SparkSQL
Date Wed, 29 Oct 2014 03:01:34 GMT

    [ https://issues.apache.org/jira/browse/SPARK-4120?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14187929#comment-14187929 ]

Apache Spark commented on SPARK-4120:
-------------------------------------

User 'ravipesala' has created a pull request for this issue:
https://github.com/apache/spark/pull/2987

> Join of multiple tables with syntax like SELECT .. FROM T1,T2,T3.. does not work in SparkSQL
> --------------------------------------------------------------------------------------------
>
>                 Key: SPARK-4120
>                 URL: https://issues.apache.org/jira/browse/SPARK-4120
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Ravindra Pesala
>            Assignee: Ravindra Pesala
>             Fix For: 1.2.0
>
>
> Queries with more than two tables in the FROM list do not work.
> {code}
> sql("SELECT * FROM records1 as a,records2 as b,records3 as c where a.key=b.key and a.key=c.key")
> {code}
> The above query fails with the following exception (a workaround sketch follows the quoted report below).
> {code}
> Exception in thread "main" java.lang.RuntimeException: [1.40] failure: ``UNION'' expected but `,' found
> SELECT * FROM records1 as a,records2 as b,records3 as c where a.key=b.key and a.key=c.key
>                                        ^
> 	at scala.sys.package$.error(package.scala:27)
> 	at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(SparkSQLParser.scala:33)
> 	at org.apache.spark.sql.SQLContext$$anonfun$1.apply(SQLContext.scala:75)
> {code}
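
A possible workaround (not part of the original report; a sketch assuming records1, records2 and records3 are already registered as temporary tables on an existing SQLContext named sqlContext) is to rewrite the comma-separated FROM list as explicit JOIN ... ON clauses, which the Spark SQL parser does accept:

{code}
// Workaround sketch (hypothetical setup): express the comma-separated FROM list
// as explicit JOIN ... ON clauses instead of comma-separated relations.
// Assumes `sqlContext` is an existing org.apache.spark.sql.SQLContext and that
// records1, records2 and records3 are registered temporary tables on it.
val joined = sqlContext.sql(
  "SELECT * FROM records1 AS a " +
  "JOIN records2 AS b ON a.key = b.key " +
  "JOIN records3 AS c ON a.key = c.key")

// Materialize and print the joined rows.
joined.collect().foreach(println)
{code}

The pull request linked above targets the parser itself, so once it is merged the original comma-separated form should parse without rewriting.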



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

