spark-issues mailing list archives

From "Cheng Lian (JIRA)" <j...@apache.org>
Subject [jira] [Created] (SPARK-5327) HiveCompatibilitySuite fails when executed against Hive 0.12.0
Date Mon, 19 Jan 2015 23:36:34 GMT
Cheng Lian created SPARK-5327:
---------------------------------

             Summary: HiveCompatibilitySuite fails when executed against Hive 0.12.0
                 Key: SPARK-5327
                 URL: https://issues.apache.org/jira/browse/SPARK-5327
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 1.3.0, 1.2.1
            Reporter: Cheng Lian


Git commit: e7884bc950a374408959b6118efe2c62fbe50608

Run the following SBT session to reproduce:
{code}
$ ./build/sbt -Pyarn,hive,hive-thriftserver,hive-0.12.0,hadoop-2.4,scala-2.10 -Dhadoop.version=2.4.1
...
> hive/test-only *.HiveCompatibilitySuite -- -z create_view_translate
...
[info] - create_view_translate *** FAILED *** (9 seconds, 216 milliseconds)
[info]   Failed to execute query using catalyst:
[info]   Error: Failed to parse: SELECT `items`.`id`, items`items`.`info`info['price'] FROM
`default`.`items`
[info]   org.apache.spark.sql.hive.HiveQl$ParseException: Failed to parse: SELECT `items`.`id`,
items`items`.`info`info['price'] FROM `default`.`items`
[info]          at org.apache.spark.sql.hive.HiveQl$.createPlan(HiveQl.scala:249)
[info]          at org.apache.spark.sql.hive.HiveQl$.createPlanForView(HiveQl.scala:275)
[info]          at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:151)
...
{code}
It seems that something goes wrong when dealing with nested fields. Hive 0.13.1 is fine.
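For context, a minimal sketch of the kind of statements that appear to trigger this (table and view names here are hypothetical, with the schema inferred from the error message, which shows a map-style {{info\['price'\]}} access on the {{items}} table):

{code}
-- Hypothetical minimal reproduction: a table with a map-typed column
-- and a view that indexes into the map.
CREATE TABLE items (id INT, info MAP<STRING, STRING>);
CREATE VIEW priceview AS SELECT items.id, items.info['price'] FROM items;

-- Resolving the view requires re-parsing its expanded text; per the
-- stack trace above, that expanded text comes back malformed
-- (e.g. "items`items`.`info`info['price']") and fails to parse.
SELECT * FROM priceview;
{code}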

Some other test cases also fail when executed against Hive 0.12.0. Will list them later.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

