Andrew Or created SPARK-15511:
---------------------------------
Summary: Dropping data source table succeeds but throws exception
Key: SPARK-15511
URL: https://issues.apache.org/jira/browse/SPARK-15511
Project: Spark
Issue Type: Bug
Components: SQL
Affects Versions: 2.0.0
Reporter: Andrew Or
{code}
scala> sql("CREATE TABLE boxes (width INT, length INT, height INT) USING CSV")
{code}
{code}
scala> sql("DROP TABLE boxes")
16/05/24 13:30:50 WARN DropTableCommand: org.apache.spark.sql.AnalysisException: Path does
not exist: file:/user/hive/warehouse/boxes;
com.google.common.util.concurrent.UncheckedExecutionException: org.apache.spark.sql.AnalysisException:
Path does not exist: file:/user/hive/warehouse/boxes;
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:4882)
at com.google.common.cache.LocalCache$LocalLoadingCache.apply(LocalCache.java:4898)
at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:170)
...
Caused by: org.apache.spark.sql.AnalysisException: Path does not exist: file:/user/hive/warehouse/boxes;
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:317)
at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$12.apply(DataSource.scala:306)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
at scala.collection.immutable.List.flatMap(List.scala:344)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
at org.apache.spark.sql.hive.HiveMetastoreCatalog$$anon$1.load(HiveMetastoreCatalog.scala:133)
at org.apache.spark.sql.hive.HiveMetastoreCatalog$$anon$1.load(HiveMetastoreCatalog.scala:69)
{code}
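The stack trace suggests that `DropTableCommand` looks the table up via `HiveMetastoreCatalog.lookupRelation`, whose Guava `LoadingCache` loader calls `DataSource.resolveRelation` and validates the data path, so a freshly created CSV table with no directory on disk makes the drop throw even though the metastore drop itself succeeds. Below is a minimal, self-contained Scala sketch of that pattern (no Spark or Guava dependency; `SimpleCatalog`, `pathExists`, and both `drop*` methods are hypothetical names, not Spark APIs), contrasting an eager drop that loads through the cache with one that only invalidates the cached entry:

```scala
import scala.collection.mutable

// Stand-in for org.apache.spark.sql.AnalysisException.
final case class AnalysisException(msg: String) extends Exception(msg)

// pathExists is injected so the "missing warehouse directory" case is easy to simulate.
class SimpleCatalog(pathExists: String => Boolean) {
  private val cache = mutable.Map.empty[String, String]

  // Mimics the cache loader in the trace (DataSource.resolveRelation):
  // resolving a relation fails if the table's path does not exist.
  private def load(table: String): String = {
    val path = s"file:/user/hive/warehouse/$table"
    if (!pathExists(path)) throw AnalysisException(s"Path does not exist: $path;")
    path
  }

  // Eager drop: resolves (and therefore validates) the relation first,
  // so dropping a path-less table surfaces the AnalysisException.
  def dropEager(table: String): Unit = {
    cache.getOrElseUpdate(table, load(table)) // triggers the path check
    cache.remove(table)
  }

  // Alternative: invalidate the cache entry without ever loading it.
  def dropLazy(table: String): Unit = cache.remove(table)
}
```

Under this sketch, `new SimpleCatalog(_ => false).dropEager("boxes")` throws the same `Path does not exist` error, while `dropLazy("boxes")` completes quietly; one plausible fix along these lines is for the drop path to skip relation resolution entirely.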
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)