spark-issues mailing list archives

From "Sudhakar Thota (JIRA)" <j...@apache.org>
Subject [jira] [Commented] (SPARK-8333) Spark failed to delete temp directory created by HiveContext
Date Wed, 05 Aug 2015 20:01:04 GMT

    [ https://issues.apache.org/jira/browse/SPARK-8333?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14658776#comment-14658776 ]

Sudhakar Thota commented on SPARK-8333:
---------------------------------------

The above statements and scripts were run on OS X.
Please run the script on Windows 7 and let me know; I can explore further if it fails.

Thanks
Sudhakar Thota

> Spark failed to delete temp directory created by HiveContext
> ------------------------------------------------------------
>
>                 Key: SPARK-8333
>                 URL: https://issues.apache.org/jira/browse/SPARK-8333
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>         Environment: Windows7 64bit
>            Reporter: sheng
>            Priority: Minor
>              Labels: Hive, metastore, sparksql
>
> Spark 1.4.0 failed to stop the SparkContext.
> {code:title=LocalHiveTest.scala|borderStyle=solid}
>  val sc = new SparkContext("local", "local-hive-test", new SparkConf())
>  val hc = Utils.createHiveContext(sc)
>  ... // execute some HiveQL statements
>  sc.stop()
> {code}
> sc.stop() failed to execute; it threw the following exception:
> {quote}
> 15/06/13 03:19:06 INFO Utils: Shutdown hook called
> 15/06/13 03:19:06 INFO Utils: Deleting directory C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> 15/06/13 03:19:06 ERROR Utils: Exception while deleting Spark temp dir: C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> java.io.IOException: Failed to delete: C:\Users\moshangcheng\AppData\Local\Temp\spark-d6d3c30e-512e-4693-a436-485e2af4baea
> 	at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:963)
> 	at org.apache.spark.util.Utils$$anonfun$1$$anonfun$apply$mcV$sp$5.apply(Utils.scala:204)
> 	at org.apache.spark.util.Utils$$anonfun$1$$anonfun$apply$mcV$sp$5.apply(Utils.scala:201)
> 	at scala.collection.mutable.HashSet.foreach(HashSet.scala:79)
> 	at org.apache.spark.util.Utils$$anonfun$1.apply$mcV$sp(Utils.scala:201)
> 	at org.apache.spark.util.SparkShutdownHook.run(Utils.scala:2292)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1$$anonfun$apply$mcV$sp$1.apply(Utils.scala:2262)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1772)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply$mcV$sp(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anonfun$runAll$1.apply(Utils.scala:2262)
> 	at scala.util.Try$.apply(Try.scala:161)
> 	at org.apache.spark.util.SparkShutdownHookManager.runAll(Utils.scala:2262)
> 	at org.apache.spark.util.SparkShutdownHookManager$$anon$6.run(Utils.scala:2244)
> 	at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:54)
> {quote}
> It seems this bug was introduced by SPARK-6907, which creates a local Hive metastore in a temp directory. The problem
> is that the local Hive metastore is not shut down correctly: at the end of the application, when SparkContext.stop()
> is called, it tries to delete the temp directory, which is still in use by the local Hive metastore, and throws an exception.
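
A minimal workaround sketch (untested here, and assuming the default embedded Derby metastore): ask Derby to shut down before calling sc.stop(), so the file locks it holds under the temp directory are released first. Derby signals a successful system shutdown by throwing an SQLException with SQLState XJ015, so that exception has to be swallowed.

{code:title=DerbyShutdownSketch.scala|borderStyle=solid}
import java.sql.{DriverManager, SQLException}
import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext("local", "local-hive-test", new SparkConf())
val hc = Utils.createHiveContext(sc) // reporter's helper, as in the snippet above
// ... execute some HiveQL statements

// Shut down the entire embedded Derby system before stopping Spark;
// Derby reports a clean shutdown by throwing SQLException with SQLState "XJ015".
try {
  DriverManager.getConnection("jdbc:derby:;shutdown=true")
} catch {
  case e: SQLException if e.getSQLState == "XJ015" => // expected on clean shutdown
}

sc.stop() // the temp directory should now be deletable on Windows
{code}

Whether this releases the handles early enough depends on how the Hive client caches its metastore connection, so treat it as a starting point for the Windows 7 run rather than a confirmed fix.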




