spark-user mailing list archives

From Chen Jingci <>
Subject RE: Problem with "delete spark temp dir" on spark 0.8.1
Date Tue, 04 Mar 2014 11:11:54 GMT
Hi, I also encountered the same problem when running locally, but when I run on the cluster everything
is fine. When I ran locally again without the jars parameter, the exception disappeared.

Best regards,
Chen jingci
--sent from phone, sorry for the typo

-----Original Message-----
From: "goi cto" <>
Sent: ‎4/‎3/‎2014 17:55
To: "" <>
Subject: Re: Problem with "delete spark temp dir" on spark 0.8.1

Exception in thread "delete Spark temp dir C:\Users\..." failed to delete:
 at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:495)
 at org.apache.spark.util.Utils$$anonfun$deleteRecursively$1.apply(Utils.scala:491)

I deleted my temp dir as suggested, and indeed all spark-... directories were deleted. After
that I ran the program again and got the same error. A spark-... directory containing
"simple-project-1.0.jar" was indeed left on the file system.
I had no problem deleting it once the program completed.
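The behavior above (the jar directory can be deleted by hand after the JVM exits, but not by the shutdown hook while it runs) is consistent with Windows file locking: a file that is still open, such as a jar on the running application's classpath, cannot be deleted until the handle is released. A minimal sketch of a recursive delete that fails loudly on the first undeletable entry, similar in spirit to the Utils.deleteRecursively in the stack trace (the class and helper names here are my own, not Spark's):

```java
import java.io.File;
import java.io.IOException;

public class DeleteRecursively {

    // Delete a file or a directory tree bottom-up. Throws on the first
    // entry that cannot be removed; on Windows that is typically a file
    // some process (including the current JVM) still holds open.
    public static void deleteRecursively(File file) throws IOException {
        if (file.isDirectory()) {
            File[] children = file.listFiles();
            if (children != null) {
                for (File child : children) {
                    deleteRecursively(child);
                }
            }
        }
        if (!file.delete() && file.exists()) {
            throw new IOException("failed to delete: " + file.getAbsolutePath());
        }
    }

    public static void main(String[] args) throws IOException {
        // Build a small throwaway tree under the temp dir, then delete it.
        File dir = new File(System.getProperty("java.io.tmpdir"), "spark-demo-delete");
        new File(dir, "sub").mkdirs();
        new File(dir, "sub/app.jar").createNewFile();
        deleteRecursively(dir);
        System.out.println("deleted: " + !dir.exists());
    }
}
```

If the jar under `sub/` were still open in this or another process, the `file.delete()` call would return false on Windows and the exception would name that path, which matches the "failed to delete" message in the thread.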


On Tue, Mar 4, 2014 at 11:36 AM, Akhil Das <> wrote:


Try cleaning your temp dir (System.getProperty("java.io.tmpdir")).
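To see what would need cleaning, one can list the leftover spark-* work directories under the JVM's temp dir. A small sketch (the class name and the spark- prefix filter are my assumptions about what the leftovers look like, based on the paths quoted earlier in this thread):

```java
import java.io.File;

public class ListSparkTempDirs {
    public static void main(String[] args) {
        // java.io.tmpdir is where the JVM, and hence Spark's scratch
        // files, default to on the local machine.
        File tmp = new File(System.getProperty("java.io.tmpdir"));

        // Keep only entries whose names start with "spark-".
        File[] leftovers = tmp.listFiles((dir, name) -> name.startsWith("spark-"));
        if (leftovers != null) {
            for (File d : leftovers) {
                System.out.println(d.getAbsolutePath());
            }
        }
    }
}
```

Anything this prints after all Spark programs have exited is safe to remove by hand, which matches the observation in the thread that the directories delete cleanly once the JVM is gone.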

Also, can you paste a longer stack trace?

Best Regards

On Tue, Mar 4, 2014 at 2:55 PM, goi cto <> wrote:


I am running a Spark Java program on a local machine. When I try to write the output to a
file (RDD.saveAsTextFile) I get this exception:

Exception in thread "Delete Spark temp dir ..."

This is running on my local Windows machine.

Any ideas?


Eran | CTO 


Best Regards


Eran | CTO 