spark-user mailing list archives

From Taeyun Kim <>
Subject Spark does not delete temporary directories
Date Thu, 07 May 2015 06:39:32 GMT


After a Spark program completes, three temporary directories remain in
the temp directory.

The directory names look like this: spark-2e389487-40cc-4a82-a5c7-353c0feefbb7


And when the Spark program runs on Windows, a snappy DLL file also remains in the
temp directory.

The file name is like this:


They are created every time the Spark program runs, so the number of files
and directories keeps growing.


How can I get them deleted?
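In the meantime, I am considering a periodic cleanup script along these lines. This is only a workaround sketch of my own, not an official Spark mechanism; it assumes the directories land in the default temp location (e.g. /tmp on Linux) and relies on the spark-&lt;uuid&gt; prefix seen above. The one-day age cutoff is an assumption meant to avoid touching a currently running job:

```shell
# Workaround sketch: delete leftover spark-* temp directories.
# Assumptions: the spark-<uuid> naming seen above, and that anything
# untouched for more than a day belongs to a finished job.
cleanup_spark_tmp() {
  local tmp_dir="${1:-/tmp}"   # temp location; pass another path if needed
  # -maxdepth 1: only look at top-level entries in the temp directory.
  # -mtime +0: only entries last modified more than 24 hours ago,
  # so directories of a currently running job are left alone.
  find "$tmp_dir" -maxdepth 1 -name 'spark-*' -type d -mtime +0 \
    -exec rm -rf {} +
}
```

Something similar with the snappy DLL name pattern could handle the leftover file on Windows, though the find/rm invocation would need a Windows equivalent there.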


Spark version is 1.3.1 with Hadoop 2.6.




