spark-user mailing list archives

From Roman Pastukhov <metaignat...@gmail.com>
Subject Is shutting down of SparkContext optional?
Date Wed, 19 Mar 2014 14:45:48 GMT
Hi,

After switching from Spark 0.8.0 to Spark 0.9.0 (and to Scala 2.10), one
application started hanging after the main thread finishes (in 'local[2]'
mode, without a cluster).

Adding SparkContext.stop() at the end solves this.
Is this behavior normal? Is shutting down the SparkContext required?
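For reference, a minimal sketch of the workaround (the object name and the
trivial job are illustrative; 'local[2]' matches the mode described above):

    import org.apache.spark.SparkContext

    object StopExample {
      def main(args: Array[String]) {
        val sc = new SparkContext("local[2]", "StopExample")

        // Stand-in for the real workload.
        val sum = sc.parallelize(1 to 100).reduce(_ + _)
        println("sum = " + sum)

        // Without this, the JVM can stay alive after main() returns,
        // since Spark's non-daemon threads are still running.
        sc.stop()
      }
    }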
