spark-user mailing list archives

From Pola Yao <pola....@gmail.com>
Subject How to force-quit a Spark application?
Date Tue, 15 Jan 2019 21:32:44 GMT
I submitted a Spark job through the ./spark-submit command. The code
executed successfully; however, the application got stuck when trying to
quit Spark.

My code snippet:
'''
import java.util.concurrent.Executors
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

val spark = SparkSession.builder.master(...).getOrCreate

val pool = Executors.newFixedThreadPool(3)
implicit val xc = ExecutionContext.fromExecutorService(pool)

// train1, train2, train3 each return a Future wrapping data reading,
// feature engineering, and machine-learning steps
val taskList = List(train1, train2, train3)
val results = Await.result(Future.sequence(taskList), 20.minutes)

println("Shutting down pool and executor service")
pool.shutdown()
xc.shutdown()

println("Exiting spark")
spark.stop()
'''

After I submitted the job, I could see from the terminal that the code
ran and printed "Exiting spark"; however, after printing that line it
never exited Spark, it just got stuck.

Does anybody know what the reason is? Or how to force it to quit?
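For what it's worth, the workaround I am considering is to build the
pool from daemon threads, wait for in-flight tasks, and force the JVM
down as a last resort. This is only a sketch, assuming the hang comes
from non-daemon threads keeping the JVM alive; daemonFactory is my own
name and is not part of the snippet above:

'''
import java.util.concurrent.{Executors, ThreadFactory, TimeUnit}
import scala.concurrent.ExecutionContext

// Daemon threads cannot keep the JVM alive once the main thread is done.
val daemonFactory = new ThreadFactory {
  def newThread(r: Runnable): Thread = {
    val t = new Thread(r)
    t.setDaemon(true)
    t
  }
}

val pool = Executors.newFixedThreadPool(3, daemonFactory)
implicit val xc = ExecutionContext.fromExecutorService(pool)

// ... run the futures as before ...

pool.shutdown()
pool.awaitTermination(1, TimeUnit.MINUTES) // let in-flight tasks finish
spark.stop()
sys.exit(0) // last resort: terminate the JVM explicitly
'''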

Thanks!
