spark-user mailing list archives

From Shushant Arora <shushantaror...@gmail.com>
Subject spark on yarn
Date Tue, 14 Jul 2015 16:57:03 GMT
I am running a Spark application on a YARN-managed cluster.

When I specify --executor-cores > 4, the application fails to start.
I am starting the app as:

spark-submit --class classname --num-executors 10 --executor-cores 5 --master masteradd jarname

Exception in thread "main" org.apache.spark.SparkException: Yarn
application has already ended! It might have been killed or unable to
launch application master.

When I give --executor-cores as 4, it works fine.

My cluster has 10 nodes.
Why am I not able to specify more than 4 concurrent tasks per executor? Is there a maximum
limit on the YARN side or the Spark side that I can override to make use of more
tasks?
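(For reference, a likely YARN-side cap, assuming default settings: yarn.scheduler.maximum-allocation-vcores limits how many vcores a single container, such as a Spark executor, may request, and it defaults to 4. That would explain a 5-core executor request being rejected before the application master launches. A sketch of the override in yarn-site.xml on the ResourceManager; the value 8 is illustrative, not a recommendation:

    <property>
      <!-- maximum vcores a single container request may ask for; defaults to 4 -->
      <name>yarn.scheduler.maximum-allocation-vcores</name>
      <value>8</value>
    </property>
    <property>
      <!-- vcores each NodeManager advertises to the ResourceManager; defaults to 8 -->
      <name>yarn.nodemanager.resource.cpu-vcores</name>
      <value>8</value>
    </property>

The NodeManager property matters as well, since a container cannot be granted more vcores than its node advertises.)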
