spark-user mailing list archives

From Jeff Zhang <zjf...@gmail.com>
Subject Re: Dynamic allocation Spark
Date Fri, 26 Feb 2016 11:33:13 GMT
Check the RM UI to make sure you have available resources. I suspect YARN
wasn't configured correctly, so the NodeManager didn't start properly and
you have no resources.
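
The same checks can be done from the command line. A sketch, assuming the Hadoop `yarn` CLI is on the PATH of a node in the cluster, using the application ID from the original post:

```shell
# List all NodeManagers and their state; if the list is empty, or nodes show
# as UNHEALTHY/LOST, the NM did not register and the RM has no resources.
yarn node -list -all

# Show the report for the stuck application; its diagnostics often say
# why it is still waiting in the ACCEPTED state.
yarn application -status application_1456482268159_0001
```

If `yarn node -list -all` shows no registered nodes, check the NodeManager logs for a startup failure; a misconfigured aux-service (for example a typo in the spark_shuffle class name, or the shuffle jar missing from the NM classpath) will prevent the NM from starting at all.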

On Fri, Feb 26, 2016 at 7:14 PM, alvarobrandon <alvarobrandon@gmail.com>
wrote:

> Hello everyone:
>
> I'm trying the dynamic allocation in Spark with YARN. I have followed the
> following configuration steps:
> 1. Copy the spark-*-yarn-shuffle.jar to the nodemanager classpath. "cp
> /opt/spark/lib/spark-*-yarn-shuffle.jar /opt/hadoop/share/hadoop/yarn"
> 2. Added the shuffle service of spark in yarn-site.xml
> <property>
>     <name>yarn.nodemanager.aux-services</name>
>     <value>mapreduce_shuffle,spark_shuffle</value>
>     <description>shuffle implementation</description>
>   </property>
> 3. Enabled the class for the shuffle service in yarn-site.xml
>   <property>
>     <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
>     <value>org.apache.spark.network.yarn.YarnShuffleService</value>
>     <description>enable the class for dynamic allocation</description>
>   </property>
> 4. Activated the dynamic allocation in the spark defaults
> spark.dynamicAllocation.enabled         true
> spark.shuffle.service.enabled   true
>
> When I launch my application it just stays in the ACCEPTED state in the
> queue, but it never actually runs.
> 16/02/26 11:11:46 INFO yarn.Client: Application report for
> application_1456482268159_0001 (state: ACCEPTED)
>
> Am I missing something?
>
> Thanks in advance as always
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Dynamic-allocation-Spark-tp26344.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>
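
For reference, step 4 above is the minimum; dynamic allocation is usually paired with executor bounds in spark-defaults.conf. A sketch with illustrative values (the bounds and timeout below are assumptions for tuning, not required settings):

```
spark.dynamicAllocation.enabled              true
spark.shuffle.service.enabled                true
# Optional tuning (illustrative values):
spark.dynamicAllocation.minExecutors         1
spark.dynamicAllocation.maxExecutors         10
spark.dynamicAllocation.executorIdleTimeout  60s
```

Note these settings do not affect whether the application leaves ACCEPTED; that depends on the RM having registered NodeManagers with free resources.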


-- 
Best Regards

Jeff Zhang
