spark-user mailing list archives

From ÐΞ€ρ@Ҝ (๏̯͡๏) <deepuj...@gmail.com>
Subject Re: Unable to start Pi (hello world) application on Spark 1.4
Date Mon, 29 Jun 2015 03:43:57 GMT
Figured it out.

All the jars that were specified with --driver-class-path are now exported
through SPARK_CLASSPATH, and it's working now.

I thought SPARK_CLASSPATH was deprecated. Looks like it keeps flipping on and off between releases.
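For anyone hitting the same thing, a minimal sketch of the workaround: export the driver jars through SPARK_CLASSPATH before invoking spark-submit. The jar paths are the cluster-specific ones from my command below; treat them as placeholders for your own environment.

```shell
# Workaround sketch: put the jars that used to go in --driver-class-path
# into SPARK_CLASSPATH instead. Paths are specific to this cluster --
# substitute your own hadoop-common and guava jar locations.
export SPARK_CLASSPATH=/apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/yarn/lib/guava-11.0.2.jar

# Sanity-check what will be picked up before running spark-submit.
echo "$SPARK_CLASSPATH"
```

After this, the same spark-submit command (with or without --driver-class-path) picked up the Hadoop jars and the Kerberos principal error went away.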

On Sun, Jun 28, 2015 at 12:55 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepujain@gmail.com> wrote:

> Any thoughts on this ?
>
> On Fri, Jun 26, 2015 at 2:27 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <deepujain@gmail.com>
> wrote:
>
>> It used to work with 1.3.1; however, with 1.4.0 I get the following
>> exception.
>>
>>
>> export SPARK_HOME=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4
>> export
>> SPARK_JAR=/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/spark-assembly-1.4.0-hadoop2.4.0.jar
>> export HADOOP_CONF_DIR=/apache/hadoop/conf
>> cd $SPARK_HOME
>> ./bin/spark-submit -v --master yarn-cluster --driver-class-path
>> /apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar:/apache/hadoop-2.4.1-2.1.3.0-2-EBAY/share/hadoop/yarn/lib/guava-11.0.2.jar
>> --jars
>> /apache/hadoop/lib/hadoop-lzo-0.6.0.jar,/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/datanucleus-api-jdo-3.2.6.jar,/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/datanucleus-core-3.2.10.jar,/home/dvasthimal/spark1.4/spark-1.4.0-bin-hadoop2.4/lib/datanucleus-rdbms-3.2.9.jar
>> --num-executors 1 --driver-memory 4g --driver-java-options
>> "-XX:MaxPermSize=2G" --executor-memory 2g --executor-cores 1 --queue
>> hdmi-express --class org.apache.spark.examples.SparkPi
>> ./lib/spark-examples*.jar 10
>>
>> *Exception*
>>
>> 15/06/26 14:24:42 INFO client.ConfiguredRMFailoverProxyProvider: Failing
>> over to rm2
>>
>> 15/06/26 14:24:42 WARN ipc.Client: Exception encountered while connecting
>> to the server : java.lang.IllegalArgumentException: Server has invalid
>> Kerberos principal: hadoop/X-Y-rm-2.vip.CM.com@CORP.CM.COM
>>
>>
>> I remember hitting this error with Spark 1.2.x, which is what led me to
>> put
>>
>> */apache/hadoop/share/hadoop/common/hadoop-common-2.4.1-EBAY-2.jar*
>>
>> on the classpath in the first place. With 1.3.1, passing it via
>> --driver-class-path gets it running, but with 1.4 that no longer works.
>>
>> Please suggest.
>>
>> --
>> Deepak
>>
>>
>
>
> --
> Deepak
>
>


-- 
Deepak
