spark-user mailing list archives

From Naveen Kumar Pokala <>
Subject RE: Spark Job submit
Date Thu, 27 Nov 2014 06:54:24 GMT
The code is on my Windows machine, and the cluster is on another network, running on UNIX. In that case, how will it identify the cluster? For a Spark standalone cluster we can clearly specify the URL, like spark://ip:port, but how do we specify that for Hadoop?

What I have done is copy the Hadoop configuration files from the network into a dummy Hadoop directory I created locally (on the Windows machine).

I then submitted with spark-submit, pointing the HADOOP_CONF_DIR variable at that dummy directory. Attaching the error.
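
For reference, a hedged sketch of what that submission might look like from the Windows command prompt. All paths, the class name, and the jar name below are hypothetical placeholders, not taken from the original message:

```shell
:: Windows cmd sketch -- paths and names are illustrative assumptions.
:: Point Spark at the copied Hadoop/YARN configuration files so it can
:: locate the ResourceManager and NameNode of the remote cluster.
set HADOOP_CONF_DIR=C:\hadoop-conf-copy

:: Submit the job to YARN in cluster mode; the driver then runs inside
:: the cluster rather than on the Windows machine.
spark-submit ^
  --master yarn-cluster ^
  --class com.example.MySparkJob ^
  C:\jobs\my-spark-job.jar
```

The key point is that with --master yarn-cluster there is no spark://ip:port URL; the cluster is identified entirely through the configuration files that HADOOP_CONF_DIR points to.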


Please suggest how to proceed from the code, and how to run spark-submit from Windows.

Please share sample code if you have any.


From: Akhil Das []
Sent: Wednesday, November 26, 2014 10:03 PM
To: Naveen Kumar Pokala
Subject: Re: Spark Job submit

How about this:

- Create a SparkConf
- Call setMaster("yarn-cluster") on it
- Create a JavaSparkContext from that SparkConf

And that will submit it to the YARN cluster.
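
The steps above can be sketched roughly as follows. This is a minimal sketch, assuming spark-core is on the classpath and HADOOP_CONF_DIR points at the cluster's configuration; note that some Spark versions do not support starting yarn-cluster mode programmatically, in which case "yarn-client" is the mode that works when the driver runs on the local machine:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class YarnSubmitSketch {
    public static void main(String[] args) {
        // The YARN ResourceManager is discovered via HADOOP_CONF_DIR /
        // YARN_CONF_DIR, not via a spark://ip:port URL.
        SparkConf conf = new SparkConf()
                .setAppName("YarnSubmitSketch")      // app name is an example
                .setMaster("yarn-cluster");          // as suggested above;
                                                     // "yarn-client" if the
                                                     // driver must run locally

        JavaSparkContext sc = new JavaSparkContext(conf);

        // ... build and run RDD operations here ...

        sc.stop();
    }
}
```

Since this needs a reachable YARN cluster, it is usually still launched through spark-submit rather than plain `java`, so that Spark's jars and the YARN configuration are wired up for you.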

Best Regards

On Wed, Nov 26, 2014 at 4:20 PM, Naveen Kumar Pokala <> wrote:

Is there a way to submit a Spark job to a Hadoop-YARN cluster from Java code?

