spark-user mailing list archives

From Naveen Kumar Pokala <>
Subject Execute Spark programs from local machine on Yarn-hadoop cluster
Date Fri, 21 Nov 2014 14:39:04 GMT

I am executing my Spark jobs on a YARN cluster by building the conf object as follows:

SparkConf conf = new SparkConf().setAppName("NewJob").setMaster("yarn-cluster");

Now I want to submit Spark jobs from my local machine. How can I do that?

What I mean is: is there a way to supply the IP address, port, and all the other details needed to connect to a YARN master on another network from my local Spark program?
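For context, what I have tried so far is along these lines: the usual approach seems to be making the cluster's Hadoop client configuration visible on the local machine and letting spark-submit read the ResourceManager address from it. A minimal sketch (the conf directory path, main class, and jar name below are placeholders, not my actual setup):

```shell
# Copy yarn-site.xml and core-site.xml from the cluster to the local
# machine, then point HADOOP_CONF_DIR at that directory so Spark can
# discover the ResourceManager and HDFS addresses.
export HADOOP_CONF_DIR=/path/to/cluster-conf

# Submit in yarn-cluster mode; the driver then runs inside the cluster,
# so the local machine only needs network access to the ResourceManager.
spark-submit \
  --master yarn-cluster \
  --class com.example.NewJob \
  newjob.jar
```

This assumes the local machine can reach the cluster's ResourceManager and NameNode ports directly, which is exactly the part I am unsure about when the cluster is on a different network.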

