spark-user mailing list archives

From Naveen Kumar Pokala <npok...@spcapitaliq.com>
Subject Execute Spark programs from local machine on Yarn-hadoop cluster
Date Fri, 21 Nov 2014 14:39:04 GMT
Hi,

I am executing my Spark jobs on a YARN cluster, building the SparkConf object as follows:

SparkConf conf = new SparkConf().setAppName("NewJob").setMaster("yarn-cluster");

Now I want to submit Spark jobs from my local machine. How can I do that?

What I mean is: is there a way to supply the IP address, port, and other connection details of a YARN master on another network from my local Spark program?
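For reference, one common approach is a sketch along these lines, assuming Spark 1.x and placeholder hostnames (`rm-host`, `nn-host`). Spark forwards any `spark.hadoop.*` property into the underlying Hadoop `Configuration`, so the remote ResourceManager and NameNode addresses can be set explicitly instead of (or in addition to) pointing `HADOOP_CONF_DIR` at the cluster's client configs. Note that in Spark 1.x, `yarn-cluster` mode is meant to be launched via `spark-submit`; when creating a `SparkContext` directly from a local program, `yarn-client` mode is the one that applies:

```java
// Sketch only: hostnames are placeholders, ports are the Hadoop defaults
// (8032 = yarn.resourcemanager.address, 8030 = scheduler, 8020 = NameNode RPC).
// Requires Spark and Hadoop client jars on the classpath, and the remote
// cluster's ports reachable from the local machine.
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class RemoteYarnJob {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("NewJob")
            .setMaster("yarn-client")  // driver runs locally, executors on YARN
            // spark.hadoop.* properties are copied into the Hadoop Configuration:
            .set("spark.hadoop.yarn.resourcemanager.address", "rm-host:8032")
            .set("spark.hadoop.yarn.resourcemanager.scheduler.address", "rm-host:8030")
            .set("spark.hadoop.fs.defaultFS", "hdfs://nn-host:8020");

        JavaSparkContext sc = new JavaSparkContext(conf);
        // ... job logic ...
        sc.stop();
    }
}
```

The alternative (and the usual route for `yarn-cluster` mode) is to copy the cluster's `yarn-site.xml` and `core-site.xml` to the local machine, export `HADOOP_CONF_DIR` pointing at them, and launch with `spark-submit --master yarn-cluster`.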

-Naveen
