spark-user mailing list archives

From Silvio Fiorito <silvio.fior...@granturing.com>
Subject Re: Cluster deploy mode driver location
Date Tue, 22 Nov 2016 13:02:29 GMT
Hi Saif!


Unfortunately, I don't think this is possible in YARN cluster mode. Regarding the JARs you're referring to: can you place them on HDFS? That way they sit in a central location and can be referenced as dependencies from any host the driver lands on:


http://spark.apache.org/docs/latest/submitting-applications.html#advanced-dependency-management
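A minimal sketch of what that could look like (the paths, class name, and jar names here are made up for illustration): stage the dependency jars on HDFS once, then pass `hdfs://` URLs to `--jars` so whichever node YARN chooses as the driver can fetch them.

```shell
# Stage the dependency jar in a central HDFS location (one-time step).
hdfs dfs -mkdir -p /apps/myapp/jars
hdfs dfs -put mylib.jar /apps/myapp/jars/

# Submit in YARN cluster mode, referencing the jar by hdfs:// URL.
# Hypothetical class and paths -- substitute your own.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.Main \
  --jars hdfs:///apps/myapp/jars/mylib.jar \
  hdfs:///apps/myapp/myapp.jar
```

With this setup it no longer matters which host runs the driver, since nothing has to be pre-installed on a particular node.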


Thanks,

Silvio

________________________________
From: Saif.A.Ellafi@wellsfargo.com <Saif.A.Ellafi@wellsfargo.com>
Sent: Monday, November 21, 2016 2:04:06 PM
To: user@spark.apache.org
Subject: Cluster deploy mode driver location

Hello there,

I have a Spark program on 1.6.1; however, when I submit it to the cluster in cluster deploy mode, it randomly picks the driver host.

I know there is a driver specification option, but using it requires defining many other options I am not familiar with. The trouble is that the jars I am launching need to be available on the driver host, and I would like to keep these jars on just one specific host, which I would like to be the driver.

Any help?

Thanks!
Saif

