From: Saif.A.Ellafi@wellsfargo.com <Saif.A.Ellafi@wellsfargo.com>
Sent: Monday, November 21, 2016 2:04:06 PM
Subject: Cluster deploy mode driver location
I have a Spark 1.6.1 program; when I submit it in cluster deploy mode, the cluster picks the driver host arbitrarily.
I know there is an option to specify the driver, but using it seems to require defining many other options I am not familiar with. The trouble is that the jars I am launching need to be available on the driver host, and I would like to keep these jars on just one specific host, which I would like to be the driver.
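For reference, this is roughly the kind of submit command I mean (the master URL, class name, and jar paths here are placeholders, not my actual setup):

```shell
# Sketch of a cluster-mode submission; in this mode the standalone
# master chooses which worker runs the driver, so the application jar
# must be reachable from whichever host is selected.
spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class com.example.MyApp \
  /local/path/on/some/host/myapp.jar
```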