spark-user mailing list archives

From Saisai Shao <>
Subject Re: Submitting with --deploy-mode cluster: uploading the jar
Date Thu, 01 Oct 2015 00:51:45 GMT
Are you running in standalone deploy mode, and what Spark version are you using?

Can you explain more specifically what exception occurs and how you are
providing the jar to Spark?

I tried on my local machine with this command:

./bin/spark-submit --verbose --master spark://hw12100.local:7077
--deploy-mode cluster --class org.apache.spark.examples.SparkPi

It seems Spark uploads this examples jar automatically; there is no need to
handle it manually.
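As a general workaround for standalone cluster mode (a sketch based on the documented behavior that the application jar may be given as an hdfs:// URL reachable by all nodes, not something stated in this thread; the HDFS path, master host, and class name below are hypothetical placeholders):

```shell
# Stage the application jar somewhere every worker can reach, e.g. HDFS.
hdfs dfs -put test-assembly-1.0.jar /jars/test-assembly-1.0.jar

# Then point spark-submit at that URL instead of a local path, so the
# driver (launched on a worker in cluster mode) can fetch it itself.
./bin/spark-submit \
  --master spark://master-host:7077 \
  --deploy-mode cluster \
  --class myclass \
  hdfs:///jars/test-assembly-1.0.jar
```

This avoids both copying the jar to every worker by hand and serving it from the client machine behind a firewall.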


On Thu, Oct 1, 2015 at 8:36 AM, Christophe Schmitz <>
wrote:
> Hi Saisai
> I am using this command:
> spark-submit --deploy-mode cluster --properties-file file.conf --class
> myclass test-assembly-1.0.jar
> The application starts only if I manually copy test-assembly-1.0.jar to all
> the workers (or the master, I don't remember) and provide the full path of
> the file.
> On the other hand, with --deploy-mode client I don't need to do that, but
> then I need to accept incoming connections in my client to serve the jar
> (which is not possible behind a firewall I don't control).
> Thanks,
> Christophe
> On Wed, Sep 30, 2015 at 5:19 PM, Saisai Shao <>
> wrote:
>> As I remember it, you don't need to upload the application jar manually; Spark
>> will do it for you when you use spark-submit. Would you mind posting
>> your spark-submit command?
>> On Wed, Sep 30, 2015 at 3:13 PM, Christophe Schmitz <>
>> wrote:
>>> Hi there,
>>> I am trying to use the "--deploy-mode cluster" option to submit my job
>>> (spark 1.4.1). When I do that, the spark-driver (on the cluster) is looking
>>> for my application jar. I can manually copy my application jar on all the
>>> workers, but I was wondering if there is a way to submit the application
>>> jar when running spark-submit.
>>> Thanks!
