spark-user mailing list archives

From Anahita Talebi <anahita.t.am...@gmail.com>
Subject Fwd: Entering the variables in the Argument part in Submit job section to run a spark code on Google Cloud
Date Mon, 09 Jan 2017 12:43:39 GMT
Dear friends,

I am trying to run a Spark job on Google Cloud using Submit Job:
https://cloud.google.com/dataproc/docs/tutorials/spark-scala

My question is about the "Arguments" part.
In my Spark code, there are some variables whose values are defined in
a shell file (.sh), as follows:

--trainFile=small_train.dat \
--testFile=small_test.dat \
--numFeatures=9947 \
--numRounds=100 \


- I have tried entering only the values, each value in a separate box, as
follows, but it does not work:

data/small_train.dat
data/small_test.dat
9947
100

I have also tried giving the parameters as below, but that does not
work either:
trainFile=small_train.dat
testFile=small_test.dat
numFeatures=9947
numRounds=100

I added the files small_train.dat and small_test.dat to the same bucket
where I saved the .jar file. For example, if my bucket is named "Anahita",
then spark.jar, small_train.dat and small_test.dat are all in the bucket "Anahita".
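For reference, I believe the equivalent command-line submission would look
something like the sketch below (the cluster name, main class, and gs://
paths are my guesses, not the real values):

```shell
# Hypothetical equivalent of the console "Submit job" form.
# Everything after the bare "--" is passed to the job as its arguments,
# one console argument box per line here.
gcloud dataproc jobs submit spark \
  --cluster=my-cluster \
  --class=com.example.Main \
  --jars=gs://anahita/spark.jar \
  -- \
  --trainFile=gs://anahita/small_train.dat \
  --testFile=gs://anahita/small_test.dat \
  --numFeatures=9947 \
  --numRounds=100
```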


Does anyone know how I can enter these values in the "Arguments" part?

Thanks in advance,
Anahita
