spark-dev mailing list archives

From Ajay Nair <>
Subject Apache Spark running out of the spark shell
Date Sat, 03 May 2014 15:42:53 GMT

I have written code that works just about fine in the Spark shell on EC2.
The EC2 script helped me configure my master and worker nodes. Now I want to
run the Scala Spark code outside the interactive shell. How do I go about
doing that?

I was referring to the instructions mentioned here:

But these are confusing because they mention a simple project jar, which I am
not sure how to generate. I only have the file that I run directly in my
Spark shell. Are there any easy instructions to get this running quickly as a job?
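[For anyone who finds this thread with the same question: the "simple project jar" in the quick start guide is built with sbt. A minimal sketch of what that setup might look like, assuming an sbt project layout and Spark 0.9.x artifacts (the file names and version numbers here are illustrative, not from the guide itself):]

```scala
// Hypothetical project layout (names are assumptions):
//   ./simple.sbt
//   ./src/main/scala/SimpleApp.scala
//
// simple.sbt -- declares the Spark dependency so that `sbt package`
// can produce the standalone jar the quick start guide refers to:
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
```

Running `sbt package` in the project root then produces a jar under `target/scala-2.10/`, containing the same code that was pasted into the shell (wrapped in an object with a `main` method that creates its own `SparkContext`). That jar can be passed to the `SparkContext` constructor's jars argument so the workers can load it, or, on newer Spark releases, handed to `bin/spark-submit`.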

