spark-dev mailing list archives

From Nicolas Garneau <ngarn...@ngarneau.com>
Subject Re: Apache Spark running out of the spark shell
Date Sat, 03 May 2014 17:28:48 GMT
Sorry, the link was broken; I meant this one:
https://github.com/ngarneau/spark-standalone
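
For reference, a simple.sbt for a Spark 0.9.1 standalone app typically looks like the following. This is a sketch based on the quick-start guide, not the exact contents of the repo above; adjust the versions to match your cluster:

```scala
// simple.sbt — minimal sbt build definition for a Spark 0.9.1 standalone app.
// Versions shown follow the 0.9.1 quick start; your repo/cluster may differ.
name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.3"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
```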

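And a minimal application to go with it, adapted from the quick-start guide. The file name, the input path, and the master URL are placeholders; on EC2 you would point the SparkContext at your own spark://master:7077 URL and at a file that exists on the cluster:

```scala
/* SimpleApp.scala — minimal standalone Spark 0.9.1 application,
 * adapted from the quick-start guide. */
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    val logFile = "README.md" // placeholder: any text file reachable by the cluster
    // "local" runs in-process; replace with your spark://master:7077 URL on EC2
    val sc = new SparkContext("local", "Simple App")
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
```

With both files in place, `sbt package` builds the application jar under target/scala-2.10/, and `sbt run` executes the app against whatever master the SparkContext names.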
On 2014-05-03, at 13:23, Nicolas Garneau <ngarneau@ngarneau.com> wrote:

> Hey AJ,
> 
> I created a little sample app using Spark's quick start.
> Have a look here.
> Assuming you're using Scala, sbt is the right tool for running your application in standalone mode.
> The configuration file, "simple.sbt" in my repo, holds all the dependencies needed to build your app.
> 
> Hope this helps!
> 
> On 2014-05-03, at 11:42, Ajay Nair <prodigyaj@gmail.com> wrote:
> 
>> Hi,
>> 
>> I have written code that works just fine in the Spark shell on EC2.
>> The EC2 script helped me configure my master and worker nodes. Now I want to
>> run the Scala Spark code outside the interactive shell. How do I go about
>> doing that?
>> 
>> I was referring to the instructions mentioned here:
>> https://spark.apache.org/docs/0.9.1/quick-start.html
>> 
>> But this is confusing because it mentions a simple project jar file,
>> which I am not sure how to generate. I only have the file that runs directly
>> in my Spark shell. Any easy instructions to get this quickly running as a job?
>> 
>> Thanks
>> AJ
>> 
>> 
>> 
>> --
>> View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459.html
>> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
>> 
> 
> Nicolas Garneau
> ngarneau@ngarneau.com
> 

Nicolas Garneau
418.569.3097
ngarneau@ngarneau.com

