spark-dev mailing list archives

From Sandy Ryza <sandy.r...@cloudera.com>
Subject Re: Apache Spark running out of the spark shell
Date Sat, 03 May 2014 17:06:31 GMT
Hi AJ,

You might find this helpful -
http://blog.cloudera.com/blog/2014/04/how-to-run-a-simple-apache-spark-app-in-cdh-5/
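
For reference, the "simple project" the quick-start talks about is just a small sbt project: one Scala source file plus a build file, packaged into a jar with `sbt package`. Here is a minimal sketch along the lines of the 0.9.1 quick-start (file names, versions, and the master URL below are illustrative, not prescriptive):

```scala
// SimpleApp.scala — a minimal standalone Spark job (sketch).
// Assumes a build.sbt next to it along these lines:
//   name := "Simple Project"
//   version := "1.0"
//   scalaVersion := "2.10.4"
//   libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._

object SimpleApp {
  def main(args: Array[String]) {
    // "local" runs the job in-process for testing; point this at your
    // EC2 cluster instead, e.g. "spark://<master-host>:7077".
    val sc = new SparkContext("local", "Simple App")

    // Same kind of code you ran in the shell, with sc created explicitly.
    val logData = sc.textFile("README.md").cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    println("Lines with a: " + numAs)

    sc.stop()
  }
}
```

Running `sbt package` from the project root builds the jar (under `target/scala-2.10/` with the layout above), and `sbt run` will execute the job directly.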

-Sandy


On Sat, May 3, 2014 at 8:42 AM, Ajay Nair <prodigyaj@gmail.com> wrote:

> Hi,
>
> I have written code that works just fine in the Spark shell on EC2. The
> EC2 script helped me configure my master and worker nodes. Now I want to
> run the Scala Spark code outside the interactive shell. How do I go about
> doing that?
>
> I was referring to the instructions mentioned here:
> https://spark.apache.org/docs/0.9.1/quick-start.html
>
> But this is confusing because it mentions a simple project jar file,
> which I am not sure how to generate. I only have the file that runs
> directly in my Spark shell. Are there any easy instructions for getting
> this running as a job quickly?
>
> Thanks
> AJ
>
>
>
> --
> View this message in context:
> http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459.html
> Sent from the Apache Spark Developers List mailing list archive at
> Nabble.com.
>
