spark-dev mailing list archives

From Nicolas Garneau <ngarn...@ngarneau.com>
Subject Re: Apache Spark running out of the spark shell
Date Sun, 04 May 2014 23:24:38 GMT
Hey AJ,

I haven't tried running on a cluster yet, only in local mode.
I'll try to get something running on a cluster soon and keep you posted.

Nicolas Garneau

> On May 4, 2014, at 6:23 PM, Ajay Nair <prodigyaj@gmail.com> wrote:
> 
> Now I got it to work .. well, almost. However, I needed to copy the project/
> folder into the spark-standalone folder, because the package build was failing:
> it could not find build.properties. After the copy, the build was
> successful. However, when I run it I get error-tagged lines, but it still gives
> me the output.
> 
> [error] 14/05/04 21:58:19 INFO spark.SparkContext: Job finished: count at
> SimpleApp.scala:11, took 0.040651597 s
> [error] 14/05/04 21:58:19 INFO scheduler.TaskSetManager: Finished TID 3 in
> 17 ms on localhost (progress: 2/2)
> [info] Lines with a: 3, Lines with b: 2
> [error] 14/05/04 21:58:19 INFO scheduler.TaskSchedulerImpl: Removed TaskSet
> 1.0, whose tasks have all completed, from pool 
> [success] Total time: 5 s, completed May 4, 2014 9:58:20 PM
> 
> 
> You can see the [info] line that contains the output. All the other lines I
> get are tagged [error]; any reason why?
> 
> I have configured my EC2 machines as master and slave nodes, but I think this
> code is still running in local mode.
> 
> 
> 
> --
> View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459p6478.html
> Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
> 
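On the [error] tags: when sbt runs an application, it forwards anything the process writes to stderr through its own error-level logger, and Spark's default log4j console appender targets System.err, so ordinary INFO log lines end up prefixed with [error] by sbt even though nothing failed. One way to quiet this (a sketch based on Spark's bundled conf/log4j.properties.template; adjust paths and levels to your setup) is to copy the template to conf/log4j.properties and either point the appender at stdout or raise the threshold:

```properties
# conf/log4j.properties (copied from conf/log4j.properties.template)
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
# Send log output to stdout so sbt does not tag it as [error]
log4j.appender.console.target=System.out
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
```

Either change is cosmetic: the [error]-tagged lines in the paste above are just INFO-level scheduler logs, as the "[success] Total time" line confirms.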
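On running against the EC2 cluster instead of local mode: with the standalone master up, one common approach (a sketch; the hostname and port below are placeholders for your master's URL, which is shown at the top of the master's web UI) is to set MASTER before launching the shell, or to pass the same spark:// URL where the application constructs its SparkContext:

```
# Replace <master-hostname> with your EC2 master's address
MASTER=spark://<master-hostname>:7077 ./bin/spark-shell
```

If the app still logs that tasks run on "localhost", it is using local mode; the master URL has to reach the driver one of these ways for the cluster to be used.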
