spark-dev mailing list archives

From Ajay Nair <prodig...@gmail.com>
Subject Re: Apache Spark running out of the spark shell
Date Sat, 03 May 2014 17:38:51 GMT
Quick question: where should I place your folder? Inside the Spark directory?
My Spark directory is in /root/spark
So currently I have pulled your GitHub code into /root/spark/spark-examples
and pointed the Scala code at my Spark home directory.
I copied the sbt folder within the spark-examples folder. But when I try
running this command

$root/spark/spark-examples: sbt/sbt package

awk: cmd. line:1: fatal: cannot open file `./project/build.properties' for
reading (No such file or directory)
Launching sbt from sbt/sbt-launch-.jar
Error: Invalid or corrupt jarfile sbt/sbt-launch-.jar


However, the sbt package command runs fine (expectedly) when I run it from the
/root/spark folder.
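For what it's worth, the awk error suggests the sbt/sbt wrapper script reads the sbt version from ./project/build.properties in the current working directory and splices it into the launcher jar name; with no such file, the version is empty and it tries to launch the nonexistent sbt-launch-.jar. A minimal sketch of a workaround, assuming a 0.13.x sbt launcher (the version number is an assumption and should match the jar actually shipped under sbt/), would be:

```shell
# Run from the directory where sbt/sbt is invoked (e.g. /root/spark/spark-examples).
# The wrapper expects ./project/build.properties to exist so it can extract
# sbt.version and launch sbt/sbt-launch-<version>.jar.
mkdir -p project
# 0.13.1 is an assumed version; use the one matching your sbt-launch jar.
echo "sbt.version=0.13.1" > project/build.properties
cat project/build.properties
```

After that, `sbt/sbt package` should at least find a correctly named launcher jar instead of `sbt-launch-.jar`.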

Is there anything I am doing wrong here?





--
View this message in context: http://apache-spark-developers-list.1001551.n3.nabble.com/Apache-Spark-running-out-of-the-spark-shell-tp6459p6465.html
Sent from the Apache Spark Developers List mailing list archive at Nabble.com.
