spark-user mailing list archives

From Krishnanand Khambadkone <>
Subject Re: Cannot start spark master
Date Mon, 06 Jan 2014 22:17:35 GMT
Mark,  Thank you for your prompt response.  I did follow the instructions, removed target
and rebuilt spark (sbt assembly). Now I am able to start the master.   The instructions
say, however, that it is supposed to publish the port and URL with which the slave can be started,
but I am not able to see that.    What is the default port on which the slave is started,
and what is the command for the same?  Also, how many daemons are needed for a standalone
spark instance running on Mac OS X?
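For reference, a minimal standalone cluster needs two daemon types: one Master and one or more Workers. A hedged sketch of starting both by hand, based on Spark's standalone-mode docs for the 0.8.x line (the Master's default service port is 7077 and its web UI defaults to 8080; the hostname below is an example, so substitute the `spark://host:port` URL your Master actually prints in its log):

```shell
# Start the master; its log and web UI (port 8080) show a URL like spark://<host>:7077
./bin/start-master.sh

# Start a worker (slave) by pointing it at that URL -- localhost is an assumption here
./spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077
```

Once the worker registers, it appears on the Master's web UI at http://localhost:8080.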

On Sunday, January 5, 2014 8:13 PM, Mark Hamstra <> wrote:
So follow the instructions and remove the extra spark-assembly jar from <spark>/assembly/target.
 Or remove all of <spark>/assembly/target and do `./sbt/sbt assembly/assembly`, or
do `./sbt/sbt clean` before redoing `./sbt/sbt assembly`.  In any case, you've got an extra
assembly jar left over from a prior build that you did not clean before building the new assembly.
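The three recovery options above can be sketched as shell commands, run from the Spark source root (a sketch of the advice in this thread, not an official procedure; pick one option, not all three):

```shell
# Option 1: delete the stale build output wholesale, then rebuild the assembly
rm -rf assembly/target
./sbt/sbt assembly/assembly

# Option 2: let sbt clean all prior build artifacts, then rebuild
./sbt/sbt clean
./sbt/sbt assembly
```

Either way, the goal is the same: leave exactly one spark-assembly jar in assembly/target so the launch scripts no longer refuse to start the Master.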

On Sun, Jan 5, 2014 at 7:37 PM, danoomistmatiste <> wrote:

>Hi,  I have installed and built spark-0.8.1-incubating-bin-cdh4 with sbt/sbt
>assembly.  I am running this with Scala 2.9.3.  When I try to start the spark
>master (./,  I get this error message:
>failed to launch org.apache.spark.deploy.master.Master:
>  spark-assembly_2.9.3-0.8.1-incubating-hadoop2.0.0-mr1-cdh4.2.0.jar
>  Please remove all but one jar.