spark-user mailing list archives

From Matei Zaharia <>
Subject Re: correct upgrade process
Date Fri, 01 Aug 2014 19:15:15 GMT
This should be okay, but make sure that your cluster also has the right code deployed; the
compilation errors may mean an old version is still being picked up somewhere.

If you built Spark from source multiple times, you may also want to try sbt clean before sbt assembly.

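A quick way to check whether stale builds are lingering is to look for more than one assembly jar under the build tree. A minimal sketch, assuming Spark was built from source with sbt; the default path below is only illustrative:

```shell
# List Spark assembly jars under the build tree; seeing more than one
# usually means an old build survived and "sbt clean" is in order.
# The default path is an example; point SPARK_HOME at your own tree.
SPARK_HOME="${SPARK_HOME:-$HOME/spark-1.0.1}"
find "$SPARK_HOME" -name 'spark-assembly*.jar' 2>/dev/null || true
```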

On August 1, 2014 at 12:00:07 PM, SK wrote:


I upgraded from 1.0 to 1.0.1 a couple of weeks ago and have been able to use 
some of the features advertised in 1.0.1. However, I still get compilation 
errors in some cases, even though, based on responses from other users, these 
errors were fixed in 1.0.1, so I should not be getting them. 
I want to make sure I followed the correct upgrade process, as below (I am 
running Spark on a single machine in standalone mode, so no cluster setup): 
- set SPARK_HOME to the new version 

- run "sbt assembly" in SPARK_HOME to build the new Spark jars 

- in the project sbt file point the libraryDependencies for spark-core and 
other libraries to the 1.0.1 version and run "sbt assembly" to build the 
project jar. 
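The third step might look like the following in the project's build file. This is only a sketch: the project name, Scala version, and exact module list here are assumptions, not taken from the original message.

```scala
// build.sbt (sketch) -- bump the Spark artifacts to 1.0.1 so the
// application compiles against the upgraded release.
name := "my-spark-app"          // hypothetical project name

scalaVersion := "2.10.4"        // assumed; match your Spark build

libraryDependencies ++= Seq(
  // "provided" keeps Spark's classes out of the application assembly
  // jar, since the standalone master/worker already ship them
  "org.apache.spark" %% "spark-core" % "1.0.1" % "provided"
)
```

After editing the build file, running "sbt clean assembly" in the project directory rebuilds the application jar against the new dependency.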

Is there anything else I need to do to ensure that no old jars are being 
used? For example do I need to manually delete any old jars? 
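On the question of manually deleting old jars: sbt clean in both trees should take care of compiled output, but any jars that were copied around by hand need to be removed yourself. A minimal sketch, where LIB_DIR is an assumed example location for hand-copied jars:

```shell
# Delete jars from the previous Spark release so only 1.0.1 remains
# on the classpath. LIB_DIR is an illustrative example directory.
LIB_DIR="${LIB_DIR:-lib}"
find "$LIB_DIR" -name 'spark-*1.0.0*.jar' -delete 2>/dev/null || true
```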



Sent from the Apache Spark User List mailing list archive. 
