spark-user mailing list archives

From Soumya Simanta <soumya.sima...@gmail.com>
Subject Re: Proper way to create standalone app with custom Spark version
Date Fri, 16 May 2014 23:06:09 GMT
Install your custom Spark jar into your local Maven or Ivy repository, then depend on that
custom jar from your pom/sbt file.
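
For example (a minimal sketch; the project name and version string below are placeholders,
and the version must match whatever your custom Spark build actually publishes): after
building your custom Spark, run "sbt/sbt publish-local" from the Spark source tree to install
the artifacts into ~/.ivy2/local, or "mvn -DskipTests install" to install them into
~/.m2/repository. The standalone app's build.sbt can then reference them:

    // build.sbt of the standalone application (sketch)
    name := "my-spark-app"            // placeholder project name

    scalaVersion := "2.10.4"

    // Only needed if you installed the jars with `mvn install`;
    // sbt's publish-local already writes to the local Ivy repository.
    resolvers += Resolver.mavenLocal

    // "%%" appends the Scala version, so this resolves to spark-core_2.10.
    // Replace the version with the one your custom build publishes.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0-SNAPSHOT"

A pom.xml dependency works the same way: declare org.apache.spark:spark-core_2.10 with your
custom version and Maven will resolve it from the local repository.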



> On May 15, 2014, at 3:28 AM, Andrei <faithlessfriend@gmail.com> wrote:
> 
> (Sorry if you have already seen this message - it seems like there were some issues
> delivering messages to the list yesterday)
> 
> We can create a standalone Spark application by simply adding "spark-core_2.x" to
> build.sbt/pom.xml and connecting it to the Spark master.
> 
> We can also build a custom version of Spark (e.g. compiled against Hadoop 2.x) from source
> and deploy it to the cluster manually.
> 
> But what is the proper way to use a _custom version_ of Spark in a _standalone application_?
> 
> I'm currently trying to deploy the custom version to a local Maven repository and add it to
> the SBT project. Another option is to add Spark as a local jar to every project. But both of
> these approaches look overcomplicated and, in general, wrong.
> 
> So what is the intended way to do it?
> 
> Thanks, 
> Andrei
