spark-user mailing list archives

From Andrei <>
Subject Re: Is uberjar a recommended way of running Spark/Scala applications?
Date Thu, 29 May 2014 17:13:06 GMT
Thanks, Jordi, your gist looks pretty much like what I have in my project
currently (with a few exceptions that I'm going to borrow).

I like the idea of using "sbt package", since it doesn't require third-party
plugins and, most importantly, doesn't create a mess of classes and
resources. But in this case I'll have to manage the jar list manually via the
Spark context. Is there a way to automate this process? E.g., when I was a
Clojure guy, I could run "lein deps" (lein is a build tool similar to sbt) to
download all dependencies and then just enumerate them from my app. Have you
heard of something like that for Spark/SBT?
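The manual approach described above can be sketched as follows. This is only an illustration, not something from the thread: it assumes you have set `retrieveManaged := true` in build.sbt so that sbt copies managed dependencies under `lib_managed/`, and the `listJars` helper name is hypothetical.

```scala
import java.io.File

// Collect the absolute paths of all jars found under a directory
// (e.g. lib_managed/, populated by sbt when retrieveManaged := true).
def listJars(dir: File): Seq[String] =
  Option(dir.listFiles)
    .getOrElse(Array.empty[File])
    .filter(_.getName.endsWith(".jar"))
    .map(_.getAbsolutePath)
    .toSeq

// The resulting list can then be handed to Spark when building the context,
// instead of maintaining it by hand:
//
//   val conf = new SparkConf()
//     .setAppName("my-app")
//     .setJars(listJars(new File("lib_managed")))
//   val sc = new SparkContext(conf)
```

This keeps "sbt package" for building while letting the directory listing, rather than a hand-written list, drive what gets shipped to the executors.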


On Thu, May 29, 2014 at 3:48 PM, jaranda <> wrote:

> Hi Andrei,
> I think the preferred way to deploy Spark jobs is by using the sbt package
> task instead of the sbt assembly plugin. In any case, as you comment, the
> mergeStrategy setting in combination with some dependency exclusions should
> fix your problems. Have a look at this gist <> for further details (I just
> followed some recommendations from the sbt assembly plugin documentation).
> Up to now I haven't found a proper way to combine my development and
> deployment phases, although I must say my experience with Spark is pretty
> limited (it really depends on your deployment requirements as well). In
> this case, I think someone else could give you some further insights.
> Best,
> --
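The mergeStrategy-plus-exclusions setup mentioned in the reply might look roughly like the sketch below. This is an assumption about what the (elided) gist contains, using the sbt-assembly syntax of that era; the project name and versions are invented for illustration.

```scala
// build.sbt -- minimal sbt-assembly sketch (names/versions are placeholders)
import AssemblyKeys._

assemblySettings

name := "my-spark-app"

scalaVersion := "2.10.4"

// Marking Spark as "provided" keeps it and its transitive dependencies out
// of the uberjar, which avoids most merge conflicts in the first place.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"

// For the duplicates that remain (META-INF entries etc.), pick a strategy.
mergeStrategy in assembly <<= (mergeStrategy in assembly) { old =>
  {
    case PathList("META-INF", xs @ _*) => MergeStrategy.discard
    case x                             => old(x)
  }
}
```

With Spark marked "provided", the assembled jar carries only the application's own dependencies and can be submitted to a cluster that already ships Spark on its classpath.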
