spark-user mailing list archives

From Eugen Cepoi <>
Subject Re: Packaging a spark job using maven
Date Mon, 19 May 2014 10:15:06 GMT
2014-05-19 10:35 GMT+02:00 Laurent T <>:

> Hi Eugen,
> Thanks for your help. I'm not familiar with the shade plugin and I was
> wondering: does it replace the assembly plugin?

Nope, it doesn't replace it. It allows you to build "fat jars" and do other
nice things such as relocating classes to another package.

I am using it in combination with assembly and jdeb to build deployable
archives (zip and Debian). I find that building fat jars with the shade
plugin is more powerful and easier than with assembly.
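For reference, a minimal shade-plugin setup that produces a fat jar during the package phase might look something like this (the version number here is just an assumption; use whatever is current):

```xml
<!-- Minimal maven-shade-plugin configuration: bind the shade goal
     to the package phase so `mvn package` produces a fat jar. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```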

> Also, do I have to specify all the artifacts and sub-artifacts in the
> artifactSet? Or can I just use a *:* wildcard and let the Maven scopes do
> their work? I have a lot of overlap warnings when I do so.

Indeed, you don't have to specify exactly what must be included; I do so in
order to end up with a small archive that we can deploy quickly. Have a look
at the docs; they contain some examples.

In short, you remove the includes and instead write excludes (spark,
hadoop, etc.). The overlap warnings are due to the same classes being
present in different jars; you can exclude those jars to remove the
warnings.
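As a sketch, excluding Spark and Hadoop (which are provided on the cluster anyway) and filtering out signature files, a common source of overlap warnings, could look like this inside the plugin's <configuration>; the exact artifact patterns are illustrative:

```xml
<configuration>
  <artifactSet>
    <excludes>
      <!-- provided by the cluster at runtime, so keep them out of the fat jar -->
      <exclude>org.apache.spark:*</exclude>
      <exclude>org.apache.hadoop:*</exclude>
    </excludes>
  </artifactSet>
  <filters>
    <!-- drop jar signature files, which frequently trigger overlap warnings -->
    <filter>
      <artifact>*:*</artifact>
      <excludes>
        <exclude>META-INF/*.SF</exclude>
        <exclude>META-INF/*.DSA</exclude>
        <exclude>META-INF/*.RSA</exclude>
      </excludes>
    </filter>
  </filters>
</configuration>
```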


> Thanks for your help.
> Regards,
> Laurent
