spark-user mailing list archives

From Sidney Feiner <sidney.fei...@startapp.com>
Subject RE: Jars directory in Spark 2.0
Date Wed, 01 Feb 2017 21:54:52 GMT
Ok, good to know ☺
Shading every Spark app it is then…
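
For anyone who lands on this thread later: a minimal sketch of the shading approach with sbt-assembly (the plugin, the Guava example, and the relocation prefix are all illustrative assumptions, not anything Spark prescribes):

    // project/plugins.sbt:
    //   addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.3")

    // build.sbt: relocate our own copy of Guava so it cannot clash with
    // whatever version sits in Spark's jars/ directory.
    assemblyShadeRules in assembly := Seq(
      ShadeRule.rename("com.google.common.**" -> "myshaded.guava.@1").inAll
    )

The renamed classes end up in the assembly jar under myshaded.guava, so the copy Spark loads and the copy the application loads can no longer collide.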
Thanks!

Sidney Feiner   /  SW Developer
M: +972.528197720  /  Skype: sidney.feiner.startapp


From: Marcelo Vanzin [mailto:vanzin@cloudera.com]
Sent: Wednesday, February 1, 2017 7:41 PM
To: Sidney Feiner <sidney.feiner@startapp.com>
Cc: Koert Kuipers <koert@tresata.com>; user@spark.apache.org
Subject: Re: Jars directory in Spark 2.0

Spark has never shaded dependencies (in the sense of renaming the classes), with a couple
of exceptions (Guava and Jetty). So that behavior is nothing new. Spark's dependencies themselves
have a lot of other dependencies, so doing that would have limited benefits anyway.
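
To see what that class renaming looks like in practice, you can list the relocated entries inside one of the Spark jars. A sketch, where the jar path and the org.spark_project prefix are assumptions based on the Spark 2.x build and should be checked against your installation:

    import java.util.jar.JarFile
    import scala.collection.JavaConverters._

    // List class entries under the relocated prefix inside one of Spark's
    // jars, to see what "shading = renaming classes" looks like on disk.
    // Adjust the path and prefix to your installation.
    val jar = new JarFile("/path/to/spark/jars/spark-network-common_2.11-2.1.0.jar")
    jar.entries.asScala.map(_.getName)
      .filter(_.startsWith("org/spark_project/"))
      .foreach(println)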

On Tue, Jan 31, 2017 at 11:23 PM, Sidney Feiner <sidney.feiner@startapp.com> wrote:
Is this done on purpose? It really makes it hard to deploy applications. Is there a reason the jars Spark uses weren't shaded to begin with?

Sidney Feiner   /  SW Developer
M: +972.528197720  /  Skype: sidney.feiner.startapp


From: Koert Kuipers [mailto:koert@tresata.com]
Sent: Tuesday, January 31, 2017 7:26 PM
To: Sidney Feiner <sidney.feiner@startapp.com>
Cc: user@spark.apache.org
Subject: Re: Jars directory in Spark 2.0

You basically have to keep your versions of dependencies in line with Spark's, or shade your own dependencies.

You cannot just replace the jars in Spark's jars folder. If you want to update them, you have to build Spark yourself with the updated dependencies and confirm that it compiles, passes tests, etc.
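
Concretely, "keeping versions in line" usually means marking Spark itself as provided and pinning any shared library to whatever version ships in Spark's jars/ folder. A build.sbt sketch (the Jackson version is illustrative; match the jar names in your distribution):

    // build.sbt: depend on Spark as "provided" (the cluster supplies it) and
    // pin shared transitive libraries to the versions found in Spark's jars/.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided",
      // Must match the jackson-databind jar shipped in jars/, not the newest release.
      "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.5"
    )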

On Tue, Jan 31, 2017 at 3:40 AM, Sidney Feiner <sidney.feiner@startapp.com> wrote:
Hey,
While migrating from Spark 1.6 to 2.X, I've had many issues with the jars that come preloaded with Spark in the "jars/" directory, and I had to shade most of my packages. Can I replace the jars in this folder with more up-to-date versions? Or are those jars used for anything internal in Spark, which would mean I can't blindly replace them?
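
A quick way to see which copy of a library actually wins at runtime is to ask the JVM where a class was loaded from, e.g. in spark-shell; Jackson below is just a stand-in for whichever dependency is conflicting:

    // Prints the jar that ObjectMapper was loaded from; if it points into
    // Spark's jars/ directory, the application's own version is being shadowed.
    println(classOf[com.fasterxml.jackson.databind.ObjectMapper]
      .getProtectionDomain.getCodeSource.getLocation)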

Thanks ☺


Sidney Feiner   /  SW Developer
M: +972.528197720  /  Skype: sidney.feiner.startapp





--
Marcelo