spark-user mailing list archives

From Steve Loughran <>
Subject Re: Using Spark as a Maven dependency but with Hadoop 2.6
Date Fri, 30 Sep 2016 10:12:07 GMT

On 29 Sep 2016, at 10:37, Olivier Girardot <> wrote:

> I know that the code itself would not be the same, but it would be useful to at least have
> the pom/build.sbt transitive dependencies differ when fetching the artifact with a specific
> classifier, don't you think?
> For now I've overridden them myself using the dependency versions defined in the pom.xml of
> So it's not a blocker issue; it may be useful to document it, but a blog post would be sufficient,
> I think.
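That per-project override can be sketched in a consumer's own pom.xml roughly as below; the Spark and Hadoop version numbers and the exact artifactIds are illustrative assumptions, not values taken from this thread:

```xml
<!-- Sketch only: versions and artifactIds are illustrative assumptions. -->
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
    <exclusions>
      <!-- Drop the Hadoop client Spark pulls in transitively -->
      <exclusion>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <!-- Pin the Hadoop line you actually want -->
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.6.5</version>
  </dependency>
</dependencies>
```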

The problem here is that it's not something a Maven repository is directly set up to deal with. What could be done is to publish multiple pom-only artifacts, e.g. spark-scala-2.11-hadoop-2.6.pom, each declaring the transitive dependencies appropriate for the right version. You wouldn't need to actually rebuild anything: just declare a dependency on the spark 2.2 artifacts, excluding all of hadoop 2.2 and pulling in 2.6.

This wouldn't even need to be an org.apache.spark artifact; it could be something anyone can build and publish under their own name.
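As a sketch, such a pom-only artifact might look like the following, published under a third-party groupId; all coordinates and version numbers here are hypothetical (the wildcard exclusion needs Maven 3.2.1 or later):

```xml
<!-- Hypothetical pom-only artifact: coordinates and versions are illustrative. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>spark-scala-2.11-hadoop-2.6</artifactId>
  <version>1.0</version>
  <!-- No code of its own: it only re-declares dependencies -->
  <packaging>pom</packaging>
  <dependencies>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.0.0</version>
      <exclusions>
        <!-- Exclude every Hadoop artifact Spark declares -->
        <exclusion>
          <groupId>org.apache.hadoop</groupId>
          <artifactId>*</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <!-- ...and pull in the Hadoop 2.6 line instead -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</project>
```

Depending on this one artifact then gives a consumer the Spark classes plus the Hadoop 2.6 transitive tree, without any rebuild of Spark itself.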
