Never mind. Instead of setting the property in the cdh5.3.2 profile, I have to change the hadoop.version property itself from 2.2.0 to 2.5.0-cdh5.3.2 in spark-parent's pom.xml. Otherwise Maven resolves transitive dependencies with the default version 2.2.0, because a profile activated with -P during the build is not active when the installed POMs are later resolved as dependencies.
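
Concretely, the change looks like this in spark-parent's pom.xml (just a sketch against the stock Spark 1.4.1 parent POM, where hadoop.version defaults to 2.2.0):

```xml
<!-- spark-parent pom.xml: change the default hadoop.version itself,
     rather than only overriding it inside the cdh5.3.2 profile -->
<properties>
  <!-- was: <hadoop.version>2.2.0</hadoop.version> -->
  <hadoop.version>2.5.0-cdh5.3.2</hadoop.version>
</properties>
```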

On Fri, Aug 7, 2015 at 8:45 PM, Benyi Wang <> wrote:
I'm trying to build Spark 1.4.1 against CDH 5.3.2. I created a profile called cdh5.3.2 in spark_parent.pom, made some changes for sql/hive/v0.13.1, and the build finished successfully.

Here is my problem:
  • If I run `mvn -Pcdh5.3.2,yarn,hive install`, the artifacts are installed into my local repo.
  • I expected the `hadoop-client` version to be `hadoop-client-2.5.0-cdh5.3.2`, but it is actually `hadoop-client-2.2.0`.
  • If I instead add a dependency on `spark-sql-1.2.0-cdh5.3.2` (roughly the declaration sketched below), the resolved version is `hadoop-client-2.5.0-cdh5.3.2`.
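
For reference, that dependency is roughly the following (Cloudera publishes its Spark builds under the usual org.apache.spark coordinates; the Scala 2.10 artifactId is an assumption on my part):

```xml
<!-- Cloudera's Spark SQL build for CDH 5.3.2 (Scala 2.10 coordinates assumed) -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.10</artifactId>
  <version>1.2.0-cdh5.3.2</version>
</dependency>
```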

What's the trick behind it?