spark-dev mailing list archives

From Steve Loughran <>
Subject Re: Building Spark with Custom Hadoop Version
Date Fri, 05 Feb 2016 10:18:05 GMT

> On 4 Feb 2016, at 23:11, Ted Yu <> wrote:
> Assuming your change is based on hadoop-2 branch, you can use 'mvn install' command which
> would put artifacts under 2.8.0-SNAPSHOT subdir in your local maven repo.

+ generally, unless you want to run all the Hadoop tests, set -DskipTests on the mvn
commands. The HDFS tests take a while and can use up all your file handles.

mvn install -DskipTests
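Putting the two halves together, the round trip looks roughly like this. It's a sketch, not a recipe: the profile and version names are assumptions based on the 2.8.0-SNAPSHOT mentioned above, so adjust them for your branch:

```shell
# In the Hadoop checkout: publish the snapshot artifacts to the local ~/.m2 repo
mvn install -DskipTests

# In the Spark checkout: build against that snapshot
# (-Phadoop-2.6 is the nearest profile; -Dhadoop.version overrides its default)
mvn package -DskipTests -Phadoop-2.6 -Dhadoop.version=2.8.0-SNAPSHOT
```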

here are the aliases I use:

export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m -Xms256m -Djava.awt.headless=true"
alias mi="mvn install -DskipTests"
alias mci="mvn clean install -DskipTests"
alias mvt="mvn test"
alias mvct="mvn clean test"
alias mvp="mvn package -DskipTests"
alias mvcp="mvn clean package -DskipTests"
alias mvnsite="mvn site:site -Dmaven.javadoc.skip=true -DskipTests"
alias mvndep="mvn dependency:tree -Dverbose"

mvndep > target/dependencies.txt is my command of choice when working out where some
random dependency is coming from.
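For instance, to chase down an unwanted Guava version once the tree is dumped to a file. The tree lines below are an illustrative excerpt I've made up, not real output:

```shell
# A real dump would come from: mvndep > target/dependencies.txt
# These lines are just a made-up excerpt for illustration.
cat > deps.txt <<'EOF'
[INFO] org.apache.spark:spark-core_2.10:jar:1.6.0
[INFO] +- org.apache.hadoop:hadoop-client:jar:2.8.0-SNAPSHOT:compile
[INFO] +- com.google.guava:guava:jar:14.0.1:provided
EOF

# grep for the artifact; -B1 prints the line above each hit,
# which in a tree dump points at whatever pulled it in
grep -B1 'guava' deps.txt
```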
