spark-issues mailing list archives

From "Markus Dale (JIRA)" <j...@apache.org>
Subject [jira] [Updated] (SPARK-5584) Add Maven Enforcer Plugin dependencyConvergence rule (fail false)
Date Wed, 04 Feb 2015 04:49:34 GMT

     [ https://issues.apache.org/jira/browse/SPARK-5584?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Markus Dale updated SPARK-5584:
-------------------------------
    Description: 
The Spark Maven build uses the Maven Enforcer plugin but does not enable its dependencyConvergence rule, which fails the build when a dependency is pulled in at conflicting versions through different dependency/transitive-dependency paths.

Adding the dependencyConvergence rule to the enforcer configuration in the main pom.xml of the current 1.3.0-SNAPSHOT:

{noformat}
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-enforcer-plugin</artifactId>
          <version>1.3.1</version>
          <executions>
            <execution>
              <id>enforce-versions</id>
              <goals>
                <goal>enforce</goal>
              </goals>
              <configuration>
                <rules>
                  <requireMavenVersion>
                    <version>3.0.4</version>
                  </requireMavenVersion>
                  <requireJavaVersion>
                    <version>${java.version}</version>
                  </requireJavaVersion>
                  <dependencyConvergence/>
                </rules>
              </configuration>
            </execution>
          </executions>
        </plugin>
{noformat}

Then running:

{noformat}
mvn -Pyarn -Phadoop-2.4 -Phive -DskipTests clean package -Denforcer.fail=false &> output.txt
{noformat}

identified many dependency convergence problems (one of them effectively re-opening SPARK-3039, which had been fixed by excluding the transitive dependency and explicitly including the desired version of the library).
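As a hedged sketch of that kind of fix (the coordinates and versions are taken from the paranamer conflict reported below; treat the exact placement and version choice as assumptions, not a committed change), the older transitive version is excluded where it enters the tree and the desired version is declared explicitly:

{noformat}
<!-- hadoop-client -> hadoop-common -> avro pulls in paranamer 2.3,
     conflicting with the 2.6 that json4s-core brings in -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.4.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.thoughtworks.paranamer</groupId>
      <artifactId>paranamer</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- explicitly include the desired version so resolution converges -->
<dependency>
  <groupId>com.thoughtworks.paranamer</groupId>
  <artifactId>paranamer</artifactId>
  <version>2.6</version>
</dependency>
{noformat}

An `<exclusion>` applies to the whole subtree of the declaring dependency, so it also removes versions brought in several levels down.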

The output contains many convergence errors like:

{noformat}
Dependency convergence error for com.thoughtworks.paranamer:paranamer:2.3 paths to dependency are:
+-org.apache.spark:spark-core_2.10:1.3.0-SNAPSHOT
  +-org.apache.hadoop:hadoop-client:2.4.0
    +-org.apache.hadoop:hadoop-common:2.4.0
      +-org.apache.avro:avro:1.7.6
        +-com.thoughtworks.paranamer:paranamer:2.3
and
+-org.apache.spark:spark-core_2.10:1.3.0-SNAPSHOT
  +-org.json4s:json4s-jackson_2.10:3.2.10
    +-org.json4s:json4s-core_2.10:3.2.10
      +-com.thoughtworks.paranamer:paranamer:2.6

[WARNING]
Dependency convergence error for io.netty:netty:3.8.0.Final paths to dependency are:
+-org.apache.spark:spark-core_2.10:1.3.0-SNAPSHOT
  +-org.spark-project.akka:akka-remote_2.10:2.3.4-spark
    +-io.netty:netty:3.8.0.Final
and
+-org.apache.spark:spark-core_2.10:1.3.0-SNAPSHOT
  +-org.seleniumhq.selenium:selenium-java:2.42.2
    +-org.webbitserver:webbit:0.4.14
      +-io.netty:netty:3.5.2.Final
{noformat}
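The duplicated paths above can also be reproduced without the enforcer; a hedged sketch (assuming a checked-out Spark source tree) is to ask Maven for a verbose dependency tree filtered to one conflicting artifact:

{noformat}
mvn dependency:tree -Dverbose -Dincludes=com.thoughtworks.paranamer:paranamer
{noformat}

With -Dverbose, Maven also prints the versions it omitted due to conflicts, which makes the diverging paths easy to spot.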



> Add Maven Enforcer Plugin dependencyConvergence rule (fail false)
> -----------------------------------------------------------------
>
>                 Key: SPARK-5584
>                 URL: https://issues.apache.org/jira/browse/SPARK-5584
>             Project: Spark
>          Issue Type: Improvement
>          Components: Build
>    Affects Versions: 1.2.0
>            Reporter: Markus Dale
>            Priority: Minor
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org

