spark-dev mailing list archives

From "Ulanov, Alexander" <alexander.ula...@hp.com>
Subject Spark maven project with the latest Spark jars
Date Tue, 05 Aug 2014 16:04:20 GMT
Hi,

I'm trying to create a Maven project that references the latest build of Spark.
1) downloaded the sources and compiled the latest version of Spark
2) installed the new spark-core jar into a new local Maven repo
3) created a Scala Maven project with net.alchim31.maven (scala-archetype-simple v 1.5)
4) added a dependency on the new spark-core inside the pom.xml
5) created a SparkContext in the code of this project: val sc = new SparkContext("local", "test") (a fuller sketch is below, after the error)
6) When I run it, I get the error:
Error:scalac: bad symbolic reference. A signature in RDD.class refers to term io
in package org.apache.hadoop which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RDD.class.
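
For completeness, the code in the project is essentially just the following (a minimal sketch; the object name and the parallelize/reduce call are only for illustration, the SparkContext line is exactly as in step 5):

    import org.apache.spark.SparkContext

    object Test {
      def main(args: Array[String]): Unit = {
        // same context as in step 5: "local" master, app name "test"
        val sc = new SparkContext("local", "test")
        // trivial action, just to exercise the RDD API
        val sum = sc.parallelize(1 to 10).reduce(_ + _)
        println(sum)
        sc.stop()
      }
    }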

This problem doesn't occur if I reference spark-core from the Maven repo instead of my locally built jar. What am I doing wrong?

Best regards, Alexander
