spark-user mailing list archives

From Patrick Wendell <pwend...@gmail.com>
Subject Re: Development version error on sbt compile publish-local
Date Sun, 12 Jan 2014 00:31:50 GMT
Can you try running "sbt/sbt clean"? Sometimes things can get randomly
corrupted and cause errors like this.
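For example, something like the following (just a sketch; it assumes you
still want the same SPARK_HADOOP_VERSION you used for the assembly build):

    sbt/sbt clean
    SPARK_HADOOP_VERSION=1.2.1 sbt/sbt compile publish-local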

On Sat, Jan 11, 2014 at 12:49 PM, Shing Hing Man <matmsh@yahoo.com> wrote:
> Hi,
>
> I have checked out the development version of Spark from
> git://github.com/apache/incubator-spark.git.
>
> I am trying to compile it with Scala 2.10.3.
>
> The following command completed successfully.
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
> But
>
> matmsh@gauss:~/Downloads/spark/github/incubator-spark> sbt compile publish-local
>
> gives the following error:
>
>
>
> [info] Compiling 1 Scala source to /home/matmsh/Downloads/spark/github/incubator-spark/repl/target/scala-2.10/classes...
> [info] Compiling 8 Scala sources to /home/matmsh/Downloads/spark/github/incubator-spark/streaming/target/scala-2.10/classes...
> [error] /home/matmsh/Downloads/spark/github/incubator-spark/streaming/src/main/scala/org/apache/spark/streaming/api/java/JavaPairDStream.scala:52: type mismatch;
> [error]  found   : org.apache.spark.streaming.DStream[(K, V)]
> [error]  required: org.apache.spark.streaming.api.java.JavaPairDStream[K,V]
> [error]  Note: implicit method fromPairDStream is not applicable here because it comes after the application point and it lacks an explicit result type
> [error]     dstream.filter((x => f(x).booleanValue()))
>
>
> Is there any way to resolve the above issue?
>
> Thanks in advance for your assistance!
>
>
> Shing
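
For reference, the "Note:" in the error above points at a general Scala rule:
an implicit conversion defined later in the same source file is only
considered at an earlier application point if it is declared with an explicit
result type. A minimal standalone sketch of that rule (the names
ImplicitOrderDemo, Wrapped, wrap and needsWrapped are made up for
illustration and are not part of the Spark code):

    import scala.language.implicitConversions

    object ImplicitOrderDemo {
      case class Wrapped(value: Int)

      def needsWrapped(w: Wrapped): Wrapped = w

      // If the ": Wrapped" result type were removed from `wrap` below,
      // uncommenting this line would fail with the same kind of message:
      // the implicit comes after this application point and would lack
      // an explicit result type.
      // val early = needsWrapped(42)

      implicit def wrap(i: Int): Wrapped = Wrapped(i) // explicit result type

      // Always fine: this use site comes after the implicit's definition.
      val late = needsWrapped(42)
    }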
