spark-user mailing list archives

From Adrian Tanase <atan...@adobe.com>
Subject Re: Building with SBT and Scala 2.11
Date Wed, 14 Oct 2015 05:05:53 GMT
Do you mean hadoop-2.4 or 2.6? Not sure if this is the issue, but I'm also compiling the 1.5.1
version with Scala 2.11 and Hadoop 2.6, and it works.

-adrian

Sent from my iPhone

On 14 Oct 2015, at 03:53, Jakob Odersky <jodersky@gmail.com> wrote:

I'm having trouble compiling Spark with SBT for Scala 2.11. The command I use is:

    dev/change-version-to-2.11.sh
    build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11

followed by

    compile

in the sbt shell.

The error I get specifically is:

spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308: no valid targets
for annotation on value conf - it is discarded unused. You may specify targets with meta-annotations,
e.g. @(transient @param)
[error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
[error]
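(For anyone hitting the same error: the compiler message itself points at the fix. A plain @transient on a constructor parameter that is not a field has no valid target, so the annotation is discarded; wrapping it with the @param meta-annotation directs it at the underlying parameter. A minimal, self-contained sketch, using a hypothetical class and a String in place of SparkConf:)

```scala
import scala.annotation.meta.param

// Hypothetical sketch: @(transient @param) targets the constructor
// parameter itself, which is what the compiler message suggests for
// a parameter that is not also a field.
class EndpointRefSketch(@(transient @param) conf: String) extends Serializable {
  def describe: String = "ref"
}

object Demo extends App {
  val ref = new EndpointRefSketch("spark.conf")
  println(ref.describe)
}
```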

However, I am also getting a large number of deprecation warnings, which makes me wonder if I am
supplying some incompatible/unsupported options to sbt. I am using Java 1.8 and the latest
Spark master sources.
Does anyone know if I am doing something wrong, or is the sbt build broken?

thanks for your help,
--Jakob

