spark-user mailing list archives

From Adrian Tanase <atan...@adobe.com>
Subject Re: Building with SBT and Scala 2.11
Date Wed, 14 Oct 2015 17:09:29 GMT
You are correct, of course. I gave up on sbt for Spark long ago; I never managed to get it working,
while mvn works great.
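
For reference, the maven build I have in mind is roughly the following (a sketch rather than my exact shell history; the -Phadoop-2.6 profile and the -DskipTests flag come from Spark's documented Maven build):

    # Switch the POMs to Scala 2.11, then build with the YARN and Hadoop 2.6 profiles
    dev/change-version-to-2.11.sh
    build/mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package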

Sent from my iPhone

On 14 Oct 2015, at 16:52, Ted Yu <yuzhihong@gmail.com> wrote:

Adrian:
Likely you were using maven.

Jakob's report was with sbt.

Cheers

On Tue, Oct 13, 2015 at 10:05 PM, Adrian Tanase <atanase@adobe.com> wrote:
Do you mean hadoop-2.4 or 2.6? Not sure if this is the issue, but I'm also compiling the 1.5.1
version with Scala 2.11 and Hadoop 2.6 and it works.

-adrian

Sent from my iPhone

On 14 Oct 2015, at 03:53, Jakob Odersky <jodersky@gmail.com> wrote:

I'm having trouble compiling Spark with SBT for Scala 2.11. The commands I use are:

    dev/change-version-to-2.11.sh
    build/sbt -Pyarn -Phadoop-2.11 -Dscala-2.11

followed by

    compile

in the sbt shell.

The error I get specifically is:

    spark/core/src/main/scala/org/apache/spark/rpc/netty/NettyRpcEnv.scala:308:
    no valid targets for annotation on value conf - it is discarded unused.
    You may specify targets with meta-annotations, e.g. @(transient @param)
    [error] private[netty] class NettyRpcEndpointRef(@transient conf: SparkConf)
    [error]

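For what it's worth, the meta-annotation form the message suggests would look roughly like this on a simplified stand-in class (SimpleConf and SimpleEndpointRef are made up here purely to show the syntax; I'm not claiming this is the right fix for the real NettyRpcEndpointRef):

    import scala.annotation.meta.param

    // Stand-in for SparkConf, just to keep the example self-contained.
    class SimpleConf(val timeoutMs: Long) extends Serializable

    // @(transient @param) gives the annotation an explicit target (the constructor
    // parameter), which is what the "no valid targets" message asks for.
    class SimpleEndpointRef(@(transient @param) conf: SimpleConf) extends Serializable {
      // conf is only used while constructing the object, so it is not kept as a field.
      val timeoutMs: Long = conf.timeoutMs
    }

    object MetaAnnotationDemo extends App {
      println(new SimpleEndpointRef(new SimpleConf(3000L)).timeoutMs) // prints 3000
    }
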
However, I am also getting a large number of deprecation warnings, which makes me wonder if I am
supplying some incompatible or unsupported options to sbt. I am using Java 1.8 and the latest
Spark master sources.
Does anyone know if I am doing something wrong, or is the sbt build broken?

thanks for your help,
--Jakob


