spark-dev mailing list archives

From Reynold Xin <>
Subject Akka problem when using scala command to launch Spark applications in the current 0.9.0-SNAPSHOT
Date Sat, 21 Dec 2013 07:36:28 GMT
It took me hours to debug a problem yesterday on the latest master branch
(0.9.0-SNAPSHOT), and I would like to share it with the dev list in case
anybody else runs into this Akka problem.

A little background for those of you who haven't followed the development
of Spark and YARN 2.2 closely: YARN 2.2 uses protobuf 2.5, while Akka
depends on an older version of protobuf that is not binary compatible with
it. In order to have a single build that is compatible with both YARN 2.2
and pre-2.2 YARN/Hadoop, we published a special version of Akka built with
protobuf shaded (i.e. the protobuf classes relocated under a different
package name).
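As a quick illustration of what shading means here, you can list a jar's contents and look at the protobuf package names (the jar file name below is illustrative, not the actual artifact name):

```shell
# List an Akka jar's classes and look for protobuf package names.
# In a shaded build, the protobuf classes appear under a relocated
# package rather than under com.google.protobuf.
# (akka-actor-shaded.jar is an illustrative file name.)
jar tf akka-actor-shaded.jar | grep -i protobuf
```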

However, it turns out that the Scala 2.10 binary distribution includes an
Akka jar on its default classpath (look in the lib folder of the
distribution). If you use the scala command to launch any Spark application
built from the current master branch, there is a pretty high chance that
you won't be able to create the SparkContext (stack trace at the end of
this email). The problem is that the Akka bundled with Scala 2.10 takes
precedence in the classloader over the special Akka version Spark includes.
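To see the conflicting jar for yourself, look inside the Scala distribution's lib folder (SCALA_HOME below is assumed to point at your Scala 2.10 install):

```shell
# The Scala 2.10 binary distribution bundles Akka in its lib folder,
# and the scala launcher script puts those jars on the classpath.
ls "$SCALA_HOME/lib"
```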

Until we have a good solution for this, the workaround is to launch the
application with java instead of scala. All you need to do is include the
right Scala jars (scala-library and scala-compiler) on the classpath. Note
that the scala command is really just a simple script that calls java with
the right classpath.
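A minimal sketch of the workaround; SCALA_HOME, the jar paths, and the main class name are all illustrative placeholders for your own setup:

```shell
# Launch with java directly so Spark's shaded Akka takes precedence
# over the Akka bundled with the Scala distribution.
SCALA_JARS="$SCALA_HOME/lib/scala-library.jar:$SCALA_HOME/lib/scala-compiler.jar"
java -cp "$SCALA_JARS:spark-assembly.jar:my-app.jar" com.example.MyApp
```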

Stack trace:

akka.remote.RemoteActorRefProvider.<init>(java.lang.String,$Settings, akka.event.EventStream,,
at java.lang.Class.getConstructor0(
at java.lang.Class.getDeclaredConstructor(
at scala.util.Try$.apply(Try.scala:161)
at scala.util.Success.flatMap(Try.scala:200)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:79)
at org.apache.spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:120)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:106)
