Compiling from source with Scala 2.11 support fixed this issue. Thanks again for the help!
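For anyone who finds this thread later: building Spark 1.4.x against Scala 2.11 looks roughly like the following, adapted from the "Building Spark" page of the 1.4 docs. The Hadoop/YARN profile flags here are assumptions and should be adjusted to match your environment:

```shell
# From the root of the Spark 1.4.x source tree:
# switch the build over to Scala 2.11 (script name as in the 1.4 source tree)
dev/change-version-to-2.11.sh

# build, skipping tests; -Pyarn / -Phadoop-2.6 are placeholders for your setup
mvn -Pyarn -Phadoop-2.6 -Dscala-2.11 -DskipTests clean package
```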

On Tue, Sep 8, 2015 at 7:33 AM, Gheorghe Postelnicu <> wrote:
Good point. It is a pre-compiled Spark version. Based on the text on the downloads page, the answer to your question is no, so I will download the sources and recompile.


On Tue, Sep 8, 2015 at 5:17 AM, Koert Kuipers <> wrote:
Is /opt/spark-1.4.1-bin-hadoop2.6 a Spark version compiled with Scala 2.11?

On Mon, Sep 7, 2015 at 5:29 PM, Gheorghe Postelnicu <> wrote:

sbt assembly; $SPARK_HOME/bin/spark-submit --class main.scala.TestMain --master "local[4]" target/scala-2.11/bof-assembly-0.1-SNAPSHOT.jar 

using Spark:


On Mon, Sep 7, 2015 at 10:20 PM, Jonathan Coveney <> wrote:
How are you building and running it?

On Monday, September 7, 2015, Gheorghe Postelnicu <> wrote:
Interesting idea. Tried that, didn't work. Here is my new SBT file:

name := """testMain"""

scalaVersion := "2.11.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.4.1" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.4.1" % "provided",
  "org.scala-lang" % "scala-reflect" % "2.11.6"
)

On Mon, Sep 7, 2015 at 9:55 PM, Jonathan Coveney <> wrote:
Try adding the following to your build.sbt

libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.6"

I believe Spark shades the Scala library, and scala-reflect looks like a library you need in unshaded form.
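For reference, the method named in the NoSuchMethodError is the Scala 2.11 runtime-reflection entry point, which Spark's DataFrame schema inference appears to call when `toDF` runs. A minimal sketch of the call that is failing (the object name is made up for illustration):

```scala
import scala.reflect.runtime.{universe => ru}

object ReflectCheck {
  def main(args: Array[String]): Unit = {
    // This is the call from the stack trace. It throws NoSuchMethodError at
    // runtime if an incompatible (e.g. Scala 2.10) scala-reflect jar is on
    // the classpath, because the 2.10 and 2.11 reflection APIs differ.
    val mirror = ru.runtimeMirror(getClass.getClassLoader)
    println(mirror.staticClass("java.lang.String").fullName)
  }
}
```

If this small program fails the same way under spark-submit, the mismatch is in the runtime classpath rather than in your code.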

2015-09-07 16:48 GMT-04:00 Gheorghe Postelnicu <>:

The following code, compiled with SBT and run via spark-submit, fails:

package main.scala

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object TestMain {
  def main(args: Array[String]): Unit = {
    implicit val sparkContext = new SparkContext()
    val sqlContext = new SQLContext(sparkContext)
    import sqlContext.implicits._
    sparkContext.parallelize(1 to 10).map(i => (i, i.toString)).toDF("intCol", "strCol")
  }
}

with the following error:

15/09/07 21:39:21 INFO BlockManagerMaster: Registered BlockManager
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
at main.scala.Bof$.main(Bof.scala:14)
at main.scala.Bof.main(Bof.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(
at sun.reflect.DelegatingMethodAccessorImpl.invoke(
at java.lang.reflect.Method.invoke(
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:665)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:170)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:193)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:112)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
15/09/07 21:39:22 INFO SparkContext: Invoking stop() from shutdown hook

whereas the code above works in a spark shell.

The code is compiled using Scala 2.11.6 and run against a precompiled Spark 1.4.1 distribution.

Any suggestion on how to fix this would be much appreciated.