spark-user mailing list archives

From Greg <g...@zooniverse.org>
Subject issue with Scala, Spark and Akka
Date Tue, 20 May 2014 15:01:53 GMT
Hi,
I have the following Scala code:
===---
import org.apache.spark.SparkContext

object Test {
  def main(args: Array[String]): Unit = {
    // "local" runs Spark in-process; the second argument is the application name
    val sc = new SparkContext("local", "Scala Word Count")
  }
}
===---
and the following build.sbt file:
===---
name := "test"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "0.9.1"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.0.4"

libraryDependencies += "org.mongodb" % "mongo-java-driver" % "2.11.4"

libraryDependencies += "org.mongodb" % "mongo-hadoop-core" % "1.0.0"

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
===---
I get the following error:
com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
	at com.typesafe.config.impl.SimpleConfig.findKey(test.sc3587202794988350330.tmp:111)
	at com.typesafe.config.impl.SimpleConfig.find(test.sc3587202794988350330.tmp:132)


Any suggestions on how to fix this?
thanks, Greg




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/issue-with-Scala-Spark-and-Akka-tp6103.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
