[ https://issues.apache.org/jira/browse/SPARK-3761?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14154982#comment-14154982 ]

Igor Tkachenko commented on SPARK-3761:
---------------------------------------

If I change the code to:

val count2 = sc
  .textFile("hdfs://:8020/tmp/data/risk/account.txt")
  //.filter(line => line.contains("Word"))
  .count()

it does work! It seems Spark can't work with the anonymous function. (A hedged sketch of a likely fix follows the quoted issue below.)

> Class not found exception / sbt 13.5 / Scala 2.10.4
> ---------------------------------------------------
>
>                 Key: SPARK-3761
>                 URL: https://issues.apache.org/jira/browse/SPARK-3761
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.0
>            Reporter: Igor Tkachenko
>
> I have this Scala code:
>
> val master = "spark://:7077"
> val sc = new SparkContext(new SparkConf()
>   .setMaster(master)
>   .setAppName("SparkQueryDemo 01")
>   .set("spark.executor.memory", "512m"))
>
> val count2 = sc
>   .textFile("hdfs://:8020/tmp/data/risk/account.txt")
>   .filter(line => line.contains("Word"))
>   .count()
>
> I get this error:
>
> [error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:0 failed 4 times, most recent failure: Exception failure in TID 6 on host : java.lang.ClassNotFoundException: SimpleApp$$anonfun$1
>
> My dependencies:
>
> object Version {
>   val spark     = "1.0.0-cdh5.1.0"
>   val hadoop    = "2.4.1"
>   val slf4j     = "1.7.6"
>   val logback   = "1.1.1"
>   val scalaTest = "2.1.0"
>   val mockito   = "1.9.5"
> }
>
> object Library {
>   val sparkCore      = "org.apache.spark"  %% "spark-assembly"  % Version.spark
>   val hadoopClient   = "org.apache.hadoop" %  "hadoop-client"   % Version.hadoop
>   val slf4jApi       = "org.slf4j"         %  "slf4j-api"       % Version.slf4j
>   val logbackClassic = "ch.qos.logback"    %  "logback-classic" % Version.logback
>   val scalaTest      = "org.scalatest"     %% "scalatest"       % Version.scalaTest
>   val mockitoAll     = "org.mockito"       %  "mockito-all"     % Version.mockito
> }
>
> My OS is Win 7
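
SimpleApp$$anonfun$1 is the class name scalac gives the compiled closure passed to filter; count() alone ships no user-defined classes, which matches the observation in the comment above. The usual cause of this ClassNotFoundException is that the driver never tells the executors where to fetch the application JAR, so the closure fails to deserialize on the workers. Below is a minimal sketch of one common fix, setting the JAR explicitly on the SparkConf. The <master-host> and <namenode-host> placeholders and the JAR path are hypothetical, not values from the report (the hosts were elided in the original).

import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("spark://<master-host>:7077")   // host elided in the original report
      .setAppName("SparkQueryDemo 01")
      .set("spark.executor.memory", "512m")
      // Ship the application JAR to the executors so the compiled closure
      // class (SimpleApp$$anonfun$1) can be loaded on the worker side.
      // The path is a hypothetical example; point it at the JAR sbt builds.
      .setJars(Seq("target/scala-2.10/simpleapp_2.10-0.1.jar"))

    val sc = new SparkContext(conf)

    val count2 = sc
      .textFile("hdfs://<namenode-host>:8020/tmp/data/risk/account.txt")
      .filter(line => line.contains("Word"))   // now resolvable on the executors
      .count()

    println("Matching lines: " + count2)
    sc.stop()
  }
}

Equivalently, launching with bin/spark-submit and passing the application JAR achieves the same thing, or sc.addJar(...) can be called after the context is created. It may also be worth checking the build definition: depending on spark-assembly where spark-core would be conventional can produce a classpath that differs from what the cluster runs.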