spark-user mailing list archives

From Nick Pentreath <nick.pentre...@gmail.com>
Subject Expect only DirectTaskResults when using LocalScheduler
Date Thu, 16 Jan 2014 12:37:37 GMT
This has me puzzled.

I'm using 0.8.1-incubating and trying to run a fairly simple mapValues on
an RDD that is the result of computing an MLlib ALS model (so it is an
RDD[(Int, Array[Double])]).
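For concreteness, a minimal sketch of the scenario described above, assuming the Spark/MLlib 0.8.x API; the ratings data, parameter values, and the particular mapValues computation are hypothetical stand-ins, not the original job:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._ // implicit PairRDDFunctions for mapValues
import org.apache.spark.mllib.recommendation.{ALS, Rating}

object AlsMapValuesRepro {
  def main(args: Array[String]): Unit = {
    // Local mode, which is where the LocalScheduler error surfaces
    val sc = new SparkContext("local", "als-repro")

    // Hypothetical ratings: (user, product, rating)
    val ratings = sc.parallelize(Seq(
      Rating(1, 1, 5.0), Rating(1, 2, 3.0), Rating(2, 1, 4.0)
    ))

    // rank = 10, iterations = 20, lambda = 0.01 (illustrative values)
    val model = ALS.train(ratings, 10, 20, 0.01)

    // model.userFeatures is an RDD[(Int, Array[Double])]; the failure
    // occurred during a simple mapValues over an RDD of this shape.
    val norms = model.userFeatures.mapValues { f =>
      math.sqrt(f.map(x => x * x).sum)
    }
    norms.count() // action that triggers the job

    sc.stop()
  }
}
```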

I get the following failure, which I've never come across before:

org.apache.spark.SparkException (org.apache.spark.SparkException: Expect
only DirectTaskResults when using LocalScheduler)

org.apache.spark.scheduler.local.LocalTaskSetManager.taskEnded(LocalTaskSetManager.scala:147)
org.apache.spark.scheduler.local.LocalScheduler.statusUpdate(LocalScheduler.scala:200)
org.apache.spark.executor.Executor$TaskRunner$$anonfun$run$1.apply$mcV$sp(Executor.scala:252)
org.apache.spark.deploy.SparkHadoopUtil.runAsUser(SparkHadoopUtil.scala:50)
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:182)
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
java.lang.Thread.run(Thread.java:724)
