spark-user mailing list archives

From bliang <bli...@thecarousell.com>
Subject MovieALS Implicit Error
Date Mon, 13 Jul 2015 10:55:35 GMT
Hi,

I am trying to run the MovieALS example with an implicit dataset and am receiving this error:
Got 3856988 ratings from 144250 users on 378937 movies.
Training: 3085522, test: 771466.

15/07/13 10:43:07 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
15/07/13 10:43:07 WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS
15/07/13 10:43:10 WARN TaskSetManager: Lost task 3.0 in stage 29.0 (TID 192, 10.162.45.33): java.lang.AssertionError: assertion failed: lapack.dppsv returned 1.
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
    at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

15/07/13 10:43:10 ERROR TaskSetManager: Task 12 in stage 29.0 failed 4 times; aborting job

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 12 in stage 29.0 failed 4 times, most recent failure: Lost task 12.3 in stage 29.0 (TID 249, 10.162.45.33): java.lang.AssertionError: assertion failed: lapack.dppsv returned 1.
    at scala.Predef$.assert(Predef.scala:179)
    at org.apache.spark.ml.recommendation.ALS$CholeskySolver.solve(ALS.scala:386)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1163)
    at org.apache.spark.ml.recommendation.ALS$$anonfun$org$apache$spark$ml$recommendation$ALS$$computeFactors$1.apply(ALS.scala:1124)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at org.apache.spark.rdd.PairRDDFunctions$$anonfun$mapValues$1$$anonfun$apply$41$$anonfun$apply$42.apply(PairRDDFunctions.scala:700)
    at scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
    at org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:277)
    at org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:171)
    at org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:78)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:242)
    at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:35)
    at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:277)
    at org.apache.spark.rdd.RDD.iterator(RDD.scala:244)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:63)
    at org.apache.spark.scheduler.Task.run(Task.scala:70)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:213)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)

Driver stacktrace:
    at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1266)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1257)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1256)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1256)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
    at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:730)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:730)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450)
    at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411)
    at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
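For reference, the implicit run is wired up roughly like this. This is only a minimal, self-contained sketch against the spark.ml ALS API that the stack trace points at; the toy data and the rank/regParam/alpha values are illustrative placeholders, not the exact settings the example uses:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.ml.recommendation.ALS
import org.apache.spark.sql.SQLContext

object ImplicitALSSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("ImplicitALSSketch"))
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._

    // Placeholder data: (user, item, implicit feedback count).
    // The real run reads the MovieLens ratings file instead.
    val ratings = sc.parallelize(Seq(
      (0, 0, 4.0), (0, 1, 1.0), (1, 1, 2.0), (2, 0, 3.0)
    )).toDF("user", "item", "rating")

    val Array(training, test) = ratings.randomSplit(Array(0.8, 0.2))

    val als = new ALS()
      .setImplicitPrefs(true) // treat "rating" as implicit confidence, not an explicit score
      .setRank(10)            // placeholder: number of latent factors
      .setMaxIter(10)         // placeholder: number of ALS sweeps
      .setRegParam(0.1)       // placeholder: nonzero regularization (lambda)
      .setAlpha(1.0)          // placeholder: confidence weighting for implicit feedback
      .setUserCol("user")
      .setItemCol("item")
      .setRatingCol("rating")

    val model = als.fit(training)
    model.transform(test).show()

    sc.stop()
  }
}

(The assertion is raised in ALS$CholeskySolver.solve at ALS.scala:386; per the LAPACK docs, dppsv returning a positive value means the Cholesky factorization could not be completed because the matrix is not positive definite, which is why I am wondering whether the parameters above play a role.)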
Would it be possible to help me out?

Thank you,
Ben



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/MovieALS-Implicit-Error-tp23793.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.