spark-dev mailing list archives

From Debasish Das <>
Subject Using mllib-1.1.0-SNAPSHOT on Spark 1.0.1
Date Sat, 02 Aug 2014 17:13:12 GMT

I have deployed stable Spark 1.0.1 on the cluster, but I have new code that
I added to mllib-1.1.0-SNAPSHOT.

I am trying to access the new code using spark-submit as follows:

spark-submit --class com.verizon.bda.mllib.recommendation.ALSDriver \
  --executor-memory 16g --total-executor-cores 16 \
  --jars sag-core-0.0.1-SNAPSHOT.jar \
  --rank 25 --numIterations 10 --lambda 1.0 --qpProblem 2 inputPath outputPath
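For reference, a sketch of the canonical spark-submit layout for this job: submit options first, then the primary application jar, then the application's own arguments. This assumes the driver class lives in sag-core and that the snapshot mllib and scopt jars (the ones the logs below show being added) are meant to ride along via --jars; the jar paths are assumptions.

```shell
# Sketch of the standard Spark 1.x spark-submit form (not the exact command
# from the original post): options, then app jar, then app arguments.
spark-submit \
  --class com.verizon.bda.mllib.recommendation.ALSDriver \
  --executor-memory 16g \
  --total-executor-cores 16 \
  --jars spark-mllib_2.10-1.1.0-SNAPSHOT.jar,scopt_2.10-3.2.0.jar \
  sag-core-0.0.1-SNAPSHOT.jar \
  --rank 25 --numIterations 10 --lambda 1.0 --qpProblem 2 inputPath outputPath
```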

I can see the jars are getting added to httpServer as expected:

14/08/02 12:50:04 INFO SparkContext: Added JAR
file:/vzhome/v606014/spark-glm/spark-mllib_2.10-1.1.0-SNAPSHOT.jar at with
timestamp 1406998204236

14/08/02 12:50:04 INFO SparkContext: Added JAR
file:/vzhome/v606014/spark-glm/scopt_2.10-3.2.0.jar at with timestamp

14/08/02 12:50:04 INFO SparkContext: Added JAR
file:/vzhome/v606014/spark-glm/sag-core-0.0.1-SNAPSHOT.jar at with timestamp

But the job still can't access the code from the mllib-1.1.0-SNAPSHOT jar. I
think it's picking up the mllib deployed on the cluster, which is at 1.0.1.

Please help. I will open a PR tomorrow, but internally we want to generate
results from the new code first.
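One avenue worth trying (a sketch, not verified against 1.0.1): the Spark assembly on the executors bundles mllib 1.0.1 and is loaded ahead of jars added via --jars, so the cluster's classes shadow the snapshot. The experimental `spark.files.userClassPathFirst` setting asks executors to prefer user-added jars instead:

```shell
# Sketch: ask executors to prefer user-added jars over Spark's bundled
# classes (which include mllib 1.0.1). Experimental in Spark 1.x.
# If --conf is not accepted by spark-submit on 1.0.1, the property can go
# in conf/spark-defaults.conf instead:
#   spark.files.userClassPathFirst   true
spark-submit \
  --conf spark.files.userClassPathFirst=true \
  --class com.verizon.bda.mllib.recommendation.ALSDriver \
  --jars spark-mllib_2.10-1.1.0-SNAPSHOT.jar,scopt_2.10-3.2.0.jar \
  sag-core-0.0.1-SNAPSHOT.jar \
  --rank 25 --numIterations 10 --lambda 1.0 --qpProblem 2 inputPath outputPath
```

Note this setting affects only executors; the driver would still resolve mllib from the 1.0.1 assembly unless the snapshot jar is also prepended to the driver's classpath (e.g. via --driver-class-path).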


