spark-user mailing list archives

From Jacques Basaldúa <>
Subject Problem with SparkR
Date Sun, 23 Mar 2014 23:48:45 GMT
I am really interested in using Spark from R and have tried to use SparkR,
but always get the same error.


This is how I installed it:


 - I successfully installed Spark 0.9.0 with Scala 2.10.3 (OpenJDK
64-Bit Server VM, Java 1.7.0_45)

   I can run the examples from the spark-shell and from Python


 - I installed the R package devtools and then installed SparkR with:


 - library(devtools)

 - install_github("amplab-extras/SparkR-pkg", subdir="pkg")


  This compiled the package successfully.


When I try to use the package:




  sc <- sparkR.init(master="local")    # so far, the program runs fine


  rdd <- parallelize(sc, 1:10)         # this returns the following error


  Error in .jcall(getJRDD(rdd), "Ljava/util/List;", "collect") : 



No matter how I try to use sc (I have tried all the examples), I always
get an error.
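For reference, here is a sketch of the minimal session I am attempting, based on the examples in the SparkR-pkg README (the `collect`/`count` calls are the standard SparkR-pkg actions from those examples; I never get past `parallelize`):

```r
library(SparkR)

sc  <- sparkR.init(master = "local")   # succeeds
rdd <- parallelize(sc, 1:10)           # fails with the .jcall error above
out <- collect(rdd)                    # never reached; should return the list 1..10
count(rdd)                             # never reached; should return 10
```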


Any ideas?


