spark-user mailing list archives

From Gilad Landau <>
Subject Using R code as part of a Spark Application
Date Wed, 29 Jun 2016 13:40:29 GMT

I want to use R code as part of a Spark application (the same way I would with Scala/Python).
Specifically, I want to run an R function as a map over a large Spark DataFrame loaded from
a parquet file.
Is this even possible, or is the only way to use R to have RStudio orchestrate our
Spark cluster?
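To illustrate, this is roughly what I have in mind, sketched with SparkR's dapply (assuming a Spark 2.0+ SparkR API; the parquet path, column names, and output schema below are made up for the example):

```r
library(SparkR)
sparkR.session()

# Load a Spark DataFrame from a parquet file (hypothetical path)
df <- read.parquet("/data/events.parquet")

# Output schema that the R function below will produce
schema <- structType(structField("value", "double"),
                     structField("doubled", "double"))

# dapply hands each partition to the R function as a local data.frame
# and expects a data.frame back that matches `schema`
result <- dapply(df, function(pdf) {
  pdf$doubled <- pdf$value * 2
  pdf
}, schema)

head(result)
```

If something like this works, it would mean plain R syntax can be applied across the whole distributed DataFrame, rather than only on data collected into the driver.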

Thanks for the help!
