spark-user mailing list archives

From sujeet jog <sujeet....@gmail.com>
Subject Re: Using R code as part of a Spark Application
Date Wed, 29 Jun 2016 13:54:05 GMT
Try Spark's pipe() on RDDs (pipeRDD): you can invoke the R script via pipe, pushing the data you want the Rscript to process onto its stdin and reading the results back from its stdout.
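To make the suggestion concrete: `rdd.pipe(cmd)` writes each element of a partition as a line to the external command's stdin and turns each line of the command's stdout into an element of the resulting RDD of strings. Below is a minimal, hedged sketch of that per-partition mechanism in plain Python, with a tiny inline one-liner standing in for the R script (in Spark itself you would call something like `rdd.pipe("Rscript my_script.R")`, where `my_script.R` is a hypothetical script that reads lines from stdin and writes results to stdout).

```python
import subprocess
import sys

def pipe_partition(elements, cmd):
    """Simulate RDD.pipe() for one partition: feed elements line-by-line
    to cmd's stdin and return its stdout lines as the new elements."""
    proc = subprocess.run(
        cmd,
        input="\n".join(str(e) for e in elements) + "\n",
        capture_output=True,
        text=True,
        check=True,
    )
    return proc.stdout.splitlines()

# Stand-in for `Rscript my_script.R`: doubles each number read from stdin.
# (Any executable that reads stdin and writes stdout works the same way.)
cmd = [sys.executable, "-c",
       "import sys\nfor line in sys.stdin: print(int(line) * 2)"]

result = pipe_partition([1, 2, 3], cmd)
print(result)
```

Note that pipe() only moves strings, so you are responsible for serializing your DataFrame rows into a line format the R script can parse (e.g. CSV) and for parsing its output lines back.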


On Wed, Jun 29, 2016 at 7:10 PM, Gilad Landau <Gilad.Landau@clicktale.com>
wrote:

> Hello,
>
>
>
> I want to use R code as part of a Spark application (the same way I would
> with Scala/Python).  I want to be able to run R code as a map function
> on a big Spark dataframe loaded from a Parquet file.
>
> Is this even possible, or is the only way to use R as part of an RStudio
> orchestration of our Spark cluster?
>
>
>
> Thanks for the help!
>
>
>
> Gilad
>
>
>
