From "Sun, Rui" <rui....@intel.com>
Subject RE: Share RDD from SparkR and another application
Date Tue, 14 Jul 2015 07:26:49 GMT
Hi, Hari,

I don't think job-server can work with SparkR (or PySpark). It seems technically possible, but
it would need support from both job-server and SparkR (or PySpark) that doesn't exist yet.

But there are some indirect ways of sharing RDDs between SparkR and an application. For
example, the application can save the RDD as a text file, which SparkR can later load to
create a SparkR RDD, or vice versa. (Unfortunately, SparkR currently only supports creating
RDDs from text files, not from arbitrary Hadoop files.) It would be easier if your application
can use DataFrames: SparkR can create and manipulate DataFrames from virtually any external
data source, so they can serve as an intermediate medium for data exchange.
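Just to illustrate, here is a rough sketch in SparkR (Spark 1.4). The HDFS paths are only
placeholders, and it assumes both applications can see the same file system:

    # In the sparkR shell `sc` is predefined; otherwise sc <- sparkR.init()
    sqlContext <- sparkRSQL.init(sc)

    # Read a Parquet data set that the other application has written out:
    df <- read.df(sqlContext, "hdfs:///tmp/shared/events", source = "parquet")
    head(df)

    # Going the other way, write a SparkR DataFrame for the other application
    # to pick up:
    write.df(df, path = "hdfs:///tmp/shared/from_sparkr", source = "parquet",
             mode = "overwrite")

    # The text-file route mentioned above: load a file the other application
    # saved with saveAsTextFile(). Note the SparkR RDD API is not exported in
    # 1.4, so textFile() has to be reached as an internal function:
    rdd <- SparkR:::textFile(sc, "hdfs:///tmp/shared/lines")

Parquet keeps the schema with the data, so the DataFrame route also avoids re-parsing the
data on the SparkR side.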
________________________________________
From: harirajaram [hari.rajaram@gmail.com]
Sent: Monday, July 13, 2015 8:30 PM
To: user@spark.apache.org
Subject: Share RDD from SparkR and another application

Hello,
I would like to share an RDD between an application and SparkR.
I understand we have job-server and the IBM Spark Kernel for sharing the context across
different applications, but I am not sure how we can use them with SparkR, as it is
some sort of front end (an R shell) to Spark.
Any insights appreciated.

Hari





--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Share-RDD-from-SparkR-and-another-application-tp23795.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


