spark-user mailing list archives

From Jayant Shekhar <jayantbaya...@gmail.com>
Subject Getting a DataFrame back as result from SparkIMain
Date Wed, 22 Jun 2016 00:39:20 GMT
Hi,

I have written a program using SparkIMain which creates an RDD, and I am
looking for a way to access that RDD from my normal Spark/Scala code for
further processing.


The code below binds the SparkContext:

sparkIMain.bind("sc", "org.apache.spark.SparkContext", sparkContext,
List("""@transient"""))


The code below creates an RDD:

val code = "val data = Array(1, 2, 3, 4, 5)"
sparkIMain.interpret(code)

val code1 = "val distData = sc.parallelize(data)"
sparkIMain.interpret(code1)


I am looking for a way to get the RDD 'distData' back into my Spark code so
that I can pass it on to the next processing steps.
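One approach I am considering (an untested sketch, assuming the interpreter
exposes IMain.valueOfTerm, which returns Option[Any]) is to look the binding
up by name and cast it back to the expected type. The snippet below simulates
the lookup with a plain Map so it is self-contained; with SparkIMain the call
would be sparkIMain.valueOfTerm("distData") and the cast target RDD[Int]:

```scala
object RetrieveSketch {
  // Simulated interpreter bindings; stands in for SparkIMain's symbol table.
  val bindings: Map[String, Any] = Map("distData" -> Seq(1, 2, 3, 4, 5))

  // Same shape as IMain.valueOfTerm(name: String): Option[Any].
  def valueOfTerm(name: String): Option[Any] = bindings.get(name)

  def main(args: Array[String]): Unit = {
    // The value comes back as Any, so an erasure-unchecked cast is needed.
    val distData: Seq[Int] = valueOfTerm("distData") match {
      case Some(xs: Seq[Int @unchecked]) => xs
      case _ => sys.error("distData not bound in the interpreter")
    }
    // From here the value is usable in ordinary code:
    println(distData.sum) // prints 15
  }
}
```

With the real interpreter the cast would be to RDD[Int], and the name
"distData" must match the identifier defined in the interpreted code.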

Any pointers are welcome! The above framework is used in a more general
case where the code is supplied by the user at runtime.

Thanks,
Jayant
