spark-user mailing list archives

From Silvio Fiorito <silvio.fior...@granturing.com>
Subject Re: Can Spark Provide Multiple Context Support?
Date Tue, 08 Sep 2015 17:23:45 GMT
Is the data from HDFS static or is it unique for each event in the stream? If it’s static,
you can just create the SparkContext, load the files from HDFS, then start a StreamingContext
with the existing SparkContext and go from there.
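A minimal sketch of that arrangement (class name, HDFS path, and batch interval are placeholders; the Kafka wiring is elided): create one SparkContext, load and cache the static HDFS data, then build the StreamingContext on top of the same context so batch jobs inside the stream can reuse the cached RDD.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StaticPlusStreaming {
    public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("StaticPlusStreaming");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Load the static HDFS data once, up front, and cache it for reuse
        // across every micro-batch.
        JavaRDD<String> staticData = sc.textFile("hdfs:///path/to/files").cache();

        // Build the StreamingContext on the existing SparkContext rather than
        // creating a second context.
        JavaStreamingContext ssc = new JavaStreamingContext(sc, Durations.seconds(10));

        // ... create the Kafka DStream here (e.g. via KafkaUtils), then combine
        // each batch with staticData using transform() or foreachRDD().

        ssc.start();
        ssc.awaitTermination();
    }
}
```

Inside a `transform()` or `foreachRDD()` call you can join or look up against `staticData` directly, since everything runs under the single shared SparkContext.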

From: Rachana Srivastava
Date: Tuesday, September 8, 2015 at 1:12 PM
To: "user@spark.apache.org"
Subject: Can Spark Provide Multiple Context Support?

Question: How does Spark support multiple contexts?

Background: I have a stream of data coming to Spark from Kafka. For each event in the stream I want to download some files from HDFS and process their contents. I have written code to process the files from HDFS, and I have code to process the stream data from Kafka using the Spark Streaming API, but I have not been able to link the two.

Can you please let me know if it is feasible to create a JavaRDD from a file inside a Spark Streaming job's processing step?

Thanks,

Rachana