spark-user mailing list archives

From Tobias Pfeiffer <...@preferred.jp>
Subject Multi-tenancy for Spark (Streaming) Applications
Date Thu, 04 Sep 2014 01:30:30 GMT
Hi,

I am not sure if "multi-tenancy" is the right word, but I am thinking about
a Spark application where multiple users can, say, log into some web
interface and specify a data processing pipeline with streaming source,
processing steps, and output.

Now, as far as I know, there can be only one StreamingContext per JVM, and I cannot add sources or processing steps once it has been started. Are there any ideas/suggestions for how to achieve dynamic adding and removing of input sources and processing pipelines? Do I need a separate 'java' process per user?
Also, can I realize such a thing when using YARN for dynamic allocation?

Thanks
Tobias
