spark-user mailing list archives

From Jianshi Huang <jianshi.hu...@gmail.com>
Subject Dynamically loaded Spark-stream consumer
Date Thu, 23 Oct 2014 07:36:08 GMT
I have a use case where I need to continuously ingest data from a Kafka
stream. Apart from the ingestion itself (into HBase), I also need to compute
some metrics (e.g. the average over the last minute).
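
For concreteness, the current setup looks roughly like the sketch below
(Scala; the ZooKeeper host, the "events" topic, the 10-second batch interval
and the assumption of numeric message bodies are just placeholders, and the
HBase write itself is omitted):

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaIngestWithMetrics {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf().setAppName("kafka-ingest-with-metrics")
        val ssc  = new StreamingContext(conf, Seconds(10))

        // Receiver-based Kafka stream of (key, message) pairs from the "events" topic.
        val messages = KafkaUtils.createStream(
          ssc, "zk-host:2181", "ingest-group", Map("events" -> 1))

        // Ingestion: write each batch to HBase (client code omitted).
        messages.foreachRDD { rdd =>
          rdd.foreachPartition { partition =>
            // open an HBase connection, write the records, close the connection
          }
        }

        // Metric: average of the numeric message bodies over the last minute,
        // recomputed every 10-second batch.
        val avgLastMinute = messages
          .map { case (_, body) => (body.toDouble, 1L) }
          .reduceByWindow((a, b) => (a._1 + b._1, a._2 + b._2), Seconds(60), Seconds(10))
          .map { case (sum, count) => sum / count }
        avgLastMinute.print()

        ssc.start()
        ssc.awaitTermination()
      }
    }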

The problem is that I will very likely keep adding more metrics over time,
and I don't want to have to restart my Spark program each time I do.

Is there a mechanism by which Spark Streaming can load and plug in code at
runtime, without restarting the job?
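
What I have in mind is something like keeping the metric definitions behind a
registry that the job consults on every batch, so new metric functions
(loaded from a freshly deployed jar or a script) can be added while the job
keeps running. Roughly the shape below -- just a sketch of the idea, not
something I know Spark to provide; MetricRegistry and the method names are
made up:

    import java.util.concurrent.ConcurrentHashMap
    import scala.collection.JavaConverters._
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.Seconds
    import org.apache.spark.streaming.dstream.DStream

    // Driver-side registry of metric functions, keyed by name. New entries
    // would be registered at runtime (e.g. after loading classes from a new
    // jar), without stopping the StreamingContext.
    object MetricRegistry {
      private val metrics = new ConcurrentHashMap[String, RDD[Double] => Double]()
      def register(name: String, f: RDD[Double] => Double): Unit = metrics.put(name, f)
      def all: Map[String, RDD[Double] => Double] = metrics.asScala.toMap
    }

    object DynamicMetrics {
      // Evaluate whatever metrics are registered at the time each batch runs.
      def attach(values: DStream[Double]): Unit = {
        values.window(Seconds(60), Seconds(10)).foreachRDD { rdd =>
          MetricRegistry.all.foreach { case (name, f) =>
            println(s"$name = ${f(rdd)}")
          }
        }
      }
    }

    // Later, while the job is running, a new metric would be added like this:
    // MetricRegistry.register("avg-last-min",
    //   rdd => if (rdd.count() == 0) 0.0 else rdd.reduce(_ + _) / rdd.count())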

Any solutions or suggestions?

Thanks,
-- 
Jianshi Huang

LinkedIn: jianshi
Twitter: @jshuang
Github & Blog: http://huangjs.github.com/
