spark-user mailing list archives

From Oded Maimon <>
Subject Few basic spark questions
Date Sun, 12 Jul 2015 13:49:04 GMT
Hi All,
we are evaluating Spark for real-time analytics. What we are trying to do is
the following:

   - READER APP - use a custom receiver to get data from RabbitMQ (written in
   - ANALYZER APP - use a SparkR application to read the data (windowed),
   analyze it every minute, and save the results inside Spark
   - OUTPUT APP - use a Spark application (Scala/Java/Python) to read the
   results from R every X minutes and send the data to a few external systems
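For the READER step, a custom receiver along these lines could feed RabbitMQ messages into a Spark Streaming DStream. This is only a minimal sketch: it assumes the RabbitMQ Java client (`com.rabbitmq.client`) on the classpath, and the host and queue names are placeholders; error handling, acknowledgements, and reconnection are omitted.

```scala
import com.rabbitmq.client.{AMQP, ConnectionFactory, DefaultConsumer, Envelope}
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical receiver: host and queue are illustrative parameters.
class RabbitMQReceiver(host: String, queue: String)
    extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {

  override def onStart(): Unit = {
    // Consume on a separate thread so onStart() returns promptly,
    // as the Receiver contract requires.
    new Thread("RabbitMQ Receiver") {
      override def run(): Unit = receive()
    }.start()
  }

  override def onStop(): Unit = {
    // Channel/connection cleanup would go here.
  }

  private def receive(): Unit = {
    val factory = new ConnectionFactory()
    factory.setHost(host)
    val connection = factory.newConnection()
    val channel = connection.createChannel()
    // autoAck = true keeps the sketch simple; a production receiver
    // would ack only after store() succeeds.
    channel.basicConsume(queue, true, new DefaultConsumer(channel) {
      override def handleDelivery(consumerTag: String,
                                  envelope: Envelope,
                                  properties: AMQP.BasicProperties,
                                  body: Array[Byte]): Unit = {
        // store() hands each message to Spark for batching/replication.
        store(new String(body, "UTF-8"))
      }
    })
  }
}
```

The app would then create the stream with something like `ssc.receiverStream(new RabbitMQReceiver("localhost", "events"))`, yielding a `DStream[String]` for the downstream analyzers.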

Basically, at the end I would like to have the READER COMPONENT as an app
that always consumes the data and keeps it in Spark,
have as many ANALYZER COMPONENTS as my data scientists want, and have one
OUTPUT APP that will read the ANALYZER results and send them to any relevant
system.

What is the right way to do this?


