Hi
you might want to have a look at a regression ML algorithm and integrate it into your Spark Streaming application; I'm sure someone on the list has a similar use case.
In short, you'd want to process all your events and feed them through an ML model which, based on your inputs, will predict the output.
You say that your events carry minute values for the next 2-3 hrs... gather data for a day and train your model on that. Then save it somewhere, have your streaming app load the model, and have the model make predictions based on the incoming events from your streaming app.
Save the results somewhere and have your dashboard periodically poll your data store to read the predictions.
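To make the flow concrete, here is a minimal stdlib-only sketch of that pipeline: train offline on a day of (minute, value) samples, persist the model, then load it on the streaming side and predict. Everything here is illustrative — in a real app you would use Spark MLlib's regression models and a proper model store rather than this hand-rolled least-squares fit and pickle file.

```python
import pickle
import statistics

# --- offline job: "train" a toy linear model on a day of (minute, value) samples ---
# (stand-in for fitting a real regression model, e.g. with Spark MLlib)
def train(samples):
    xs = [m for m, _ in samples]
    ys = [v for _, v in samples]
    mx, my = statistics.mean(xs), statistics.mean(ys)
    slope = sum((x - mx) * (y - my) for x, y in samples) / sum((x - mx) ** 2 for x in xs)
    return {"slope": slope, "intercept": my - slope * mx}

def save_model(model, path):
    with open(path, "wb") as f:
        pickle.dump(model, f)

def load_model(path):
    with open(path, "rb") as f:
        return pickle.load(f)

def predict(model, minute):
    return model["intercept"] + model["slope"] * minute

# offline: gather a day of data (here synthetic, exactly linear) and persist the model
day_samples = [(m, 2.0 * m + 5.0) for m in range(0, 1440, 10)]
save_model(train(day_samples), "model.pkl")

# streaming side: load the model once, then predict per incoming event
model = load_model("model.pkl")
print(round(predict(model, 100), 1))  # exactly linear training data -> 205.0
```

The results of `predict` would then be written to a data store that the dashboard polls.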
I have seen people on the list doing ML over a Spark Streaming app; I'm sure someone can reply back....
Hopefully I've given you a starting point....

hth
 marco

On 2 Jan 2017 4:03 pm, "Daniela S" <daniela_4444@gmx.at> wrote:
Hi
 
I am trying to solve the following problem with Spark Streaming.
I receive timestamped events from Kafka. Each event refers to a device and contains values for every minute of the next 2 to 3 hours. What I would like to do is predict the minute values for the next 24 hours, i.e. use the known values and predict the remaining ones to cover the full 24 hours. My thought was to use arrays with a length of 1440 (1440 minutes = 24 hours), one for the known values and one for the predicted values, per device. Then I would like to show the next 24 hours on a dashboard. The dashboard should be updated automatically in real time.
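One way to picture the array approach is the sketch below: a timestamp maps to a slot in a 1440-element per-device array, and the known and predicted arrays are merged with known values taking precedence. The function and variable names are illustrative, not from any particular library.

```python
from datetime import datetime, timezone

MINUTES_PER_DAY = 1440  # 24 hours * 60 minutes

def minute_index(ts, day_start):
    # map a timestamp to its slot in the 1440-element day array
    return int((ts - day_start).total_seconds() // 60) % MINUTES_PER_DAY

def merge(known, predicted):
    # known values win; predictions fill the gaps (None = no known value yet)
    return [k if k is not None else p for k, p in zip(known, predicted)]

day_start = datetime(2017, 1, 2, tzinfo=timezone.utc)
known = [None] * MINUTES_PER_DAY      # filled from incoming events
predicted = [0.0] * MINUTES_PER_DAY   # filled by the model

# an event carrying a known value for the minute 16:03
ts = datetime(2017, 1, 2, 16, 3, tzinfo=timezone.utc)
known[minute_index(ts, day_start)] = 42.0

full_day = merge(known, predicted)
print(full_day[16 * 60 + 3])  # prints 42.0
```

This also answers the timestamp question: the array index is just the minute offset from some fixed day start, so index and timestamp are interconvertible.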
 
My questions:
Is this a feasible solution?
How can I combine known future values and predicted values?
How should I treat the timestamps, given that the 1440 array positions do not directly correspond to timestamps?
How can the dashboard be updated automatically in real time?
 
Thank you in advance!
 
Best regards,
Daniela