spark-user mailing list archives

From Tathagata Das <tathagata.das1...@gmail.com>
Subject Re: Any patterns for multiplexing the streaming data
Date Fri, 07 Nov 2014 21:25:01 GMT
I am not aware of any obvious existing pattern that does exactly this.
Terms like "subsetting" and "denormalization" sound generic, but this sort
of computation usually has very specific requirements, so it is hard to
point to a design pattern without more information about those requirements.

If you want to feed back to kafka, you can take a look at this pull request

https://github.com/apache/spark/pull/2994
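A common manual approach (before any built-in Kafka sink) is to consume with
KafkaUtils, transform the DStream, and push records back out inside
foreachRDD/foreachPartition, creating the producer on the workers. A minimal
sketch against the Spark 1.1 / Kafka 0.8 APIs of the time; the ZooKeeper and
broker addresses, topic names, consumer group, and the comma-split "schema"
are all placeholders, not anything from this thread:

```scala
import java.util.Properties

import kafka.producer.{KeyedMessage, Producer, ProducerConfig}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaMultiplexer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaMultiplexer")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Consume the source topic ("events" and the addresses are placeholders).
    val lines = KafkaUtils.createStream(
      ssc, "zkhost:2181", "multiplexer-group", Map("events" -> 1)).map(_._2)

    // Example transformation: keep a subset of fields (hypothetical
    // comma-separated schema).
    val transformed = lines.map(_.split(",").take(2).mkString(","))

    // Feed the result back into Kafka as a different topic. The producer is
    // created inside foreachPartition so it is instantiated on the workers
    // rather than serialized from the driver.
    transformed.foreachRDD { rdd =>
      rdd.foreachPartition { partition =>
        val props = new Properties()
        props.put("metadata.broker.list", "broker:9092")
        props.put("serializer.class", "kafka.serializer.StringEncoder")
        val producer = new Producer[String, String](new ProducerConfig(props))
        partition.foreach { record =>
          producer.send(new KeyedMessage[String, String]("events-subset", record))
        }
        producer.close()
      }
    }

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Creating one producer per partition per batch has overhead; reusing a
producer via a connection pool is a common refinement, which is essentially
what the pull request above aims to make unnecessary.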

On Thu, Nov 6, 2014 at 4:15 PM, bdev <buntudev@gmail.com> wrote:

> We are looking at consuming the kafka stream using Spark Streaming and
> transform into various subsets like applying some transformation or
> de-normalizing some fields, etc. and feed it back into Kafka as a different
> topic for downstream consumers.
>
> Wanted to know if there are any existing patterns for achieving this.
>
> Thanks!
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Any-patterns-for-multiplexing-the-streaming-data-tp18303.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
