spark-user mailing list archives

From lec ssmi <shicheng31...@gmail.com>
Subject Re: Using two WriteStreams in same spark structured streaming job
Date Thu, 05 Nov 2020 01:36:25 GMT
You can use the *foreach* sink to achieve the logic you want.
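For this case, the *foreachBatch* variant is probably the easier fit: each
micro-batch arrives as a plain DataFrame that you can write to both sinks
with the ordinary batch writers, so the one-streaming-sink-per-query limit
no longer applies inside the function. A rough, untested sketch (the broker
address, topic name, and paths below are all placeholders, and a rate
source stands in for your real input):

import org.apache.spark.sql.{DataFrame, SparkSession}

val spark = SparkSession.builder().appName("two-sinks").getOrCreate()

// Placeholder source; the Kafka sink requires a "value" column.
val eventsDf = spark.readStream.format("rate").load()
  .selectExpr("CAST(value AS STRING) AS value")

val query = eventsDf.writeStream
  .foreachBatch { (batchDf: DataFrame, batchId: Long) =>
    batchDf.persist() // avoid recomputing the micro-batch once per sink

    // Sink 1: Kafka, using the batch (non-streaming) Kafka writer.
    batchDf.write
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("topic", "events")
      .save()

    // Sink 2: Delta Lake, appended with the batch Delta writer.
    batchDf.write
      .format("delta")
      .mode("append")
      .save("/tmp/delta/events")

    batchDf.unpersist()
  }
  .option("checkpointLocation", "/tmp/checkpoints/two-sinks")
  .start()

query.awaitTermination()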

act_coder <accthon@gmail.com> wrote on Wednesday, November 4, 2020 at 9:56 PM:

> I have a scenario where I would like to save the same streaming dataframe
> to two different streaming sinks.
>
> I have created a streaming dataframe which I need to send to both a Kafka
> topic and a Delta Lake table.
>
> I thought of using foreachBatch, but it looks like it doesn't support
> multiple streaming sinks.
>
> Also, I tried spark.streams.awaitAnyTermination() with multiple write
> streams, but the second stream is not getting processed! [A two-query
> sketch follows after this quoted message.]
>
> Is there a way to achieve this?
>
>
>
> --
> Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
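For reference, the two-writeStream approach from the quoted message does
work in general: both queries run concurrently as long as each one gets its
own checkpoint location and both are started before anything blocks. A
second stream that never processes data is often a sign of a shared
checkpoint directory, or of awaitTermination() being called on the first
query before the second start() is ever reached. A rough, untested sketch
(same placeholder names as above; note that each query reads the source
independently):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("two-queries").getOrCreate()

// Placeholder source; the Kafka sink requires a "value" column.
val eventsDf = spark.readStream.format("rate").load()
  .selectExpr("CAST(value AS STRING) AS value")

// Query 1: streaming Kafka sink, with its own checkpoint directory.
val kafkaQuery = eventsDf.writeStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("topic", "events")
  .option("checkpointLocation", "/tmp/checkpoints/kafka")
  .start()

// Query 2: streaming Delta sink, with a separate checkpoint directory.
val deltaQuery = eventsDf.writeStream
  .format("delta")
  .outputMode("append")
  .option("checkpointLocation", "/tmp/checkpoints/delta")
  .start("/tmp/delta/events")

// Block until any active query terminates, rather than blocking on one
// query and never reaching the others.
spark.streams.awaitAnyTermination()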
