spark-user mailing list archives

From Michal Čizmazia <mici...@gmail.com>
Subject Re: Parallelism of Custom receiver in spark
Date Sun, 26 Jul 2015 11:04:43 GMT
#1 See the "Level of Parallelism in Data Receiving" section of the
Spark Streaming programming guide:
https://spark.apache.org/docs/latest/streaming-programming-guide.html#level-of-parallelism-in-data-receiving
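The pattern the guide describes is to start several copies of the receiver and
union the resulting streams. A minimal Scala sketch of that idea follows; the
`MyCustomReceiver` class and the app name are placeholders I made up, and each
receiver permanently occupies one executor core, so the cluster needs more
cores than receivers:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.Receiver

// Hypothetical receiver; substitute your own Receiver subclass.
class MyCustomReceiver extends Receiver[String](StorageLevel.MEMORY_AND_DISK_2) {
  override def onStart(): Unit = { /* launch a thread that calls store(record) */ }
  override def onStop(): Unit = { /* stop that thread */ }
}

val conf = new SparkConf().setAppName("MultiReceiverExample")
val ssc = new StreamingContext(conf, Seconds(1))

// Start several receivers; each one runs on its own executor core.
val numReceivers = 3
val streams = (1 to numReceivers).map(_ => ssc.receiverStream(new MyCustomReceiver))

// Union them into a single DStream before applying transformations.
val unified = ssc.union(streams)
unified.print()
```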

#2 By default, all input data and persisted RDDs generated by DStream
transformations are automatically cleared. Spark Streaming decides when to
clear the data based on the transformations that are used. See
https://spark.apache.org/docs/latest/streaming-programming-guide.html#memory-tuning
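So the data written via store() is not kept forever; it is dropped once no
pending computation needs it. If you want batches retained longer (for
example, to query older RDDs interactively), the StreamingContext exposes a
remember duration. A minimal sketch, with a made-up app name:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}

val conf = new SparkConf().setAppName("RememberExample")
val ssc = new StreamingContext(conf, Seconds(1))

// By default, Spark Streaming clears received blocks and generated RDDs
// as soon as the transformations that use them have run. To keep them
// around longer, extend the remember duration:
ssc.remember(Minutes(5))
```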

Hope this helps.




On 25 July 2015 at 13:43, anshu shukla <anshushukla0@gmail.com> wrote:

> 1 - How do I increase the level of *parallelism of a custom RECEIVER in
> Spark Streaming*?
>
> 2 - Will ssc.receiverStream(/* anything */) *delete the data stored in
> Spark memory via the store() * logic?
>
> --
> Thanks & Regards,
> Anshu Shukla
>
