spark-dev mailing list archives

From Akhil Das <ak...@sigmoidanalytics.com>
Subject Re: ReceiverStream SPARK not able to cope up with 20,000 events /sec .
Date Tue, 28 Jul 2015 07:51:18 GMT
You need to find the bottleneck here. It could be your network (if the events are
large), or your producer code may not actually be pushing 20k/s. Once you have
confirmed the producer really emits 20k/s, make sure you can also receive at that
rate (try it without Spark first).
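For the "try it without Spark" step, a minimal sketch like the one below can help isolate the producer from the receiver. It is a hypothetical stand-alone benchmark, not your actual `TetcCustomEventReceiver`: `nextEvent` is a made-up stand-in for your event generator, and the loop simply counts how many events per second can be produced and consumed in-process. If this raw rate is already below 20k/s, the bottleneck is in the generator itself rather than in Spark.

```java
import java.util.concurrent.atomic.AtomicLong;

public class RateCheck {

    // Hypothetical stand-in for your real event generator:
    // builds a comma-separated tuple string for a given message id.
    static String nextEvent(long id) {
        return "event-" + id + ",field1,field2";
    }

    // Produce and discard events for durationMillis, then report events/sec.
    public static long measureEventsPerSecond(long durationMillis) {
        AtomicLong count = new AtomicLong();
        long end = System.currentTimeMillis() + durationMillis;
        while (System.currentTimeMillis() < end) {
            nextEvent(count.get());   // generate one event
            count.incrementAndGet();  // count it
        }
        return count.get() * 1000 / durationMillis;
    }

    public static void main(String[] args) {
        long rate = measureEventsPerSecond(1000);
        System.out.println("Raw generation rate: " + rate + " events/sec");
    }
}
```

Run the same kind of measurement on the machine that hosts the receiver (reading from the socket or file, but not calling `store()`), and compare the two numbers before blaming Spark.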

Thanks
Best Regards

On Sat, Jul 25, 2015 at 3:29 PM, anshu shukla <anshushukla0@gmail.com>
wrote:

> My eventGen is emitting 20,000 events/sec, and I am using store(s1) in the
> receive() method to push data to receiverStream.
>
> But this logic works fine only up to 4,000 events/sec; at higher rates no
> batches are seen being emitted.
>
> CODE: TOPOLOGY -
>
> JavaDStream<String> sourcestream = ssc.receiverStream(
>         new TetcCustomEventReceiver(datafilename, spoutlog,
>                 argumentClass.getScalingFactor(), datasetType));
>
> CODE: TetcCustomEventReceiver -
>
> public void receive(List<String> event) {
>     StringBuffer tuple = new StringBuffer();
>     msgId++;
>     for (String s : event) {
>         tuple.append(s).append(",");
>     }
>     String s1 = MsgIdAddandRemove.addMessageId(tuple.toString(), msgId);
>     store(s1);
> }
>
> --
> Thanks & Regards,
> Anshu Shukla
>
