spark-user mailing list archives

From Shams ul Haque <sham...@cashcare.in>
Subject Re: Spark streaming app starts processing when I kill that app
Date Tue, 03 May 2016 07:27:24 GMT
No, I set up a cluster of 2 machines, and after submission to the master
the app moves to the slave machine for execution.
I am going to give your suggestion a try by running both on the same
machine.
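
For reference, here is a minimal sketch of what I plan to run locally,
modeled on the first example in the streaming programming guide you linked.
The master is set to local[2] so one thread can run the receiver and
another can process the batches; the socket source, host, port, and app
name are placeholders rather than the actual Kafka job from this thread:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object StarvationCheck {
      def main(args: Array[String]): Unit = {
        // At least 2 local threads: one for the receiver, one for
        // processing. With local[1] the receiver occupies the only slot
        // and batches queue up without ever being processed.
        val conf = new SparkConf()
          .setMaster("local[2]")
          .setAppName("StarvationCheck")
        val ssc = new StreamingContext(conf, Seconds(1))

        // Placeholder source (a socket instead of Kafka) to keep the
        // sketch self-contained.
        val lines = ssc.socketTextStream("localhost", 9999)
        lines.flatMap(_.split(" "))
          .map(word => (word, 1))
          .reduceByKey(_ + _)
          .print()

        ssc.start()
        ssc.awaitTermination()
      }
    }

On the 2-machine cluster, the analogous check is that the submitted app is
actually granted executor cores, e.g. spark-submit --master
spark://<master-host>:7077 --total-executor-cores 2 (host and core count
here are hypothetical).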

Thanks
Shams

On Tue, May 3, 2016 at 12:53 PM, hareesh makam <makamhareesh@gmail.com>
wrote:

> If you are running your master on a single core, it might be an issue of
> starvation.
> Assuming you are running it locally, try setting the master to local[2] or
> higher.
>
> Check the first example at
> https://spark.apache.org/docs/latest/streaming-programming-guide.html
>
> - Hareesh
>
> On 3 May 2016 at 12:35, Shams ul Haque <shamsul@cashcare.in> wrote:
>
>> Hi all,
>>
>> I am facing a strange issue when running a Spark Streaming app.
>>
>> When I submit my app with *spark-submit*, it starts fine and is visible
>> in the Spark UI, but it doesn't process any data coming from Kafka. When
>> I kill the app by pressing Ctrl + C in the terminal, it starts processing
>> all the data received from Kafka and then shuts down.
>>
>> I am trying to figure out why this is happening. Please help if you know
>> anything.
>>
>> Thanks and regards
>> Shams ul Haque
>>
>
>
