spark-user mailing list archives

From Jay Vyas <jayunit100.apa...@gmail.com>
Subject Re: Spark streaming cannot receive any message from Kafka
Date Thu, 13 Nov 2014 05:54:29 GMT
Yup, very important that n > 1 for Spark Streaming jobs. If running locally, use local[2] (or more).

The thing to remember is that your Spark receiver takes a thread to itself to produce data, so you need at least one more thread to consume it.

In a cluster manager like YARN or Mesos, the word "thread" is no longer used and takes on a different meaning: you need two or more free compute slots, which you can verify by checking how many free node managers are running, etc.
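To make the local[2] point concrete, here is a minimal sketch of a Spark Streaming Kafka consumer in the style of the Spark 1.1-era spark-streaming-kafka API. The ZooKeeper address, consumer group, and topic name are placeholders; the artifact spark-streaming-kafka must be on the classpath.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object KafkaStreamSketch {
  def main(args: Array[String]): Unit = {
    // local[2]: one thread is claimed by the Kafka receiver, the other
    // processes the batches. With local[1] the job starts but no output
    // ever appears, because nothing is left to consume the received data.
    val conf = new SparkConf().setMaster("local[2]").setAppName("KafkaStreamSketch")
    val ssc  = new StreamingContext(conf, Seconds(2))

    val stream = KafkaUtils.createStream(
      ssc,
      "localhost:2181",    // ZooKeeper quorum (placeholder)
      "my-consumer-group", // consumer group id (placeholder)
      Map("test" -> 1)     // topic -> number of receiver threads
    )

    stream.map(_._2).print() // print the message payloads each batch interval
    ssc.start()
    ssc.awaitTermination()
  }
}
```

This is a setup sketch rather than a runnable job on its own: it needs a live Kafka/ZooKeeper instance and the Spark Streaming dependencies to actually receive messages.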

> On Nov 12, 2014, at 7:53 PM, "Shao, Saisai" <saisai.shao@intel.com> wrote:
> 
> Did you configure the Spark master as local? It should be local[n], n > 1, for local mode.
> Besides, there's a Kafka word count example among the Spark Streaming examples; you can try
> that. I've tested it with the latest master and it's OK.
>  
> Thanks
> Jerry
>  
> From: Tobias Pfeiffer [mailto:tgp@preferred.jp] 
> Sent: Thursday, November 13, 2014 8:45 AM
> To: Bill Jay
> Cc: user@spark.incubator.apache.org
> Subject: Re: Spark streaming cannot receive any message from Kafka
>  
> Bill,
>  
> > However, I am currently using Spark 1.1.0, and the Spark Streaming job cannot receive
> > any messages from Kafka. I have not made any change to the code.
>  
> Do you see any suspicious messages in the log output?
>  
> Tobias
>  
