I encounter no issues streaming from Kafka to Spark in 1.1.0. Do you perhaps have a version conflict?
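For reference, a minimal sketch of aligned dependencies in build.sbt (the versions here are illustrative); a mismatch between spark-streaming and spark-streaming-kafka is a common culprit:

    // build.sbt: keep spark-streaming and spark-streaming-kafka on the same version
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-streaming"       % "1.1.0" % "provided",
      "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0"
    )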

Helena

On Nov 13, 2014 12:55 AM, "Jay Vyas" <jayunit100.apache@gmail.com> wrote:
Yup, it is very important that n > 1 for Spark Streaming jobs. If running locally, use local[2].

The thing to remember is that your Spark receiver will take a thread to itself to produce data, so you need another thread to consume it.

Under a cluster manager like YARN or Mesos, the word "thread" is not used and the idea takes a different shape: you need two or more free compute slots, and you can check that they are available by looking at how many free node managers are running, etc.
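A minimal sketch of the local case (the ZooKeeper address, group id, and topic name below are placeholders for your setup); note the local[2] master, which leaves one thread for the Kafka receiver and one for processing:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    object KafkaCheck {
      def main(args: Array[String]): Unit = {
        // local[2]: one thread for the Kafka receiver, one to process the data
        val conf = new SparkConf().setAppName("KafkaCheck").setMaster("local[2]")
        val ssc  = new StreamingContext(conf, Seconds(5))
        val lines = KafkaUtils
          .createStream(ssc, "localhost:2181", "test-group", Map("mytopic" -> 1))
          .map(_._2)  // drop the Kafka message key, keep the value
        lines.print() // with local[1] this would never print anything
        ssc.start()
        ssc.awaitTermination()
      }
    }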

On Nov 12, 2014, at 7:53 PM, "Shao, Saisai" <saisai.shao@intel.com> wrote:

Did you configure the Spark master as local? It should be local[n] with n > 1 for local mode. Besides, there is a Kafka word count example among the Spark Streaming examples that you can try. I've tested it with the latest master, and it works.
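If you want to try it, the bundled example can be launched roughly like this (the ZooKeeper host, consumer group, topic, and thread count are placeholders for your setup):

    ./bin/run-example streaming.KafkaWordCount zk01:2181 my-consumer-group mytopic 1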


Thanks

Jerry


From: Tobias Pfeiffer [mailto:tgp@preferred.jp]
Sent: Thursday, November 13, 2014 8:45 AM
To: Bill Jay
Cc: user@spark.incubator.apache.org
Subject: Re: Spark streaming cannot receive any message from Kafka


Bill,


However, now that I am using Spark 1.1.0, the Spark Streaming job cannot receive any messages from Kafka. I have not made any changes to the code.


Do you see any suspicious messages in the log output?


Tobias