spark-user mailing list archives

From Dibyendu Bhattacharya <dibyendu.bhattach...@gmail.com>
Subject Re: Low Level Kafka Consumer for Spark
Date Sat, 17 Jan 2015 05:38:32 GMT
My code handles the Kafka consumer part only, but writing to Kafka should not be a
big challenge; you can easily do that in your driver code.
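For example, a minimal sketch of writing a Spark RDD's records out to Kafka, using the Kafka 0.8-era producer API that matches the versions discussed in this thread (the broker list, topic name, and `KafkaSink` object are hypothetical, not part of the consumer code):

```scala
import java.util.Properties
import kafka.producer.{KeyedMessage, Producer, ProducerConfig}
import org.apache.spark.rdd.RDD

object KafkaSink {
  // Build the 0.8 producer configuration; broker list is an assumption.
  def producerProps(brokers: String): Properties = {
    val props = new Properties()
    props.put("metadata.broker.list", brokers)
    props.put("serializer.class", "kafka.serializer.StringEncoder")
    props
  }

  // Write each partition with its own producer instance, created on the
  // executor, so the (non-serializable) producer never leaves the worker.
  def writeRdd(rdd: RDD[String], brokers: String, topic: String): Unit = {
    rdd.foreachPartition { partition =>
      val producer =
        new Producer[String, String](new ProducerConfig(producerProps(brokers)))
      try {
        partition.foreach { msg =>
          producer.send(new KeyedMessage[String, String](topic, msg))
        }
      } finally {
        producer.close()
      }
    }
  }
}
```

Creating one producer per partition inside `foreachPartition` avoids serializing the producer from the driver and amortizes connection setup over the partition's records.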

dibyendu

On Sat, Jan 17, 2015 at 9:43 AM, Debasish Das <debasish.das83@gmail.com>
wrote:

> Hi Dib,
>
> For our usecase I want my spark job1 to read from hdfs/cache and write to
> kafka queues. Similarly spark job2 should read from kafka queues and write
> to kafka queues.
>
> Is writing to kafka queues from spark job supported in your code ?
>
> Thanks
> Deb
>  On Jan 15, 2015 11:21 PM, "Akhil Das" <akhil@sigmoidanalytics.com> wrote:
>
>> There was a simple example
>> <https://github.com/dibbhatt/kafka-spark-consumer/blob/master/examples/scala/LowLevelKafkaConsumer.scala#L45>
>> which you can run after changing a few configuration lines.
>>
>> Thanks
>> Best Regards
>>
>> On Fri, Jan 16, 2015 at 12:23 PM, Dibyendu Bhattacharya <
>> dibyendu.bhattachary@gmail.com> wrote:
>>
>>> Hi Kidong,
>>>
>>> Just now I tested the Low Level Consumer with Spark 1.2 and I did not
>>> see any issue with the Receiver.store method. It is able to fetch messages
>>> from Kafka.
>>>
>>> Can you cross-check the other configurations in your setup, like the Kafka
>>> broker IP, topic name, ZK host details, consumer id, etc.?
>>>
>>> Dib
>>>
>>> On Fri, Jan 16, 2015 at 11:50 AM, Dibyendu Bhattacharya <
>>> dibyendu.bhattachary@gmail.com> wrote:
>>>
>>>> Hi Kidong,
>>>>
>>>> No, I have not tried it with Spark 1.2 yet. I will try this out and
>>>> let you know how it goes.
>>>>
>>>> By the way, has there been any change to the Receiver store method in
>>>> Spark 1.2?
>>>>
>>>>
>>>>
>>>> Regards,
>>>> Dibyendu
>>>>
>>>>
>>>>
>>>> On Fri, Jan 16, 2015 at 11:25 AM, mykidong <mykidong@gmail.com> wrote:
>>>>
>>>>> Hi Dibyendu,
>>>>>
>>>>> I am using Kafka 0.8.1.1 and Spark 1.2.0.
>>>>> After updating these versions in your pom, I have rebuilt your code.
>>>>> But I have not received any messages from ssc.receiverStream(new
>>>>> KafkaReceiver(_props, i)).
>>>>>
>>>>> I have found that in your code all the messages are retrieved correctly,
>>>>> but _receiver.store(_dataBuffer.iterator()), the method from Spark
>>>>> Streaming's abstract Receiver class, does not seem to work correctly.
>>>>>
>>>>> Have you tried running your Spark Streaming Kafka consumer with Kafka
>>>>> 0.8.1.1 and Spark 1.2.0?
>>>>>
>>>>> - Kidong.
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> View this message in context:
>>>>> http://apache-spark-user-list.1001560.n3.nabble.com/Low-Level-Kafka-Consumer-for-Spark-tp11258p21180.html
>>>>> Sent from the Apache Spark User List mailing list archive at
>>>>> Nabble.com.
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>>>> For additional commands, e-mail: user-help@spark.apache.org
>>>>>
>>>>>
>>>>
>>>
>>
