spark-user mailing list archives

From "Shao, Saisai" <saisai.s...@intel.com>
Subject RE: Kafka Consumer in Spark Streaming
Date Wed, 05 Nov 2014 06:14:36 GMT
Hi, would you mind describing your problem a little more specifically?


1.      Does the Kafka broker currently have any data being fed in?

2.      This code will print the lines, but not on the driver side; the code runs on the executor side, so check the logs under the worker directory to see whether there is any printed output there.

3.      Did you see any exceptions when running the app? That would help narrow down the problem.
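To illustrate point 2: if you want to see batch contents in the driver's own stdout for debugging, one option is to collect each RDD back to the driver inside foreachRDD. This is a minimal sketch, assuming the JavaDStream<String> named "statuses" from your snippet and the Spark 1.x Java API; collect() pulls the whole batch to the driver, so only use it for debugging with small batches:

```java
// Minimal debugging sketch, assuming "statuses" is the JavaDStream<String>
// from the original snippet. collect() materializes each batch on the
// driver, so the println below runs in the driver process, not an executor.
statuses.foreachRDD(new Function<JavaRDD<String>, Void>() {
    public Void call(JavaRDD<String> rdd) {
        for (String line : rdd.collect()) {
            System.out.println(line);   // prints in the driver's stdout
        }
        return null;
    }
});
```

By contrast, a System.out.println inside a map (as in your code) is serialized out to the executors, so its output lands in the worker logs, not in the console where you submitted the app.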

Thanks
Jerry

From: Something Something [mailto:mailinglists19@gmail.com]
Sent: Wednesday, November 05, 2014 1:57 PM
To: user@spark.apache.org
Subject: Kafka Consumer in Spark Streaming

I have the following code in my program.  I don't get any errors, but it's not consuming the messages
either.  Shouldn't the following code print each line in the 'call' method?  What am I missing?

Please help.  Thanks.



        JavaStreamingContext ssc = new JavaStreamingContext(sparkConf, new Duration(60 * 1 * 1000));

        // createStream(ssc, zkQuorum, groupId, topics) returns a pair stream of (key, message)
        JavaPairReceiverInputDStream<String, String> tweets = KafkaUtils.createStream(ssc, "<machine>:2181", "1", map);

        JavaDStream<String> statuses = tweets.map(
                new Function<Tuple2<String, String>, String>() {
                    public String call(Tuple2<String, String> tuple) {
                        System.out.println(tuple._2());   // note: this runs on the executors
                        return tuple._2();
                    }
                }
        );