spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: Kafka Consumer in Spark Streaming
Date Wed, 05 Nov 2014 06:28:09 GMT
This code only expresses a transformation, so it never actually causes any action to run. I think you intend to use foreachRDD.
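For example, something along these lines should actually consume and print the messages. Treat it as an untested sketch: the topic map (Collections.singletonMap("tweets", 1)), the class wrapper, and the ssc.start()/awaitTermination() calls are assumptions on my part, not taken from your snippet, and I've pulled the message out of the Tuple2 since createStream returns a pair stream.

    import java.util.Collections;
    import java.util.Map;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaRDD;
    import org.apache.spark.api.java.function.Function;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;
    import org.apache.spark.streaming.kafka.KafkaUtils;

    import scala.Tuple2;

    public class KafkaPrinter {
        public static void main(String[] args) throws Exception {
            SparkConf sparkConf = new SparkConf().setAppName("KafkaPrinter");
            JavaStreamingContext ssc =
                    new JavaStreamingContext(sparkConf, new Duration(60 * 1000));

            // Placeholder topic map: topic name -> number of receiver threads.
            Map<String, Integer> map = Collections.singletonMap("tweets", 1);

            // createStream returns (key, message) pairs.
            JavaPairReceiverInputDStream<String, String> tweets =
                    KafkaUtils.createStream(ssc, "<machine>:2181", "1", map);

            // map() is lazy: it only describes a transformation.
            JavaDStream<String> statuses = tweets.map(
                    new Function<Tuple2<String, String>, String>() {
                        public String call(Tuple2<String, String> tuple) {
                            return tuple._2();
                        }
                    });

            // foreachRDD is an output operation, so this actually runs each batch.
            statuses.foreachRDD(new Function<JavaRDD<String>, Void>() {
                public Void call(JavaRDD<String> rdd) {
                    for (String status : rdd.collect()) {
                        System.out.println(status);
                    }
                    return null;
                }
            });

            ssc.start();
            ssc.awaitTermination();
        }
    }

statuses.print() would also work as a quick sanity check; the key points are that there must be an output operation and that nothing runs until ssc.start() is called.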

On Wed, Nov 5, 2014 at 5:57 AM, Something Something
<mailinglists19@gmail.com> wrote:
> I have the following code in my program.  I don't get any errors, but it's not
> consuming the messages either.  Shouldn't the following code print each line
> in the 'call' method?  What am I missing?
>
> Please help.  Thanks.
>
>
>
>         JavaStreamingContext ssc =
>             new JavaStreamingContext(sparkConf, new Duration(60 * 1 * 1000));
>
>         JavaPairReceiverInputDStream tweets =
>             KafkaUtils.createStream(ssc, "<machine>:2181", "1", map);
>
>         JavaDStream<String> statuses = tweets.map(
>                 new Function<String, String>() {
>                     public String call(String status) {
>                         System.out.println(status);
>                         return status;
>                     }
>                 }
>         );
>

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

