spark-user mailing list archives

From Nathan Marin <nathan.ma...@teads.tv>
Subject Re: Spark Streaming/Flume display all events
Date Mon, 30 Mar 2015 14:46:41 GMT
Hi,

DStream.print() only prints the first 10 elements of each RDD in the stream. You can call DStream.print(x)
to print the first x elements, but if you don't know the exact count you can call DStream.foreachRDD
and apply a function that displays the content of every RDD.

For example:
stream.foreachRDD(rdd => rdd.collect().foreach(println))
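A slightly fuller sketch of the same idea, written against the spark-streaming-flume API (`printAllEvents` is just an illustrative helper name, and note that collect() brings every element of each batch back to the driver, so this is only safe when batches are small):

```scala
import org.apache.spark.streaming.dstream.DStream
import org.apache.spark.streaming.flume.SparkFlumeEvent

// Print the body of every Flume event in every batch of the stream.
def printAllEvents(stream: DStream[SparkFlumeEvent]): Unit = {
  stream
    .map(e => "Event body: " + new String(e.event.getBody.array))
    .foreachRDD { rdd =>
      // collect() materializes the whole RDD on the driver,
      // so unlike print() nothing is truncated to 10 elements.
      rdd.collect().foreach(println)
    }
}
```

If the batches can be large, rdd.take(n) with a generous n is a safer alternative to collect().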

Regards,
NM

> On 30 Mar 2015, at 16:36, Chong Zhang <chongz.zhang@gmail.com> wrote:
> 
> Hi,
> 
> I am new to Spark/Streaming, and tried to run modified FlumeEventCount.scala example
to display all events by adding the call:
> 
>     stream.map(e => "Event:header:" + e.event.get(0).toString  + "body: " + new String(e.event.getBody.array)).print()
> 
> The spark-submit runs fine with --master local[4], also display the count:
>      Received 18 flume events.
> 
> But the following logs only display 10 events and "..." after that. 
> 
> So how can I display all the events?
> 
> Thanks,
> Chong
> 


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org

