I am new to Spark Streaming. I tried to run a modified version of the FlumeEventCount.scala example so that it displays every event, by adding this call:
stream.map(e => "Event:header:" + e.event.get(0).toString + "body: " + new String(e.event.getBody.array)).print()
spark-submit runs fine with --master local, and it also displays the count:

Received 18 flume events.
But the output that follows shows only the first 10 events, followed by "...".
So how can I display all the events?
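For what it's worth, from the API docs it looks like DStream.print accepts an optional count in newer Spark versions (the no-argument form shows only the first 10 elements of each batch), so something like the following sketch might be what's needed (the count of 100 here is just an arbitrary guess):

```scala
// Assumption: a Spark version where DStream.print(num: Int) is available.
// print() with no argument prints only the first 10 elements per batch.
stream
  .map(e => "Event: header: " + e.event.get(0).toString +
            " body: " + new String(e.event.getBody.array))
  .print(100) // print up to 100 events per batch instead of the default 10

// Alternatively, to print every event regardless of count,
// collect each RDD to the driver (only safe for small batches):
stream
  .map(e => "Event: header: " + e.event.get(0).toString +
            " body: " + new String(e.event.getBody.array))
  .foreachRDD(rdd => rdd.collect().foreach(println))
```

Is one of these the right approach, or is there a better way?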