I am trying to send events to Spark Streaming via Flume.
It works fine up to a point: I run into problems when the event body exceeds 1020 characters.

Basically:
- Up to 1020 characters, everything works.
- From 1021 through 1024 characters, the event is accepted and no exception is thrown, but the channel appears to be corrupted: no further events make it through.
- At 1025 characters and above, I see exceptions: java.io.StreamCorruptedException: invalid stream header: 00000000
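For reference, bodies at these boundary sizes can be generated with a small helper like the one below (a simplified sketch; the class and method names are my own for illustration, and the commented-out `agent.put(...)` line stands in for the actual Embedded Agent send in the attached test):

```java
import java.nio.charset.StandardCharsets;

// Hypothetical helper for probing the body-size boundary described above.
public class BodySizeProbe {

    // Build a body of exactly n characters (filler content is arbitrary).
    static String makeBody(int n) {
        StringBuilder sb = new StringBuilder(n);
        for (int i = 0; i < n; i++) {
            sb.append((char) ('a' + (i % 26)));
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // The three regimes reported above: works / silently corrupts / throws.
        int[] sizes = {1020, 1021, 1024, 1025};
        for (int n : sizes) {
            byte[] body = makeBody(n).getBytes(StandardCharsets.UTF_8);
            System.out.println(n + " chars -> " + body.length + " bytes");
            // agent.put(EventBuilder.withBody(body)); // send through the embedded Flume agent here
        }
    }
}
```

Note that with ASCII filler the character count equals the byte count, so the thresholds above are also byte thresholds for this test.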

I created a test using the Embedded Agent and a local Spark instance to expose the problem.
I am using the CDH5 distribution.

File is attached.

Has anybody ever seen this?
Any suggestions?



Attachment: SparkFlumeStreamTest.java (11K)

Sent from the Apache Spark User List mailing list archive at Nabble.com.