spark-user mailing list archives

From Ravi Hemnani <>
Subject Using flume to create stream for spark streaming.
Date Mon, 10 Mar 2014 10:26:18 GMT

I am using the following Flume flow:

Flume agent 1 (RabbitMQ -> source, file -> channel, Avro -> sink) sends data
to a slave node of the Spark cluster.
Flume agent 2 runs on that slave node (Avro -> source, file -> channel). For
its sink I have tried avro, hdfs, and file_roll, but I am not able to read the
DStream from any of them. For the Avro sink I point it at the same slave node
on a different port, and I ask the Spark Streaming program to listen on that
slave node and the sink port defined in the agent's conf. Spark Streaming
still gives me no results.
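For reference, here is a minimal sketch of what the second agent's
configuration might look like with an Avro sink. The agent name (`agent2`),
hostnames, and ports are placeholders, not values from my actual setup:

```properties
# Agent 2, running on the Spark slave node (names/addresses are hypothetical)
agent2.sources = avroSrc
agent2.channels = fileCh
agent2.sinks = avroSink

agent2.sources.avroSrc.type = avro
agent2.sources.avroSrc.bind = 0.0.0.0
agent2.sources.avroSrc.port = 41414
agent2.sources.avroSrc.channels = fileCh

agent2.channels.fileCh.type = file

agent2.sinks.avroSink.type = avro
agent2.sinks.avroSink.channel = fileCh
# Must point at the exact host and port where the Spark Flume
# receiver is listening
agent2.sinks.avroSink.hostname = spark-slave-1
agent2.sinks.avroSink.port = 42424
```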

I am running the program with java -jar <jar> on the master of the cluster.

What should be the sink type that should be used on the slave node?
I have been stuck on this for two weeks now and am confused about how to
approach it.
Any help?
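In case it helps anyone answer, this is roughly what my Spark Streaming side
looks like, as a sketch against the FlumeUtils API (the hostname and port are
placeholders; they are meant to match the Avro sink's hostname/port in the
Flume conf, and the receiver has to run on that same host):

```scala
// Minimal sketch of reading from a Flume Avro sink with Spark Streaming.
// "spark-slave-1" and 42424 are hypothetical; substitute your own values.
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

object FlumeStreamExample {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("FlumeStreamExample")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Spark starts an Avro server at this host:port; the upstream Flume
    // Avro sink must be configured to send to this exact address.
    val stream = FlumeUtils.createStream(ssc, "spark-slave-1", 42424)

    // Print the event bodies of each batch as strings.
    stream.map(e => new String(e.event.getBody.array())).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```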
