spark-user mailing list archives

From Sean Owen <so...@cloudera.com>
Subject Re: flume spark streaming receiver host random
Date Fri, 26 Sep 2014 06:09:35 GMT
I think you may be missing a key word here. Are you saying that the machine
has multiple interfaces and it is not using the one you expect or the
receiver is not running on the machine you expect?
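For context, a push-based Flume receiver like the one in the FlumeEventCount example looks roughly like this. This is a minimal sketch: `FlumeUtils.createStream` is the real Spark Streaming Flume API, but the app name and batch interval are assumptions for illustration, and the comments reflect the behavior being asked about (Spark schedules the receiver task on an arbitrary executor in cluster mode, regardless of the host argument):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.flume.FlumeUtils

// Sketch of a FlumeEventCount-style app (not the exact example source).
object FlumeEventCountSketch {
  def main(args: Array[String]): Unit = {
    val Array(host, port) = args  // e.g. 10.1.15.115 60000
    val conf = new SparkConf().setAppName("FlumeEventCount")
    val ssc = new StreamingContext(conf, Seconds(2))  // batch interval assumed

    // The receiver tries to bind to host:port, but in cluster mode Spark
    // places the receiver on whichever executor it schedules the task on.
    // The host argument only works if the receiver lands on that machine.
    val stream = FlumeUtils.createStream(ssc, host, port.toInt)
    stream.count().map(cnt => s"Received $cnt flume events.").print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Running this requires a Spark cluster with the spark-streaming-flume dependency on the classpath, so it is not directly runnable standalone.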
On Sep 26, 2014 3:33 AM, "centerqi hu" <centerqi@gmail.com> wrote:

> Hi all
> My code is as follows:
>
> /usr/local/webserver/sparkhive/bin/spark-submit \
> --class org.apache.spark.examples.streaming.FlumeEventCount \
> --master yarn \
> --deploy-mode cluster \
> --queue online \
> --num-executors 5 \
> --driver-memory 6g \
> --executor-memory 20g \
> --executor-cores 5 target/scala-2.10/simple-project_2.10-1.0.jar \
> 10.1.15.115 60000
>
> However, the receiver does not run on 10.1.15.115; instead it is
> placed on a randomly chosen slave host.
>
> How to solve this problem?
>
> Thanks
>
>
> --
> centerqi@gmail.com|齐忠
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>
