spark-user mailing list archives

From Akhil Das <>
Subject Re: Processing multiple request in cluster
Date Thu, 25 Sep 2014 07:02:27 GMT
You can try running Spark on Mesos or YARN, since they have much better support for
scheduling across applications.
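
For what it's worth, on a standalone cluster you can also cap the cores each
application takes so several can run side by side. A minimal sketch (the master
URL, class, and jar names are placeholders, not from your setup):

```shell
# Cap this streaming app at 2 cores so the standalone scheduler
# can place other applications on the remaining cores.
spark-submit \
  --master spark://master:7077 \
  --conf spark.cores.max=2 \
  --class com.example.StreamingApp \
  streaming-app.jar
```

The trade-off you describe remains, though: a static spark.cores.max leaves cores
idle when only one app is running, which is where the Mesos/YARN schedulers help.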

Best Regards

On Thu, Sep 25, 2014 at 4:50 AM, Subacini B <> wrote:

> hi All,
> How can I run multiple requests concurrently on the same cluster?
> I have a program using a *Spark Streaming context* that reads *streaming
> data* and writes it to HBase. It works fine; the problem is that when multiple
> requests are submitted to the cluster, only the first request is processed,
> because the entire cluster is used for that request. The rest of the requests
> are left waiting.
> I have set spark.cores.max to 2 or less so that the cluster can process another
> request, but if there is only one request the cluster is not utilized properly.
> Is there any way for the Spark cluster to process streaming requests
> concurrently while effectively utilizing the cluster, something
> like sharkserver?
> Thanks
> Subacini
