spark-user mailing list archives

From Mayur Rustagi <mayur.rust...@gmail.com>
Subject Re: Processing multiple request in cluster
Date Thu, 25 Sep 2014 09:48:55 GMT
There are two problems you may be facing:
1. Your application is taking all of the cluster's resources.
2. Task submission inside your application is not being scheduled properly.

For 1, you can either configure your app to take fewer resources, or use a
Mesos/YARN-style scheduler to dynamically juggle resources between applications.
For 2, you can use the fair scheduler so that tasks within your application are
scheduled more fairly.
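The two knobs above can be sketched concretely. This is a minimal, hedged example assuming a standalone cluster; the property names are Spark's documented configuration keys, while the pool name "streaming" and the file path are illustrative:

```properties
# spark-defaults.conf (set per application)

# Point 1: cap this application's cores so other apps can get executors.
spark.cores.max                  2

# Point 2: schedule jobs inside one application fairly instead of FIFO.
spark.scheduler.mode             FAIR
spark.scheduler.allocation.file  /path/to/fairscheduler.xml
```

```xml
<!-- fairscheduler.xml: defines a pool; the name "streaming" is illustrative -->
<allocations>
  <pool name="streaming">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>1</minShare>
  </pool>
</allocations>
```

Jobs submitted from different threads within the application can then be assigned to a pool with sc.setLocalProperty("spark.scheduler.pool", "streaming") before triggering an action.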

Regards
Mayur

Mayur Rustagi
Ph: +1 (760) 203 3257
http://www.sigmoidanalytics.com
@mayur_rustagi <https://twitter.com/mayur_rustagi>


On Thu, Sep 25, 2014 at 12:32 PM, Akhil Das <akhil@sigmoidanalytics.com>
wrote:

> You can try Spark on Mesos or YARN, since they have a lot more support for
> scheduling and resource sharing.
>
> Thanks
> Best Regards
>
> On Thu, Sep 25, 2014 at 4:50 AM, Subacini B <subacini@gmail.com> wrote:
>
>> hi All,
>>
>> How can I run multiple requests concurrently on the same cluster?
>>
>> I have a program using a Spark Streaming context which reads streaming
>> data and writes it to HBase. It works fine; the problem is that when
>> multiple requests are submitted to the cluster, only the first request is
>> processed, as the entire cluster is used for that request. The rest of the
>> requests stay in waiting mode.
>>
>> I have set spark.cores.max to 2 or less so that the cluster can process
>> another request, but if there is only one request the cluster is not
>> utilized properly.
>>
>> Is there any way that the Spark cluster can process streaming requests
>> concurrently while effectively utilizing the cluster, something like
>> SharkServer?
>>
>> Thanks
>> Subacini
>>
>
>
