spark-user mailing list archives

From Subacini B <subac...@gmail.com>
Subject Processing multiple request in cluster
Date Wed, 24 Sep 2014 23:20:18 GMT
hi All,

How can I run multiple requests concurrently on the same cluster?

I have a program using a Spark Streaming context which reads streaming
data and writes it to HBase. It works fine; the problem is that when
multiple requests are submitted to the cluster, only the first request is
processed, because the entire cluster is used for that request. The rest
of the requests stay in waiting mode.

I have set spark.cores.max to 2 or less so that another request can be
processed, but if there is only one request, the cluster is not utilized
properly.
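For reference, a cap like this can be set per application at submit time rather than hard-coded; a minimal sketch, where the class name, jar path, and master URL are placeholders, not values from my setup:

```shell
# Hedged sketch: cap this application's total cores so other apps can run.
# Class name, master URL, and jar path below are illustrative placeholders.
spark-submit \
  --class com.example.StreamingToHBase \
  --master spark://master-host:7077 \
  --conf spark.cores.max=2 \
  streaming-to-hbase.jar
```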

Is there any way the Spark cluster can process streaming requests
concurrently while effectively utilizing the cluster, something like
sharkserver?

Thanks
Subacini
