spark-user mailing list archives

From Akhil Das <>
Subject Re: running multiple applications at the same time
Date Thu, 26 Jun 2014 12:19:05 GMT
Hi Jamborta,

You can set the following options in your application to limit its
resource usage:

   - spark.cores.max
   - spark.executor.memory

It's better to use Mesos if you want to run multiple applications smoothly
on the same cluster.
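As a minimal sketch, assuming a standalone cluster: you can cap each
application in `conf/spark-defaults.conf` so one app leaves room for others.
The values below (4 cores, 2g per executor) are example numbers, not
recommendations — size them to what your workers actually have free:

```
# Hypothetical example values: cap this application's footprint so a
# second application can be scheduled alongside it.
spark.cores.max        4
spark.executor.memory  2g
```

The same properties can also be set programmatically via
`SparkConf.set("spark.cores.max", "4")` before creating the SparkContext.
Note that `spark.cores.max` is a cluster-wide cap for the application, while
`spark.executor.memory` applies per executor.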

Best Regards

On Thu, Jun 26, 2014 at 5:37 PM, jamborta <> wrote:

> Hi all,
> Not sure if this is a config issue or it's by design, but when I run the
> spark shell and try to submit another application from elsewhere, the
> second application waits for the first to finish and outputs the following:
> "Initial job has not accepted any resources; check your cluster UI to ensure
> that workers are registered and have sufficient memory."
> I have four workers, each with enough spare resources to take on the new
> application.
> thanks,
