spark-user mailing list archives

From Ofer Eliassaf <>
Subject Dynamic Resource Allocation in a standalone cluster
Date Thu, 27 Oct 2016 08:00:20 GMT

I have a question/problem regarding dynamic resource allocation.
I am using Spark 1.6.2 with the standalone cluster manager.

I have one worker with 2 cores.

I set the following arguments in the spark-defaults.conf file on all
my nodes:

spark.dynamicAllocation.enabled  true
spark.shuffle.service.enabled true
spark.deploy.defaultCores 1
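
For completeness, my understanding is that in standalone mode the external
shuffle service that dynamic allocation depends on runs inside each Worker,
so I restarted the workers after setting the flag. Roughly this (assuming a
standard $SPARK_HOME sbin layout; master host is a placeholder):

```shell
# Restart each worker so that spark.shuffle.service.enabled=true
# in spark-defaults.conf is picked up and the shuffle service starts.
$SPARK_HOME/sbin/stop-slave.sh
$SPARK_HOME/sbin/start-slave.sh spark://<master-host>:7077
```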

I run a sample application with many tasks.

I open port 4040 on the driver and I can verify that the above
configuration is in effect.

My problem is that no matter what I do, my application only gets 1 core,
even though the other cores are available.

Is this normal, or do I have a problem in my configuration?

The behaviour I want is this:
I have many users working with the same Spark cluster.
I want each application to get a fixed number of cores unless the rest of
the cluster is idle, in which case the running applications should get the
total number of cores until a new application arrives...
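
If it helps, this is the kind of configuration I imagined would express that
policy with dynamic allocation (a sketch only; the values are illustrative,
and whether these settings interact this way in standalone mode is exactly
what I am unsure about):

```properties
# spark-defaults.conf sketch (illustrative values, not tested)
spark.dynamicAllocation.enabled               true
spark.shuffle.service.enabled                 true
# guaranteed floor per application:
spark.dynamicAllocation.minExecutors          1
# allow an application to take the whole cluster while it is free:
spark.dynamicAllocation.maxExecutors          2
# release idle executors quickly so a newly arriving
# application can claim their cores:
spark.dynamicAllocation.executorIdleTimeout   30s
```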

Ofer Eliassaf
