spark-user mailing list archives

From Tien Dat <>
Subject Retry option and range resource configuration for Spark job on Mesos
Date Fri, 06 Jul 2018 14:42:22 GMT
Dear all,

We are running Spark with Mesos as the resource manager, and we are interested
in some aspects, such as:

1. Is it possible to configure a specific job with a maximum number of
retries? I mean retry at the job level here, NOT /spark.task.maxFailures/,
which applies to the tasks within a job.

2. Is it possible to set a job with a range of resources, e.g., at least 20
and at most 30 CPU cores, and at least 20 GB and at most 40 GB of memory?
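For reference, the closest settings we are aware of (assuming coarse-grained
Mesos mode; the master URL, class, and jar names below are made-up placeholders)
only express upper bounds or fixed values, not ranges:

```shell
# Hypothetical sketch: known Spark configs only cap or fix resources.
# spark.cores.max        - upper bound on total cores; no "at least N cores" knob
# spark.executor.memory  - fixed per-executor memory, not a min/max range
# spark.task.maxFailures - task-level retries, not the job-level retry we ask about
spark-submit \
  --master mesos://zk://mesos-master:2181/mesos \
  --conf spark.cores.max=30 \
  --conf spark.executor.memory=4g \
  --conf spark.task.maxFailures=4 \
  --class com.example.OurJob our-job.jar
```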

Thank you in advance.

Tien Dat


