spark-user mailing list archives

From: Tien Dat <tphan....@gmail.com>
Subject: Retry option and range resource configuration for Spark job on Mesos
Date: Fri, 06 Jul 2018 14:42:22 GMT
Dear all,

We are running Spark with Mesos as the resource manager. We are interested
in a couple of aspects:

1. Is it possible to configure a specific job with a maximum number of retries?
What I mean here is retry at the job level, NOT spark.task.maxFailures, which
applies to the tasks within a job. (A rough sketch of what we have in mind is
included below, after question 2.)

2. Is it possible to set a job with a range of resources, such as: at least
20 CPU cores and at most 30 CPU cores, and at least 20 GB and at most 40 GB of
memory? (A snippet of the fixed caps we can set today also follows below.)
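
To make question 1 concrete, here is a rough sketch (Scala, using the public
SparkLauncher API) of the kind of job-level retry we have in mind, written as
an external wrapper around the submission. The master URL, jar path, main
class and retry count are only placeholders for illustration:

    import org.apache.spark.launcher.SparkLauncher

    object RetryingSubmit {
      def main(args: Array[String]): Unit = {
        val maxJobRetries = 3     // job-level retry budget, handled by the wrapper, not by Spark or Mesos
        var attempt = 0
        var exitCode = -1
        while (attempt < maxJobRetries && exitCode != 0) {
          attempt += 1
          val process = new SparkLauncher()
            .setMaster("mesos://zk://host:2181/mesos")  // placeholder Mesos master URL
            .setAppResource("/path/to/our-job.jar")     // placeholder application jar
            .setMainClass("com.example.OurJob")         // placeholder main class
            .setConf("spark.task.maxFailures", "4")     // task-level retries only, not the job-level retry we ask about
            .launch()
          exitCode = process.waitFor()                  // wait for this submission attempt to finish
        }
        sys.exit(exitCode)
      }
    }

We would like to know whether Spark or the Mesos dispatcher can handle this
retry for us instead of a wrapper like the above.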
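
For question 2, what we know how to set today are fixed caps rather than a
range, roughly like the snippet below (Scala, illustrative values only):
spark.cores.max is a single upper bound and spark.executor.memory is a fixed
size, which is why we are asking about min/max ranges.

    import org.apache.spark.SparkConf

    // Fixed values as we would set them today; we would like something like
    // "at least 20 / at most 30 cores" and "at least 20 GB / at most 40 GB" instead.
    val conf = new SparkConf()
      .setAppName("our-job")                 // placeholder application name
      .set("spark.cores.max", "30")          // single upper bound on total cores across the cluster
      .set("spark.executor.memory", "4g")    // fixed per-executor memory, not a range
      .set("spark.mesos.coarse", "true")     // coarse-grained Mesos mode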

Thank you in advance.

Best 
Tien Dat



