spark-user mailing list archives

From Elkhan Dadashov <elkhan8...@gmail.com>
Subject Can i get callback notification on Spark job completion ?
Date Fri, 28 Oct 2016 17:23:55 GMT
Hi,

I know that we can use SparkAppHandle (introduced in SparkLauncher version
>= 1.6) and let the delegator map task stay alive until the Spark job
finishes. But I wonder whether this can be done via a callback notification
instead of polling.
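A sketch of what I have in mind: SparkLauncher.startApplication() accepts SparkAppHandle.Listener instances whose stateChanged() is invoked on state transitions, so in principle no polling loop is needed. The jar path, main class, and master below are placeholders:

```java
import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class LauncherCallbackExample {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/my-spark-job.jar") // placeholder
                .setMainClass("com.example.MySparkJob")      // placeholder
                .setMaster("yarn")
                .startApplication(new SparkAppHandle.Listener() {
                    @Override
                    public void stateChanged(SparkAppHandle h) {
                        // fires on every state transition; final states
                        // include FINISHED, FAILED, and KILLED
                        if (h.getState().isFinal()) {
                            System.out.println("Job " + h.getAppId()
                                    + " ended in state " + h.getState());
                        }
                    }

                    @Override
                    public void infoChanged(SparkAppHandle h) {
                        // fires when e.g. the application ID becomes known
                    }
                });
        // the main thread is free to do other work; the listener
        // is invoked asynchronously by the launcher
    }
}
```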

Can I get a callback notification on Spark job completion?

Something similar to Hadoop, where you can get a callback on MapReduce job
completion - a notification instead of polling.

At job completion, an HTTP request is sent to the
"job.end.notification.url" value. Both the JOB_ID and the JOB_STATUS can
be retrieved from the notification URL.

...
Configuration conf = this.getConf();
// Set the callback parameters
conf.set("job.end.notification.url",
    "https://hadoopi.wordpress.com/api/hadoop/notification/$jobId?status=$jobStatus");
...
// Submit your job in the background
job.submit();

At job completion, the HTTP request sent to the "job.end.notification.url"
value looks like:

https://<callback-url>/api/hadoop/notification/job_1379509275868_0002?status=SUCCEEDED
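On the receiving side, the endpoint only has to parse the job id out of the path and the status out of the query string. A minimal sketch using the JDK's built-in HttpServer (the port and context path are arbitrary choices, not part of Hadoop's contract):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.net.URI;

public class NotificationReceiver {

    // Extract {jobId, jobStatus} from a notification URI such as
    // /api/hadoop/notification/job_1379509275868_0002?status=SUCCEEDED
    static String[] parse(URI uri) {
        String path = uri.getPath();
        String jobId = path.substring(path.lastIndexOf('/') + 1);
        String query = uri.getQuery();
        String status = (query != null && query.startsWith("status="))
                ? query.substring("status=".length())
                : "UNKNOWN";
        return new String[] { jobId, status };
    }

    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
        server.createContext("/api/hadoop/notification/", exchange -> {
            String[] idAndStatus = parse(exchange.getRequestURI());
            System.out.println("Job " + idAndStatus[0]
                    + " completed with status " + idAndStatus[1]);
            byte[] body = "OK".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
    }
}
```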

Reference:
https://hadoopi.wordpress.com/2013/09/18/hadoop-get-a-callback-on-mapreduce-job-completion/


Thanks.
