spark-user mailing list archives

From 郭谦 <>
Subject problem for submitting job
Date Sun, 28 Jun 2015 07:56:51 GMT

I'm a junior Spark user from China.

I have a question about submitting a Spark job. I want to submit the job
from code.

In other words: "How can I submit a Spark job to a YARN cluster from within
a Java program, without using spark-submit?"

       I've learnt from the official site

that using the bin/spark-submit script to submit a job to a cluster is easy,

       because the script does a lot of complex work, such as setting up
the classpath with Spark and its dependencies.

If I don't use the script, I have to handle all of that complex work
myself, which is really frustrating.

       I have searched for this problem on Google, but the answers don't
seem to fit my case.

       In Hadoop development, I know that after setting up the
Configuration, the Job, and the resources,

we can submit a Hadoop job with code like this:
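The code snippet appears to have been lost from the archive. For context, a typical programmatic Hadoop MapReduce submission looks roughly like the following sketch, using the standard `org.apache.hadoop.mapreduce.Job` API (class and job names here are placeholders, not from the original message):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class SubmitExample {
    public static void main(String[] args) throws Exception {
        // Configuration picks up core-site.xml / mapred-site.xml etc.
        // from the classpath, so the job targets the configured cluster.
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "example job");   // placeholder name
        job.setJarByClass(SubmitExample.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        // Submit the job and block until it completes.
        // (Job.submit() would submit asynchronously instead.)
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

The point being made is that the whole submission happens inside the program, with no external launcher script.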


It is convenient for users to be able to submit jobs programmatically.

I want to know if there is a plan (maybe for Spark 1.5+?) to provide users
with a variety of ways to submit jobs, as Hadoop does.
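For what it's worth, Spark 1.4.0 already ships a programmatic launcher, `org.apache.spark.launcher.SparkLauncher`, which wraps spark-submit so a Java program can start a job without invoking the script by hand. A rough sketch, assuming YARN cluster mode (the paths and the main class below are placeholders):

```java
import org.apache.spark.launcher.SparkLauncher;

public class LaunchFromJava {
    public static void main(String[] args) throws Exception {
        Process spark = new SparkLauncher()
            .setSparkHome("/path/to/spark")           // placeholder path
            .setAppResource("/path/to/my-app.jar")    // placeholder path
            .setMainClass("com.example.MyApp")        // placeholder class
            .setMaster("yarn-cluster")                // YARN cluster mode in 1.4
            .launch();
        // launch() returns the spark-submit child process;
        // wait for it like any other java.lang.Process.
        int exitCode = spark.waitFor();
        System.exit(exitCode);
    }
}
```

Note that under the hood this still delegates the classpath and dependency setup to spark-submit, so the program does not have to replicate that work itself.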

Similarly for monitoring: in the recent Spark release (1.4.0) we can
already get information about Spark applications through the REST API.
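As a side note, that 1.4.0 monitoring REST API is served under `/api/v1` on the application UI or history server; for example (host, port, and the app id below are assumptions for a default local setup):

```shell
# List applications known to a running Spark UI (default port 4040)
curl http://localhost:4040/api/v1/applications

# List the jobs of one application ("<app-id>" is a placeholder)
curl http://localhost:4040/api/v1/applications/<app-id>/jobs
```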

Thanks & Regards

