nifi-users mailing list archives

From Mike Thomsen <mikerthom...@gmail.com>
Subject Re: ExecuteStreamCommand to submit a Spark Job
Date Tue, 06 Feb 2018 19:36:37 GMT
As a workaround, you could turn your shell script into a template, use
PutFile to write a copy of it with the attributes from NiFi injected
into the body, and run it with ExecuteProcess, since that processor seems
to work for you.
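A minimal sketch of what that templated script body might look like, assuming the flowfile attributes `job.main.class` and `job.jar` exist (both names are hypothetical); a ReplaceText processor would evaluate the NiFi Expression Language placeholders before PutFile writes the script to disk for ExecuteProcess to run:

```shell
#!/bin/bash
# Template for the script body. ${job.main.class} and ${job.jar} are
# hypothetical NiFi attributes, substituted by ReplaceText (Evaluate Mode:
# Entire text) before this file is written by PutFile. Redirecting output
# keeps spark-submit's streaming logs out of the processor's hands.
spark-submit \
  --master yarn \
  --class ${job.main.class} \
  ${job.jar} > /tmp/spark-job.log 2>&1
```

Once PutFile lands the rendered script (for example in a watched scripts directory), ExecuteProcess can invoke it on a schedule or be replaced by a fetch-and-execute flow.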

On Tue, Feb 6, 2018 at 11:28 AM, Karthik Kothareddy (karthikk) [CONT - Type
2] <karthikk@micron.com> wrote:

> Hello All,
>
>
>
> I’m using NiFi 1.4.0 and was playing with the *ExecuteStreamCommand*
> processor to submit Spark jobs. I wrapped the spark-submit command in a
> shell script and tested it first with *ExecuteProcess*, and everything
> worked as expected. However, for my use case I need to use many
> attributes from the upstream flow, and to trigger some jobs only when
> flowfiles with certain attributes are received. So I switched to the
> *ExecuteStreamCommand* processor and used the same shell script to
> submit the Spark job. The problem is that the job gets hung somewhere
> and never finishes. Stopping the processor didn’t help, and even after
> I killed the job on my Linux box (by its PID) the processor kept
> running. The only way to recover the hung processor was to restart
> NiFi. It doesn’t make sense that this fails while ExecuteProcess works
> perfectly. Has anyone seen similar problems? My configuration is below.
>
>
>
> (Also, the box that I’m running NiFi is an edge node)
>
>
>
> NiFi version - 1.4.0 (not HDP)
>
> Spark version -  2.1.1.2.6.2.0-205 (HDP)
>
> Spark-submit client – YARN
>
>
>
> Thanks
>
> Karthik
>
>
>
