nifi-dev mailing list archives

From Bryan Bende <bbe...@gmail.com>
Subject Re: Suggestion required HTTP and S3
Date Tue, 11 Sep 2018 14:03:46 GMT
I think the existing processors such as HandleHttpRequest can be used.
The body of the POST will become the flow file content, and the
headers will become flow file attributes.
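
For example, if the client sends a custom header to indicate the type
of request (the name "X-Image-Type" here is just an illustration), the
flow file coming out of HandleHttpRequest should carry attributes
along these lines (from memory, so double-check the processor docs):

    http.method               = POST
    http.headers.Content-Type = image/jpeg
    http.headers.X-Image-Type = thumbnail

with the raw image bytes as the flow file content.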

After HandleHttpRequest you can use RouteOnAttribute to make a
decision based on one of the headers (flow file attributes).
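
For example, with the "Route to Property name" strategy you could add
a dynamic property like this (again, "X-Image-Type" is only a
placeholder for whatever header your client actually sends):

    images-to-s3 : ${http.headers.X-Image-Type:equals('thumbnail')}

Flow files that match go out the "images-to-s3" relationship and the
rest go to "unmatched".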

If it meets your criteria, then you send it to a PutS3Object processor
to write the flow file content to S3.
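
A rough sketch of the PutS3Object configuration (the bucket name and
region below are placeholders):

    Bucket     : my-image-bucket
    Object Key : ${filename}
    Region     : us-east-1

I believe ${filename} is the default Object Key; you could also build
the key from one of the HTTP header attributes if you want more
control over the layout in the bucket.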

The success relationship of PutS3Object will have the flow file with
new attributes added for s3.bucket, s3.key, etc., which may be enough
for you to construct the URL.

You could then use ReplaceText to overwrite the content of the flow
file with some dynamic expression like ${host}/${s3.bucket}/${s3.key}
(or whatever the URL is).
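
A sketch of that ReplaceText configuration (the URL format below is an
assumption - path-style vs. virtual-hosted style depends on your
bucket and region, so adjust it to match your setup):

    Replacement Strategy : Always Replace
    Evaluation Mode      : Entire text
    Replacement Value    : https://s3.amazonaws.com/${s3.bucket}/${s3.key}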

Then send to PublishKafka.
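
PublishKafka will send the flow file content (which is now just the
URL) as the message, so the only configuration you really need is
something like (placeholders again):

    Kafka Brokers : broker1:9092
    Topic Name    : image-urls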

Thanks,

Bryan


On Tue, Sep 11, 2018 at 9:55 AM, Rajesh Biswas
<rajesh.biswas@bridgera.com> wrote:
> Dear Experts,
>
> I need your suggestion to design a requirement for my current project.
>
> I would like to listen for an HTTP POST request through NiFi which will have
> an image file attached.
>
> I need to first extract the HTTP header information (to understand the type of
> request), then extract the image file and store it in S3 storage.
>
> Would you please suggest whether I need to write any custom processor or
> whether the existing processors are sufficient.
>
> Also, I would like to pass the file's S3 URL to a Kafka queue; would you
> please suggest the best way to get the S3 URL.
>
>
>
> Thanks and Regards,
>
> Rajesh Biswas | +91 9886433461 |  <http://www.bridgera.com/>
> www.bridgera.com
