nifi-users mailing list archives

From: John Fak <johnfa...@gmail.com>
Subject: Re: Possible ?
Date: Tue, 01 Oct 2019 21:04:26 GMT
Thanks.
Really what I want to be able to do is insert data into one master control
database and have it perform functions on a series of other databases,
depending on the flow definitions.

So an example is resetting a user's password on every system. The flow would
read data from a single source table (with appropriate security), and all
the flows would reset the password for that user on, say, 20 databases,
with appropriate controls via a stored proc or something.
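
To make that concrete, here is a minimal sketch of the fan-out pattern in
plain Python (psycopg2 is used purely for illustration; the control table,
columns, stored procedure and DSNs below are all made up). In NiFi the same
shape would be something like QueryDatabaseTable on the control table feeding
per-target SQL processors.

import psycopg2

CONTROL_DSN = "dbname=control host=control-db user=flow_svc"      # hypothetical
TARGET_DSNS = [f"dbname=app{i} host=app-db-{i} user=flow_svc"     # hypothetical
               for i in range(1, 21)]

def pending_reset_requests():
    """Read unprocessed password-reset requests from the master control table."""
    with psycopg2.connect(CONTROL_DSN) as conn, conn.cursor() as cur:
        cur.execute("SELECT request_id, username, new_password_hash "
                    "FROM password_reset_queue WHERE processed = false")
        return cur.fetchall()

def apply_to_target(dsn, username, new_password_hash):
    """Call a stored procedure on one target database to perform the reset."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute("CALL reset_user_password(%s, %s)", (username, new_password_hash))

if __name__ == "__main__":
    for request_id, username, pw_hash in pending_reset_requests():
        for dsn in TARGET_DSNS:
            apply_to_target(dsn, username, pw_hash)
        # marking the request as processed in the control table is omitted here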

One of the things, for example (this is Couchbase): you manually have to
enter the ADMIN password for each connection endpoint, so it becomes a
very manual/tedious process if it can't be made dynamic somehow.
It would be great if you could pass that at runtime somehow, e.g. call a
script to set a parameter and use that here.
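
With the Parameters mechanism Joe mentions below (NiFi 1.10.0), one way to do
that is to have the script push the fresh password into a sensitive entry of a
Parameter Context over the REST API, and have the controller service's password
property reference it as #{couchbase_admin_password}. Below is a rough sketch
using the requests library; the base URL, context id, parameter name, auth
token and fetch_admin_password() are assumptions, and the exact entity shape
should be checked against the REST API docs for your NiFi version
(parameter-context changes go through asynchronous update requests).

import time
import requests

NIFI = "https://nifi.example.com:8443/nifi-api"      # hypothetical base URL
CONTEXT_ID = "c0ffee00-0000-1000-8000-000000000000"  # hypothetical context id

def fetch_admin_password():
    # Stand-in for "call a script" -- e.g. read the secret from a vault.
    return "s3cr3t"

def update_parameter(session, name, value):
    # Fetch the current context so we can send back its revision.
    ctx = session.get(f"{NIFI}/parameter-contexts/{CONTEXT_ID}").json()
    body = {
        "revision": ctx["revision"],
        "id": CONTEXT_ID,
        "component": {
            "id": CONTEXT_ID,
            "parameters": [
                {"parameter": {"name": name, "sensitive": True, "value": value}}
            ],
        },
    }
    # Updates are asynchronous: create an update request, poll it, then delete it.
    req = session.post(f"{NIFI}/parameter-contexts/{CONTEXT_ID}/update-requests",
                       json=body).json()
    req_id = req["request"]["requestId"]
    while not req["request"]["complete"]:
        time.sleep(1)
        req = session.get(f"{NIFI}/parameter-contexts/{CONTEXT_ID}"
                          f"/update-requests/{req_id}").json()
    session.delete(f"{NIFI}/parameter-contexts/{CONTEXT_ID}/update-requests/{req_id}")

if __name__ == "__main__":
    with requests.Session() as s:
        s.headers["Authorization"] = "Bearer <token>"   # however you authenticate
        update_parameter(s, "couchbase_admin_password", fetch_admin_password())

Nothing in the flow itself has to be edited by hand afterwards; every property
that references #{couchbase_admin_password} should pick up the new value (NiFi
restarts the affected components as part of the update).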


On Tue, Oct 1, 2019 at 12:59 PM Joe Witt <joe.witt@gmail.com> wrote:

> If you want it to monitor the databases, it will need to be separate flows.
> But you can isolate the logic of watching each database/table from the
> logic of what you do after.  The flow would look like, say, 20 separate
> sources all feeding into a shared processing/distribution process group.
> You can version-control the sourcing logic and use 20 instances of the
> same version.  Each process group/sourcing flow could have its own
> variables.
>
> In NiFi 1.10.0 variables are effectively deprecated in favor of the more
> powerful mechanism known as 'Parameters'.  These will let you have
> sensitive values (which are protected) and you can basically apply a
> parameter to every field rather than just ones we've enabled EL for.  Very
> powerful stuff.
>
> Thanks
>
> On Tue, Oct 1, 2019 at 12:43 PM Wesley C. Dias de Oliveira <
> wcdoliveira@gmail.com> wrote:
>
>> I'm not so sure about the first one, but you can configure them using the
>> same datasource.
>>
>> Look:
>>
>> [screenshot omitted]
>>
>> Some of them use the same database, across the same connection.
>>
>> On Tue, Oct 1, 2019 at 13:31, John Fak <johnfak75@gmail.com>
>> wrote:
>>
>>> Hi Wes.
>>>
>>> Yes, but will the processor for connecting to, say, Oracle allow a
>>> runtime variable, and how do you pass that?
>>>
>>> I have only ever built one single (basic) flow; at what point does it
>>> get hard to manage with many dataflows?
>>>
>>>
>>> On Tue, Oct 1, 2019 at 11:50 AM Wesley C. Dias de Oliveira <
>>> wcdoliveira@gmail.com> wrote:
>>>
>>>> Hi, John,
>>>>
>>>> I would say you should build 20 dataflows, because they will probably be
>>>> more isolated and testable.
>>>>
>>>> About the password, maybe you can use the password as a variable, huh?
>>>>
>>>> On Tue, Oct 1, 2019 at 12:44, John Fak <johnfak75@gmail.com>
>>>> wrote:
>>>>
>>>>> If you want a flow to replicate to, say, 20 databases and perform a data
>>>>> action:
>>>>>
>>>>> 1) Can this be one flow (source) connected to 20 targets, or should it
>>>>> be 20 separate flows?
>>>>>
>>>>> 2) If you need a password to be dynamic, as it may change and be
>>>>> different across the 20 endpoints, can part of the flow execute a local
>>>>> OS script to get the password and then use it in the connection
>>>>> dynamically, or can it only be statically defined?
>>>>>
>>>>>
>>>>> This would be to send certain data to certain databases based on a
>>>>> flag, but also to allow passwords to change without breaking flows.
>>>>>
>>>>> thx
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>
>>>> --
>>>> Regards,
>>>> Wesley C. Dias de Oliveira.
>>>>
>>>> Linux User nº 576838.
>>>>
>>>
>>
>> --
>> Regards,
>> Wesley C. Dias de Oliveira.
>>
>> Linux User nº 576838.
>>
>
