On 3. May 2017, at 12:36, Moiz S Jinia <email@example.com> wrote:

The kind of program I intend to submit would be one that sets up a StreamExecutionEnvironment, connects to a stream from a Kafka topic, and uses a PatternStream over the Kafka events. I could have the jar for this program readily available in "jobmanager.web.upload.dir" and use the REST API to submit the program with some configuration parameters.

Does that sound like it would work, or am I missing something?

Moiz

On Wed, May 3, 2017 at 3:23 PM, Moiz S Jinia <firstname.lastname@example.org> wrote:

I'm not sure I understand operators. What I need is a Pattern that starts consuming from a Kafka stream, and I need the Patterns to come and go.
Another option that comes to mind is this -
The Patterns I'll need are well known in advance. Only certain parameters, such as the time duration of the within clause and perhaps certain filter conditions of the where clause, need tweaking. So I could pre-deploy the Patterns (or jobs) and start or stop them with parameters.
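The parameterization described above can be sketched in plain Java. This is a hypothetical `PatternConfig` holder, not a Flink API; it only illustrates externalizing the `within` duration and the `where` predicate so that a pre-deployed job could receive them as start parameters:

```java
import java.time.Duration;
import java.util.function.Predicate;

// Hypothetical holder for the tunable parts of a well-known pattern:
// the time window of the "within" clause and the filter condition of
// the "where" clause. A pre-deployed job could build its Pattern from
// such a config, populated from program arguments at start time.
final class PatternConfig<T> {
    final Duration within;     // window for the "within" clause
    final Predicate<T> where;  // filter for the "where" clause

    PatternConfig(Duration within, Predicate<T> where) {
        this.within = within;
        this.where = where;
    }
}

public class PatternConfigDemo {
    public static void main(String[] args) {
        // Example: match events containing "ERROR" within a 10-minute window.
        PatternConfig<String> config = new PatternConfig<>(
                Duration.ofMinutes(10),
                e -> e.contains("ERROR"));

        System.out.println(config.within.toMinutes());
        System.out.println(config.where.test("ERROR: boom"));
        System.out.println(config.where.test("all good"));
    }
}
```

The same jar can then be started several times with different arguments, one running job per active rule.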
Does that sound feasible?

On Wed, May 3, 2017 at 3:15 PM, Aljoscha Krettek <email@example.com> wrote:

What would the pattern be added to? An existing custom operator? The REST interface only allows managing the lifecycle of a job, not modifying its graph structure.

On 3. May 2017, at 11:43, Moiz S Jinia <firstname.lastname@example.org> wrote:

Thanks for the references. Looking at the REST API, would adding new Patterns not work via this?

On Wed, May 3, 2017 at 2:52 PM, Aljoscha Krettek <email@example.com> wrote:

Hi,

For managing a job you can either use the bin/flink command-line tool or the REST API. As for dynamically adding patterns, that's outside the scope of Flink right now. There are, however, some users that have implemented this on top of Flink; see for example RBEA. The basic idea is to use a ConnectedStream where one input is the main input and the other is a control stream that updates the existing patterns.

On 3. May 2017, at 10:02, Moiz S Jinia <firstname.lastname@example.org> wrote:

Is there an API that allows remotely adding, modifying, and cancelling Flink jobs? For example, changing the time window of a deployed Pattern, adding new Patterns, etc. What's the best way to go about this? To the end user, a Pattern would manifest as rules that can be updated at any time.

Moiz
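For context, the kind of job described at the top of the thread would look roughly like this. This is a sketch only, assuming Flink 1.3-era CEP and Kafka connector APIs; the topic name, Kafka properties, and match condition are placeholders, and the flink-cep and flink-connector-kafka-0.10 dependencies are required:

```java
import java.util.List;
import java.util.Map;
import java.util.Properties;

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010;
import org.apache.flink.streaming.util.serialization.SimpleStringSchema;

public class KafkaCepJob {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder
        props.setProperty("group.id", "cep-demo");                // placeholder

        // Consume the main event stream from a Kafka topic.
        DataStream<String> events = env.addSource(
                new FlinkKafkaConsumer010<>("events", new SimpleStringSchema(), props));

        // The window and filter here are the tunable parameters discussed
        // above; they could be read from args when the job is started.
        Pattern<String, ?> alertPattern = Pattern.<String>begin("first")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.contains("ERROR"); // placeholder condition
                    }
                })
                .within(Time.minutes(10));

        PatternStream<String> matches = CEP.pattern(events, alertPattern);

        matches.select(new PatternSelectFunction<String, String>() {
            @Override
            public String select(Map<String, List<String>> match) {
                return "matched: " + match.get("first").get(0);
            }
        }).print();

        env.execute("kafka-cep-job");
    }
}
```

The jar containing this class is what would be uploaded once and then started repeatedly via the REST API with different parameters.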
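The ConnectedStream idea from the last message can be illustrated with a plain-Java simulation. The `RuleEngine` below is hypothetical, not Flink code: it mimics a two-input co-operator (in Flink, a CoFlatMapFunction on ConnectedStreams) where one input carries the main events and the other carries rule updates:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Plain-Java illustration of the control-stream idea: one input carries
// the main events, the other carries rule updates. In Flink, the current
// rule would live in operator state and both inputs would arrive through
// a ConnectedStreams operator.
final class RuleEngine {
    private Predicate<String> currentRule = e -> false; // no rule installed yet
    private final List<String> matches = new ArrayList<>();

    // Control input: replace the active rule.
    void onControl(Predicate<String> newRule) {
        currentRule = newRule;
    }

    // Main input: evaluate the event against the current rule.
    void onEvent(String event) {
        if (currentRule.test(event)) {
            matches.add(event);
        }
    }

    List<String> matches() {
        return matches;
    }
}

public class ControlStreamDemo {
    public static void main(String[] args) {
        RuleEngine engine = new RuleEngine();

        engine.onEvent("ERROR a");                    // no rule yet, ignored
        engine.onControl(e -> e.startsWith("ERROR")); // rule update arrives
        engine.onEvent("ERROR b");                    // matches current rule
        engine.onEvent("INFO c");                     // does not match
        engine.onControl(e -> e.startsWith("WARN"));  // rule swapped at runtime
        engine.onEvent("WARN d");                     // matches the new rule

        System.out.println(engine.matches()); // [ERROR b, WARN d]
    }
}
```

This is the shape of what RBEA builds on top of Flink: the patterns change without restarting the job, because they travel through the control input rather than being baked into the job graph.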