kafka-users mailing list archives

From Mikael Petterson <mikael.petter...@ericsson.com>
Subject RE: Messages from different sites
Date Mon, 10 Dec 2018 07:46:00 GMT
Hi Ryanne, 

Thanks for your reply.

I will definitely take a look at connectors. 

Actually it is not a lot of data, but it is collected at runtime using an aspect (AspectJ).
The collected data is stored in a simple data structure.
Then, when we receive the void onExecutionFinish() event from TestNG, we write the complete
data structure to a *.json file.

Maybe it is possible to send the data at runtime (if it does not slow down performance) instead
of using a connector; there is a sketch of that below the list. My concerns are mainly:

- Data should be sent to a central point that is accessible from all sites.
- Sending time should be minimal.
- Sent data should be grouped (errors, access, environment) for each execution.
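
A minimal sketch of that runtime-sending idea, assuming a central cluster that is reachable
from all sites; the broker address, topic names, and execution id below are made up:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class RuntimeReporter {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Central cluster reachable from all sites (placeholder address).
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "central-kafka:9092");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                String executionId = "site1-run-42"; // hypothetical per-execution id
                // One topic per data category keeps the groups separate;
                // the execution id as key keeps one execution's records together.
                producer.send(new ProducerRecord<>("test-errors", executionId, "{\"error\": \"...\"}"));
                producer.send(new ProducerRecord<>("test-access", executionId, "{\"access\": \"...\"}"));
                producer.send(new ProducerRecord<>("test-environment", executionId, "{\"env\": \"...\"}"));
            } // close() flushes any buffered records
        }
    }

Keying every record by the execution id puts all records for one execution in the same
partition, so they stay grouped and ordered per run.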

Br,

//mike

-----Original Message-----
From: Ryanne Dolan <ryannedolan@gmail.com> 
Sent: den 7 december 2018 15:42
To: Kafka Users <users@kafka.apache.org>
Subject: Re: Messages from different sites

Mike, a couple things you might take a look at:

Kafka Connect might be useful for writing to your DB. Your
producer->consumer->DB flow seems to fit well with Connect.
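
For reference, a sink from a topic to a relational DB usually needs only configuration, no
custom code. A minimal example, assuming the Confluent-maintained JDBC sink connector (a
separate download, not bundled with Apache Kafka) and made-up topic and connection details:

    name=results-jdbc-sink
    connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
    tasks.max=1
    topics=test-results
    connection.url=jdbc:postgresql://dbhost:5432/results
    connection.user=dbuser
    connection.password=dbpass
    auto.create=true
    insert.mode=insert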

Transactions might be useful for ensuring that your application runs to completion before
committing records. It sounds like your application outputs a JSON file, perhaps after doing
something that takes a long time, and you are worried about sending Kafka messages before the
entire JSON file is written out. With transactions, you might be able to send messages while
the application is running and then close the transaction at the end (instead of writing to
a file).
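
A rough sketch of that with the transactions API (the broker address, transactional id,
topic, and key below are placeholders):

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.KafkaException;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class TransactionalRun {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "central-kafka:9092");
            // A stable transactional id enables all-or-nothing sends across the run.
            props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "test-run-producer-1");
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

            KafkaProducer<String, String> producer = new KafkaProducer<>(props);
            producer.initTransactions();
            try {
                producer.beginTransaction();
                // Send records as the run produces them, instead of buffering to a file...
                producer.send(new ProducerRecord<>("test-results", "site1-run-42", "{...}"));
                // ...and make them all visible atomically when the run finishes.
                producer.commitTransaction();
            } catch (KafkaException e) {
                producer.abortTransaction(); // none of the sent records become visible
            } finally {
                producer.close();
            }
        }
    }

Note that consumers only see those records after the commit if they set
isolation.level=read_committed.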

That said, I've seen scenarios where large files are stored in a data store somewhere, and
then Kafka is used to pass around the location of the files instead of the files themselves.
If this is your scenario, it can be a good model when files are large, since Kafka is sorta
inherently bad at dealing with large messages. For example, this model is common with video
processing pipelines.
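
If you go that route, the consumer side could be as simple as this sketch (the group id,
topic name, and the assumption that the message value is a file location are all made up):

    import java.time.Duration;
    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class LocationConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "central-kafka:9092");
            props.put(ConsumerConfig.GROUP_ID_CONFIG, "db-loader"); // hypothetical group
            props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
            props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(Collections.singletonList("result-locations")); // hypothetical topic
                while (true) {
                    for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofSeconds(1))) {
                        // rec.key() is the execution id, rec.value() the file location;
                        // fetch the file from shared storage and load it into the DB here.
                        System.out.printf("execution %s -> fetch %s%n", rec.key(), rec.value());
                    }
                }
            }
        }
    }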

Ryanne

On Fri, Dec 7, 2018, 5:55 AM Mikael Petterson <mikael.petterson@ericsson.com>
wrote:

> Hi,
> We have one application that produces various data (dataset1, dataset2,
> ...) at various sites when our application is executed. Currently
> dataset1, dataset2, ... are stored in separate *.json data files for
> each execution.
>
> We don't want to transfer the data while we are running the application.
> The reason is that the application might not be able to connect to the
> database, for various reasons.
> We want the producer of the data to notify a consumer (just a central one
> if possible) that it has data and where the data is located. Then the
> consumer gets it. Or is there a better way... The consumer will then push
> the data for one execution to the db.
>
> Our application can run at various sites.
>
>
> Site 1 Producer1 -------> Consumer1 ---> db
> Site 2 Producer1 -------> Consumer1 ---> db
> Site 2 Producer2 -------> Consumer1 ---> db
> Site 3 Producer1 -------> Consumer1 ---> db
> ...
> Site x Producerx -------> Consumer1 ---> db
>
> I would like input from anyone who has used Kafka in this way. Is it
> recommended? Or are there alternatives?
>
> Br
>
> //mike
>