kafka-users mailing list archives

From Jan Filipiak <Jan.Filip...@trivago.com>
Subject Re: Kafka Connect Sink Connector for multiple JDBC sinks
Date Sat, 16 Sep 2017 21:07:41 GMT
Hi,

It entirely depends on how you want to serialize. You should be able to get 
everything running on Windows anyhow; nothing except the broker makes 
really extensive use of OS-level support.
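
For what it's worth, if you want to avoid Avro and the Schema Registry 
entirely, stock Kafka Connect runs fine with the plain JSON converter. A 
minimal sketch of the relevant worker properties (standard Connect 
settings; the file path is a placeholder):

  # connect-standalone.properties (sketch - no Schema Registry needed)
  bootstrap.servers=localhost:9092
  key.converter=org.apache.kafka.connect.json.JsonConverter
  value.converter=org.apache.kafka.connect.json.JsonConverter
  # disable embedded schemas if you just want raw JSON payloads
  key.converter.schemas.enable=false
  value.converter.schemas.enable=false
  offset.storage.file.filename=/tmp/connect.offsets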

To answer your initial question: you would simply start multiple sink 
connectors and give each one a different connection string. That should 
do what you want instantly.
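
A sketch of what that looks like with the kafka-connect-jdbc sink - the 
names and URLs below are made up, so point each file at one of your real 
databases:

  # sink-db1.properties
  name=jdbc-sink-db1
  connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
  topics=my_table
  connection.url=jdbc:postgresql://db1.example.com:5432/mydb
  tasks.max=1

  # sink-db2.properties - identical except for the connection string
  name=jdbc-sink-db2
  connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
  topics=my_table
  connection.url=jdbc:postgresql://db2.example.com:5432/mydb
  tasks.max=1

Each sink connector consumes the topic through its own consumer group, 
so every database ends up with its own copy of each record.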

Best Jan

On 16.09.2017 22:51, M. Manna wrote:
> Yes I have. I do need to build and run Schema Registry as a prerequisite,
> isn't that correct? The QuickStart seems to use Avro - and without
> Avro you need your own implementation of a transformer/serdes etc.
>
> I am only asking since my deployment platform is Windows Server 2012, and
> the Confluent package is meant to be run on Linux. I guess there is a lot of
> manual conversion I need to do here?
>
> On 16 September 2017 at 21:43, Ted Yu <yuzhihong@gmail.com> wrote:
>
>> Have you looked at https://github.com/confluentinc/kafka-connect-jdbc ?
>>
>> On Sat, Sep 16, 2017 at 1:39 PM, M. Manna <manmedia@gmail.com> wrote:
>>
>>> Sure. But none of these are available via open-source Kafka (they require
>>> manual coding), correct? Only Confluent seems to provide an
>>> off-the-shelf connector, but Confluent isn't compatible with Windows (yet),
>>> also correct?
>>>
>>>
>>>
>>> On 13 September 2017 at 18:11, Sreejith S <srssreejith@gmail.com> wrote:
>>>
>>>> This is possible. Once you have the records in your put method, it's up to
>>>> your logic how you redirect them to multiple JDBC connections for
>>>> insertion.
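>>>>
>>>> A minimal sketch of that put() logic - the table and field names are
>>>> placeholders for whatever your connector is configured with:
>>>>
>>>> import java.sql.Connection;
>>>> import java.sql.PreparedStatement;
>>>> import java.sql.SQLException;
>>>> import java.util.Collection;
>>>> import java.util.List;
>>>> import org.apache.kafka.connect.data.Struct;
>>>> import org.apache.kafka.connect.errors.ConnectException;
>>>> import org.apache.kafka.connect.sink.SinkRecord;
>>>> import org.apache.kafka.connect.sink.SinkTask;
>>>>
>>>> public class MultiJdbcSinkTask extends SinkTask {
>>>>     private List<Connection> connections; // opened in start(), one per JDBC URL
>>>>
>>>>     @Override
>>>>     public void put(Collection<SinkRecord> records) {
>>>>         for (SinkRecord record : records) {
>>>>             // assumes a Struct value with fields "id" and "value"
>>>>             Struct row = (Struct) record.value();
>>>>             // fan the same row out to every configured database
>>>>             for (Connection conn : connections) {
>>>>                 try (PreparedStatement stmt = conn.prepareStatement(
>>>>                         "INSERT INTO my_table (id, value) VALUES (?, ?)")) {
>>>>                     stmt.setObject(1, row.get("id"));
>>>>                     stmt.setObject(2, row.get("value"));
>>>>                     stmt.executeUpdate();
>>>>                 } catch (SQLException e) {
>>>>                     throw new ConnectException("insert failed", e);
>>>>                 }
>>>>             }
>>>>         }
>>>>     }
>>>>     // start(), stop() and version() omitted for brevity
>>>> }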
>>>>
>>>> In my use case I have implemented many-to-many sources and sinks.
>>>>
>>>> Regards,
>>>> Srijith
>>>>
>>>> On 13-Sep-2017 10:14 pm, "M. Manna" <manmedia@gmail.com> wrote:
>>>>
>>>> Hi,
>>>>
>>>> I need a little help/suggestion if possible. Does anyone know if it's
>>>> possible in Kafka to develop a connector that can sink to multiple JDBC
>>>> URLs for the same topic (i.e. table)?
>>>>
>>>> The examples I can see on Confluent talk about one JDBC URL (a one-to-one
>>>> sink). Would it be possible to achieve a one-to-many?
>>>>
>>>> What I am trying to do is the following:
>>>>
>>>> 1) Write to a topic
>>>> 2) Sink it to multiple DBs (they all will have the same table).
>>>>
>>>> Is this a doable/correct way to use the Connect API?
>>>>
>>>> Kindest Regards,
>>>>

