kafka-users mailing list archives

From Alex Good <A...@makerlabs.co.uk>
Subject Kafka Connect JDBC Array Type
Date Sat, 11 Nov 2017 16:56:41 GMT
Hey All,

I'm using the JDBC sink to materialise a Kafka topic into a Postgres table.
The records in the topic are Avro messages. I'm running into this problem
because one of the fields in the messages is an array:

org.apache.kafka.connect.errors.ConnectException: null (ARRAY) type doesn't
have a mapping to the SQL database column type
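To illustrate, the offending field is an Avro array, e.g. something along
these lines (field and record names here are just placeholders, not my real
schema):

```json
{
  "type": "record",
  "name": "ExampleRecord",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "tags", "type": {"type": "array", "items": "string"}}
  ]
}
```

The sink has no idea what SQL column type to map the `tags` field to, hence
the exception.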

Now, that makes sense to me. I'm wondering if there is any way to get this
data into the database with the built-in JDBC connector, or otherwise. I
have no problem with it just being a string representation of the array,
although I would prefer to use Postgres-native JSON or ARRAY types. Has
anyone had any experience with this? The only real constraint is that I
want upsert semantics, or at least some method that doesn't result in
duplicates in the table.
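For context, my setup is roughly the following (connection details and
topic/table names are placeholders; the property names are the standard
Confluent JDBC sink connector ones):

```properties
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=example-topic
connection.url=jdbc:postgresql://localhost:5432/exampledb
# upsert mode is the constraint I mentioned - I need this to stay
insert.mode=upsert
pk.mode=record_key
pk.fields=id
auto.create=true
```

Everything works with this config until a record containing the array field
arrives.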

