Replicate data from MySQL, Postgres and MongoDB to ClickHouse

ClickHouse Sink Connector

The Sink Connector sinks data from Kafka into ClickHouse. The connector has been tested with the following converters (a sample configuration is sketched after this list):

  • JsonConverter

  • AvroConverter (using the Apicurio Schema Registry)
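
For context, deploying a sink connector through the Kafka Connect REST API usually takes a JSON payload like the sketch below. This is a minimal illustration only: the connector class name and the clickhouse.* property keys are assumptions, so treat the deploy/ scripts and documentation in this repository as the authoritative reference.

```json
{
  "name": "clickhouse-sink-connector",
  "config": {
    "connector.class": "com.altinity.clickhouse.sink.connector.ClickHouseSinkConnector",
    "tasks.max": "1",
    "topics": "test_topic",

    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "true",

    "clickhouse.server.url": "clickhouse",
    "clickhouse.server.port": "8123",
    "clickhouse.server.database": "test"
  }
}
```

To exercise the Avro path instead, the converter entries would point at Apicurio's converter, for example setting value.converter to io.apicurio.registry.utils.converter.AvroConverter and adding a value.converter.apicurio.registry.url entry for the registry endpoint.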

Features:

  • Currently the connector supports only insert operations.

  • Deduplication logic to drop duplicate records read from the Kafka topic (a sketch of a matching target table follows this list).

  • Exactly-once semantics.

  • Bulk inserts into ClickHouse.

  • Stores Kafka and source metadata columns alongside the data (binlog position, server id, thread, source ts_ms; see the Kafka Metadata documentation).
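
The deduplication, exactly-once, and metadata bullets above all revolve around how the target table is laid out. The sketch below is illustrative only, assuming a ReplacingMergeTree table with a version column and underscore-prefixed metadata columns; the exact column names and table engine the connector expects are described in the doc/ directory.

```sql
-- Illustrative target table; the column names here are assumptions,
-- not the connector's exact schema.
CREATE TABLE test.employees
(
    emp_no        Int32,
    first_name    String,
    last_name     String,

    -- Kafka / source metadata columns the connector can populate
    _topic        String,
    _partition    UInt64,
    _offset       UInt64,
    _source_ts_ms UInt64,

    -- ReplacingMergeTree keeps the row with the highest _version per
    -- sorting key, so redelivered Kafka records are deduplicated
    _version      UInt64
)
ENGINE = ReplacingMergeTree(_version)
ORDER BY emp_no;
```

ReplacingMergeTree collapses duplicate rows asynchronously during background merges, so reads that must not observe duplicates typically use SELECT ... FINAL until merges catch up.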

Documentation