Altinity Sink Connector for ClickHouse

The sink connector sinks data from Kafka into ClickHouse, replicating changes from MySQL, Postgres, and MongoDB sources. The connector has been tested with several Kafka Connect converters.

Features

  • Inserts, updates, and deletes using ReplacingMergeTree/CollapsingMergeTree (see the first sketch after this list).
  • Deduplication logic to de-duplicate records from the Kafka topic (based on the primary key).
  • Exactly-once semantics.
  • Bulk inserts to ClickHouse.
  • Store Kafka metadata.
  • Kafka topic to ClickHouse table mapping, for use cases where a MySQL table must map to a ClickHouse table with a different name.
  • Store raw data in JSON (for auditing purposes).
  • Monitoring: Grafana/Prometheus dashboard to track replication lag.
  • Kafka offset management in ClickHouse (see the second sketch after this list).
  • Increased parallelism (customizable thread pool for JDBC connections).
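
As a rough illustration of the first feature, here is a minimal ClickHouse sketch of how updates and deletes can be modeled with these table engines. The table and column names (products, _version, sign) are hypothetical, not the connector's actual generated DDL.

    -- With ReplacingMergeTree, an update arrives as a new row carrying a
    -- higher version; background merges keep only the latest row per key.
    CREATE TABLE products
    (
        id       UInt64,
        name     String,
        price    Decimal(10, 2),
        _version UInt64    -- hypothetical version column, e.g. derived from the source GTID/LSN
    )
    ENGINE = ReplacingMergeTree(_version)
    ORDER BY id;

    -- FINAL forces deduplication at query time for an exact answer.
    SELECT * FROM products FINAL;

    -- With CollapsingMergeTree, a delete arrives as a copy of the row
    -- with sign = -1, which cancels the original +1 row during merges.
    CREATE TABLE products_collapsing
    (
        id   UInt64,
        name String,
        sign Int8    -- +1 = state row, -1 = cancel row
    )
    ENGINE = CollapsingMergeTree(sign)
    ORDER BY id;

The deduplication feature can be pictured the same way: a record replayed from the Kafka topic inserts an identical (id, _version) pair, which collapses to a single row.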
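
The offset-management feature can be sketched similarly: consumed offsets live in an ordinary ClickHouse table rather than only in Kafka, so they survive restarts and can be inspected with SQL. The table below (kafka_offsets) is a hypothetical illustration; the connector's real bookkeeping table name and schema may differ.

    -- Hypothetical offset bookkeeping table.
    CREATE TABLE kafka_offsets
    (
        kafka_topic     String,
        kafka_partition UInt32,
        kafka_offset    UInt64,
        updated_at      DateTime DEFAULT now()
    )
    ENGINE = ReplacingMergeTree(kafka_offset)
    ORDER BY (kafka_topic, kafka_partition);

    -- Latest committed offset per topic/partition, e.g. for a lag dashboard.
    SELECT kafka_topic, kafka_partition, max(kafka_offset) AS latest_offset
    FROM kafka_offsets
    GROUP BY kafka_topic, kafka_partition;

Because the offsets are ordinary rows, they can be inspected and monitored with the same tools as the replicated data itself.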

Source Databases

  • MySQL (Debezium)
  • PostgreSQL (Debezium) (Testing in progress)

Documentation