Replicate data from MySQL, Postgres and MongoDB to ClickHouse

Altinity Sink Connector for ClickHouse

The sink connector sinks data from Kafka into ClickHouse. It has been tested with the commonly used Kafka Connect converter formats.
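
At a glance, a Kafka Connect sink task receives batches of SinkRecord objects from Kafka topics and writes them to the target store. Below is a minimal, hypothetical sketch of that flow using the Kafka Connect SinkTask API and ClickHouse over JDBC; the class name, table, columns and JDBC URL are illustrative assumptions and do not reflect the connector's actual implementation.

```java
// Hypothetical sketch only: ClickHouseSinkTaskSketch, the employees table and the
// JDBC URL are assumptions for illustration, not the connector's real classes/config.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Collection;
import java.util.Map;

import org.apache.kafka.connect.sink.SinkRecord;
import org.apache.kafka.connect.sink.SinkTask;

public class ClickHouseSinkTaskSketch extends SinkTask {
    private Connection connection;

    @Override
    public void start(Map<String, String> props) {
        try {
            // Illustrative JDBC URL; a real connector reads its target from configuration.
            connection = DriverManager.getConnection("jdbc:clickhouse://localhost:8123/default");
        } catch (Exception e) {
            throw new RuntimeException("Could not connect to ClickHouse", e);
        }
    }

    @Override
    public void put(Collection<SinkRecord> records) {
        // Each SinkRecord carries one event consumed from a Kafka topic.
        // A real implementation maps the record schema to table columns dynamically.
        try (PreparedStatement stmt = connection.prepareStatement(
                "INSERT INTO employees (id, name) VALUES (?, ?)")) {
            for (SinkRecord record : records) {
                // Assumes schemaless JSON, where value() is deserialized to a Map.
                Map<String, Object> row = (Map<String, Object>) record.value();
                stmt.setObject(1, row.get("id"));
                stmt.setObject(2, row.get("name"));
                stmt.addBatch();
            }
            stmt.executeBatch(); // write the whole batch to ClickHouse in one round trip
        } catch (Exception e) {
            throw new RuntimeException("Insert into ClickHouse failed", e);
        }
    }

    @Override
    public void stop() {
        try { connection.close(); } catch (Exception ignored) { }
    }

    @Override
    public String version() {
        return "sketch";
    }
}
```

In the real connector, batching, schema mapping, retries and offset handling are driven by configuration rather than hard-coded as above.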

Features

  • Inserts, updates and deletes using ReplacingMergeTree/CollapsingMergeTree (see the sketch after this list).
  • Deduplication logic to de-duplicate records from the Kafka topic (based on the primary key).
  • Exactly-once semantics.
  • Bulk inserts to ClickHouse.
  • Store Kafka metadata.
  • Kafka topic to ClickHouse table mapping, for the use case where a MySQL table needs to map to a ClickHouse table with a different name.
  • Store raw data in JSON (for auditing purposes).
  • Monitoring: Grafana/Prometheus dashboard to monitor lag.
  • Kafka offset management in ClickHouse.
  • Increased parallelism (customizable thread pool for JDBC connections).
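
To illustrate how updates can be modelled on top of ReplacingMergeTree, here is a rough, self-contained sketch against ClickHouse over JDBC. The table name, the _version column and the JDBC URL are assumptions made for the example (it also assumes a ClickHouse JDBC driver is on the classpath); they are not the connector's actual schema or configuration.

```java
// Hypothetical example: shows ReplacingMergeTree version-based "updates",
// not the connector's real table layout or connection handling.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class ReplacingMergeTreeSketch {
    public static void main(String[] args) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:clickhouse://localhost:8123/default");
             Statement stmt = conn.createStatement()) {

            // Each row carries a version column; during merges ClickHouse keeps the
            // row with the highest version for a given sorting key.
            stmt.execute("CREATE TABLE IF NOT EXISTS employees (" +
                         " id UInt32, name String, _version UInt64" +
                         ") ENGINE = ReplacingMergeTree(_version) ORDER BY id");

            // An insert followed by an "update" becomes two versions of the same key.
            stmt.execute("INSERT INTO employees VALUES (1, 'alice', 1)");
            stmt.execute("INSERT INTO employees VALUES (1, 'alice-renamed', 2)");

            // FINAL forces deduplication at query time, so only the latest
            // version of each key is returned.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT name FROM employees FINAL WHERE id = 1")) {
                while (rs.next()) {
                    System.out.println(rs.getString("name")); // prints "alice-renamed"
                }
            }
        }
    }
}
```

Deletes can be modelled in a similar way, for example with an is_deleted flag column, or with CollapsingMergeTree, where a row with sign -1 cancels the matching +1 row at merge time.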

Grafana Dashboard

(Screenshot of the Grafana monitoring dashboard.)

Source Databases

  • MySQL (Debezium)
  • PostgreSQL (Debezium) (Testing in progress)

Documentation