# Altinity Sink Connector for ClickHouse

Replicates data from MySQL, Postgres, and MongoDB to ClickHouse. The Sink Connector sinks data from Kafka into ClickHouse and has been tested with the following converters (a minimal connector configuration sketch follows the list):
- JsonConverter
- AvroConverter (using the Apicurio Schema Registry)
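As a rough sketch of how the connector might be wired into Kafka Connect with the JsonConverter, the standalone-style properties below illustrate one possible configuration. The framework keys (`name`, `connector.class`, `tasks.max`, `topics`, `key.converter`, `value.converter`) are standard Kafka Connect settings; the connector class name, the `clickhouse.*` keys, and all values are assumptions made for illustration, so check the `deploy` directory and documentation in this repository for the exact property names the connector accepts.

```properties
# Minimal sketch of a sink connector configuration (hypothetical values).
name=clickhouse-sink-connector

# Connector class is assumed for illustration; verify against this repository.
connector.class=com.altinity.clickhouse.sink.connector.ClickHouseSinkConnector
tasks.max=1

# A Debezium-style topic (server.database.table) produced from a MySQL source.
topics=SERVER5432.test.employees

# JsonConverter, one of the tested converters listed above.
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter

# ClickHouse connection settings; these clickhouse.* keys are illustrative
# assumptions, not confirmed property names.
clickhouse.server.url=clickhouse
clickhouse.server.port=8123
clickhouse.server.database=test
clickhouse.server.user=root
clickhouse.server.pass=root

# Map the Kafka topic to a differently named ClickHouse table.
clickhouse.topic2table.map=SERVER5432.test.employees:employees
```

In distributed mode the same keys would be submitted as JSON to the Kafka Connect REST API (`POST /connectors`) instead of a properties file.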
## Features
- Inserts, updates, and deletes via ReplacingMergeTree/CollapsingMergeTree table engines (see the table sketch after this list)
- Deduplication logic to drop duplicate records read from the Kafka topic (based on the primary key)
- Exactly-once semantics
- Bulk inserts into ClickHouse
- Storage of Kafka metadata
- Kafka topic to ClickHouse table mapping, for cases where a MySQL table needs to map to a ClickHouse table with a different name
- Storage of raw data as JSON (for auditing purposes)
- Monitoring: Grafana/Prometheus dashboard to track replication lag
- Kafka offset management in ClickHouse
- Increased parallelism (configurable thread pool for JDBC connections)
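To make the updates/deletes item above concrete, the DDL below is a minimal sketch of a ReplacingMergeTree target table. The `_version` and `is_deleted` columns are assumptions used only for illustration; consult this repository's documentation for the column names and engine setup the connector actually requires.

```sql
-- Minimal sketch of a ReplacingMergeTree target table (illustrative schema).
-- The _version and is_deleted columns are assumptions, not the connector's
-- confirmed column names.
CREATE TABLE test.employees
(
    emp_no     Int32,
    first_name String,
    last_name  String,
    hire_date  Date,
    _version   UInt64,   -- higher value wins when rows with the same key merge
    is_deleted UInt8     -- soft-delete flag that queries or views can filter on
)
ENGINE = ReplacingMergeTree(_version)
ORDER BY emp_no;
```

ReplacingMergeTree deduplicates rows that share a sorting key in the background, keeping the row with the highest `_version`; queries that need fully deduplicated results before merges complete can read with `SELECT ... FINAL`.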
## Source Databases
- MySQL (Debezium) (see the source connector sketch below)
- PostgreSQL (Debezium) (testing in progress)
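In a typical deployment, the change events consumed by the sink are produced by a Debezium source connector. The properties below are a minimal sketch of a Debezium MySQL source; the values are hypothetical, and some property names differ between Debezium 1.x and 2.x (for example, `topic.prefix` replaced `database.server.name`), so check the Debezium documentation for the version you deploy.

```properties
# Minimal sketch of a Debezium MySQL source connector (hypothetical values).
name=mysql-source-connector
connector.class=io.debezium.connector.mysql.MySqlConnector
tasks.max=1

# MySQL connection settings.
database.hostname=mysql
database.port=3306
database.user=debezium
database.password=dbz
database.server.id=5432

# Topic prefix (Debezium 2.x); older releases use database.server.name instead.
topic.prefix=SERVER5432

# Limit capture to the tables being replicated.
database.include.list=test
table.include.list=test.employees

# Schema history storage (Debezium 2.x property names).
schema.history.internal.kafka.bootstrap.servers=kafka:9092
schema.history.internal.kafka.topic=schema-changes.test
```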