Replicate data from MySQL, Postgres and MongoDB to ClickHouse

ClickHouse Sink Connector

The sink connector reads data from Kafka and writes it into ClickHouse. The connector has been tested with the following converters:

Currently the connector supports only insert operations.
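As a sink connector, it is typically registered with a Kafka Connect cluster through the Connect REST API. The sketch below is illustrative only: the connector class name and ClickHouse-specific property names are assumptions for this example, not the connector's documented configuration — consult the project documentation for the exact keys.

```json
{
  "name": "clickhouse-sink",
  "config": {
    "connector.class": "com.example.ClickHouseSinkConnector",
    "tasks.max": "1",
    "topics": "orders",
    "key.converter": "org.apache.kafka.connect.storage.StringConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter.schemas.enable": "false",
    "clickhouse.server.url": "clickhouse-host",
    "clickhouse.server.port": "8123",
    "clickhouse.server.database": "default"
  }
}
```

A configuration like this would be submitted with a POST to the Connect worker's /connectors endpoint (standard Kafka Connect REST API). The key.converter and value.converter properties are standard Kafka Connect settings; the clickhouse.* keys above are hypothetical placeholders.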

Documentation