Replicate data from MySQL, Postgres and MongoDB to ClickHouse

ClickHouse Sink Connector

The Sink Connector sinks data from Kafka into ClickHouse. The connector has been tested with several Kafka Connect converters.

Currently, the connector supports only Insert operations.
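As a rough illustration of how a connector like this is deployed, a Kafka Connect sink is typically registered by POSTing a JSON payload to the Kafka Connect REST API. The sketch below shows the general shape of such a payload; the class name, topic, and all `clickhouse.*` property names are illustrative assumptions, not this connector's verified configuration keys — consult the documentation for the actual settings.

```json
{
  "name": "clickhouse-sink-connector",
  "config": {
    "connector.class": "com.altinity.clickhouse.sink.connector.ClickHouseSinkConnector",
    "tasks.max": "1",

    "topics": "SERVER.database.table",

    "clickhouse.server.url": "clickhouse-host",
    "clickhouse.server.port": "8123",
    "clickhouse.server.database": "test",

    "key.converter": "org.apache.kafka.connect.json.JsonConverter",
    "value.converter": "org.apache.kafka.connect.json.JsonConverter"
  }
}
```

A payload of this form would usually be submitted with `curl -X POST -H "Content-Type: application/json" --data @config.json http://localhost:8083/connectors` against a running Kafka Connect worker.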

Documentation