What Is Kafka Connect?
Apache Kafka Connect is an ecosystem of pre-built and maintained Kafka producers (source connectors) and Kafka consumers (sink connectors) for a wide range of other data platforms, such as databases and message brokers. It lets you build data pipelines without having to write and test your own producers and consumers.
There are two distinct types of connectors:
Source connectors:
As the name suggests, these connectors act as a source of data and publish messages onto Kafka topics. For example, you can use the PostgreSQL JDBC source connector to publish a message onto a topic every time a row gets added to a table. This would allow you to set off a chain of events when (for example) someone posts a new message or a new user is created.
Sink connectors:
These connectors consume data from a Kafka topic and upload or insert that data onto a different data platform. For example, when someone makes a trade, you might want that event inserted into a time series database (like Timescale) for record-keeping and analytical purposes.
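To make the two connector types concrete, here is a minimal sketch of how each one could be registered with a running Kafka Connect worker through its REST API (assumed to be listening on localhost:8083). The connector classes come from the Confluent JDBC plugin; the connection URLs, credentials, table names, and topic names are hypothetical placeholders, and the exact configuration keys can vary with the connector version you install.

# Hypothetical JDBC *source* connector: every new row in the "messages" table
# is published as a record on the "postgres-messages" topic.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "messages-source",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
      "connection.url": "jdbc:postgresql://source-db:5432/app",
      "connection.user": "postgres",
      "connection.password": "password",
      "mode": "incrementing",
      "incrementing.column.name": "id",
      "table.whitelist": "messages",
      "topic.prefix": "postgres-"
    }
  }'

# Hypothetical JDBC *sink* connector: records on the "trades" topic are
# inserted into a table of the same name in a Timescale database.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "trades-sink",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "connection.url": "jdbc:postgresql://timescale:5432/tsdb",
      "connection.user": "postgres",
      "connection.password": "password",
      "topics": "trades",
      "insert.mode": "insert",
      "auto.create": "true"
    }
  }'

Note that the JDBC sink expects records that carry a schema (for example, Avro with a schema registry, or JSON with schemas enabled) so it knows which columns to write.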
But the real benefit comes from the fact that you are not limited to a one-to-one relationship between producers and consumers: multiple connectors can act on the same message. Perhaps your messages flow between microservices, but you also want to store them in S3 and a data lake, and forward them to another message broker. The sky's the limit when it comes to building pipelines with Apache Kafka Connect.
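To illustrate that fan-out, the sketch below adds a second, independent sink connector that reads the same hypothetical "trades" topic as the JDBC sink above. Because each connector runs as its own consumer group, both receive every record. The class and configuration keys are those of the Confluent S3 sink plugin, and the connector name, bucket, region, and flush size are placeholder values.

# Hypothetical S3 sink on the same topic: every trade also lands in an S3
# bucket as JSON files, while the JDBC sink keeps writing it to Timescale.
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "trades-s3-sink",
    "config": {
      "connector.class": "io.confluent.connect.s3.S3SinkConnector",
      "topics": "trades",
      "s3.bucket.name": "trades-archive",
      "s3.region": "eu-west-1",
      "storage.class": "io.confluent.connect.s3.storage.S3Storage",
      "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
      "flush.size": "100"
    }
  }'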
When I first started learning about Kafka and Kafka Connect, my biggest grievance was that there were almost no beginner-friendly, end-to-end tutorials on properly setting up Kafka Connect for anything more complicated than a local file sink. Because I had very limited Java experience, the ecosystem was daunting to wrap my head around, which made understanding and installing plugins all the more difficult.
My goal for this tutorial is to clearly explain every step to set up a JDBC sink connector that streams data from a Kafka topic into a Timescale database without any guesswork. If you aren't fond of blog posts and would rather just dive into the code, you can find the full shell script with all the necessary steps and commands here.
Kafka Connect Tutorial
This short tutorial will show you how to set up Kafka and Kafka Connect to stream data from a Kafka topic into a Timescale database.
Read the full article here: https://www.timescale.com/blog/building-a-kafka-data-pipeline-for-time-series-with-kafka-connect-and-timescale/
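As a rough preview of the moving parts, the flow looks something like the sketch below. This is not the tutorial's exact script: the paths assume you are inside a default local Kafka installation directory, the ports are Kafka's defaults, the "trades" topic is a placeholder, and depending on your Kafka version the broker may first need ZooKeeper or KRaft storage initialization.

# Start a Kafka broker, then a Kafka Connect worker in distributed mode.
bin/kafka-server-start.sh config/server.properties &
bin/connect-distributed.sh config/connect-distributed.properties &

# Create the topic the sink connector will read from.
bin/kafka-topics.sh --create --topic trades --bootstrap-server localhost:9092

# Check that the Connect worker is up and see which connector plugins it loaded.
curl http://localhost:8083/connector-plugins

# Finally, register the JDBC sink connector (as sketched earlier) and produce
# records to the "trades" topic to watch rows appear in the Timescale database.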