At the risk of diluting the brand message (i.e. testing Kafka stuff using Clojure), in this post I’m going to introduce some code for extracting a report on the status of Kafka Connect jobs. I’d argue it’s still “on-message”, falling as it does under the observability/metrics umbrella, and since observability is an integral part of testing in production, I think we’re on safe ground.
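The raw material for such a report comes from the Kafka Connect REST API, whose `GET /connectors/{name}/status` endpoint returns the connector’s state and a list of task states. As a flavour of what that looks like in Clojure, here is a minimal sketch that summarises one status response; the function name is my own, and fetching the payload (HTTP client, base URL) is left out:

```clojure
;; Sketch: summarising a Kafka Connect /status payload.
;; The payload shape (:connector with a :state, :tasks each with a :state)
;; follows the Connect REST API; `status-summary` is an illustrative name.
(defn status-summary
  "Reduce a /connectors/{name}/status response to the bits a report needs."
  [{:keys [name connector tasks]}]
  {:name            name
   :connector-state (:state connector)
   ;; tasks in any state other than RUNNING are worth surfacing
   :failed-tasks    (filterv #(not= "RUNNING" (:state %)) tasks)})
```

Mapping this over the result of `GET /connectors` would give a one-line-per-job view of the whole cluster.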
In A Test Helper for JDBC Sinks, one part of the testing process that I glossed over a bit was the line “Generate some example records to load into the input topic”. I said this like it was no big deal, but actually there are a few moving parts that all need to come together for this to work. It’s something I struggled to get to grips with at the beginning of our journey, and I’ve seen other experienced engineers struggle with too. Part of the problem, I think, is that a lot of the Kafka ecosystem is made up of folks using statically typed languages like Scala, Kotlin etc. It does all work with dynamically typed languages like Clojure, but there are just fewer of us around, which makes it all the more important to share what we learn. So here’s a quick guide to generating test data and getting it into Kafka using the test-machine from Jackdaw.
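The test-machine is driven by a sequence of commands, so “loading example records” mostly means turning your records into `:write!` commands and following them with a `:watch` that blocks until they show up in the journal. A minimal sketch, assuming the topic id `:input`, records keyed by `:id`, and the journal keeping consumed messages under `[:topics :input]`:

```clojure
;; Sketch: building test-machine commands for a batch of example records.
;; Command shapes follow jackdaw.test's convention; the topic id, :key-fn,
;; and journal path are illustrative assumptions for this example.
(defn load-commands
  [records]
  (concat
   ;; one :write! per example record, keyed by :id
   (for [r records]
     [:write! :input r {:key-fn :id}])
   ;; then wait until everything we wrote appears in the journal
   [[:watch (fn [journal]
              (= (count records)
                 (count (get-in journal [:topics :input]))))]]))
```

Passing the result to `jackdaw.test/run-test` against a machine built over a Kafka transport does the actual writing; the point here is just that test data generation is ordinary Clojure data manipulation.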
The Confluent JDBC Sink allows you to configure Kafka Connect to take care of moving data reliably from Kafka to a relational database. Most of the usual suspects (e.g. PostgreSQL, MySQL, Oracle) are supported out of the box, and in theory you could sink your data into any database with a JDBC driver.
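Such a sink is configured with a handful of properties submitted to the Connect REST API. As a sketch of the sort of thing involved, here it is as a Clojure map; the connector name, connection URL and topic are illustrative assumptions, while the property keys are the Confluent JDBC sink connector’s own:

```clojure
;; Sketch: a minimal JDBC sink connector config as a Clojure map,
;; ready to be serialised to JSON and POSTed to the Connect REST API.
;; Name, URL and topic below are made up for the example.
(def jdbc-sink-config
  {"name"   "example-jdbc-sink"
   "config" {"connector.class" "io.confluent.connect.jdbc.JdbcSinkConnector"
             "connection.url"  "jdbc:postgresql://localhost:5432/example"
             "topics"          "input"
             "insert.mode"     "upsert"
             "pk.mode"         "record_key"
             "auto.create"     "true"}})
```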
In Testing Event Driven Systems, I introduced the test-machine (a Clojure library for testing Kafka applications) and included a simple example for demonstration purposes. I made the claim that however your system is implemented, as long as its input and output can be represented in Kafka, the test-machine would be an effective tool for testing it. Now that we’ve had some time to put that claim to the…ahem, test, I thought it might be interesting to explore some actual use cases in a bit more detail.