I want to implement the outbox pattern in our microservices using the MongoDB Kafka Connector. In my outbox (a MongoDB collection) I store the record data in these fields: kafka_topic, kafka_key, kafka_value, kafka_header_keys, kafka_header_values.
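For reference, a document in my outbox collection looks something like this (the values are made up; the two header arrays are kept parallel, so kafka_header_keys[i] pairs with kafka_header_values[i]):

```json
{
  "_id": "1",
  "kafka_topic": "orders.created",
  "kafka_key": "order-1001",
  "kafka_value": "{\"orderId\": \"1001\", \"status\": \"CREATED\"}",
  "kafka_header_keys": ["trace-id", "content-type"],
  "kafka_header_values": ["abc123", "application/json"]
}
```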
How should I configure the mongo-kafka connector so that it dynamically chooses the topic from the kafka_topic field, and takes each record's key, value, and headers from the other outbox fields?
I cannot find any settings for this in the configuration reference.
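For context, the closest thing I can see in the source connector config reference is a static topic name built from a prefix plus the namespace, along these lines (database and collection names are mine):

```properties
connector.class=com.mongodb.kafka.connect.MongoSourceConnector
connection.uri=mongodb://mongo:27017
database=appdb
collection=outbox
topic.prefix=outbox
# Records are published to "outbox.appdb.outbox" - with these settings
# the topic cannot come from a field inside the document itself.
```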
I'm also considering forking the mongo-kafka connector repository and extending it with a custom source class implementation, or any other approach you would suggest.
I would greatly appreciate it if you could help me.
I'm also a little unclear on whether MongoDB is the sink or the source in this scenario. Could you add some detail on that as well, so I can make sure I have the correct mental model of the outbox pattern?
I implemented this pattern through the Atlas MongoDB Source Connector with Kafka. Because the mongo-kafka connector on Confluent Cloud does not support mapping individual documents to different topics, I ended up creating a collection for each event type I have.
So collection “event1” has a corresponding topic “event1” in Kafka, and I use the change stream functionality to relay documents to the topics. Once a new document is inserted into the “event1” collection (inside a transaction, hence the outbox pattern), the same document is relayed to the “event1” Kafka topic via the change stream. This gives at-least-once delivery of messages.
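In case it helps anyone, the relevant part of my source connector config for each event collection looks roughly like this (connection URI and names are placeholders; the pipeline restricts the change stream to inserts only):

```properties
connector.class=com.mongodb.kafka.connect.MongoSourceConnector
connection.uri=mongodb://mongo:27017
database=appdb
collection=event1
publish.full.document.only=true
pipeline=[{"$match": {"operationType": "insert"}}]
```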
I have to say, this approach only works when you have a limited set of events, and managing all the collections is not that easy. But at least it is a workaround.
Glad you were able to find a workaround. Just to let you know, in the upcoming 1.4.0 release you will be able to map SinkRecords to collections, and change stream documents to topics, based on the content of the data. I hope that will help out more.
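For anyone landing on this thread later, a rough sketch of what such a mapping could look like (the property names below are my reading of the released docs, so please double-check them against the official config reference for your connector version):

```properties
# Source side: route change stream documents from specific namespaces to topics
topic.namespace.map={"appdb.event1": "event1", "*": "default-topic"}

# Sink side: pick the target collection from a field inside the record value
namespace.mapper=com.mongodb.kafka.connect.sink.namespace.mapping.FieldPathNamespaceMapper
namespace.mapper.value.collection.field=target_collection
```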
Also, a reminder: please post feature requests to our Jira project. We always welcome feedback and want to ensure the connector meets our users' needs.
I hard-forked the mongo-kafka connector project and implemented the outbox pattern.
We have a collection named outbox in our DB with these fields:
# outbox collection fields:
id         string
topic      string
key        string     # the Kafka partitioning key
value      string     # the event value
headers    []Header   # the event headers
emitted_at time.Time
Now we have full control over each event's topic, value, partitioning key, and headers.
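To make the mapping concrete, here is a small Go sketch of how an outbox document like the one above can be turned into the record handed to Kafka (the type and function names are illustrative, not the fork's actual code):

```go
package main

import (
	"fmt"
	"time"
)

// Header mirrors a single Kafka record header.
type Header struct {
	Key   string
	Value string
}

// OutboxDoc mirrors the fields of our outbox collection.
type OutboxDoc struct {
	ID        string
	Topic     string
	Key       string
	Value     string
	Headers   []Header
	EmittedAt time.Time
}

// Record is a minimal stand-in for the source record the connector emits.
type Record struct {
	Topic   string
	Key     []byte
	Value   []byte
	Headers []Header
}

// ToRecord maps an outbox document onto the outgoing record; note that the
// topic comes from the document itself, not from static connector config.
func ToRecord(d OutboxDoc) Record {
	return Record{
		Topic:   d.Topic,
		Key:     []byte(d.Key),
		Value:   []byte(d.Value),
		Headers: d.Headers,
	}
}

func main() {
	doc := OutboxDoc{
		ID:        "1",
		Topic:     "orders.created",
		Key:       "order-1001",
		Value:     `{"orderId":"1001"}`,
		Headers:   []Header{{Key: "trace-id", Value: "abc123"}},
		EmittedAt: time.Now(),
	}
	r := ToRecord(doc)
	fmt.Println(r.Topic, string(r.Key)) // prints: orders.created order-1001
}
```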
I'll push it to our public Git repo and post the link in this topic.
I would be really glad if you could help me maintain it as an outbox pattern implementation for MongoDB and Kafka, or maybe we could merge it into the official connector.