Kafka Connect Source - read/load entire collection upon new connector created

Hi, the documentation clearly explains:
“Data is captured via Change Streams within the MongoDB cluster and published into Kafka topics”

Question: When creating a new connector, is it possible to load the entire collection into Kafka?
Or will it only ever be possible to begin with the changes?

Whether that would be a good or a bad idea, I'll leave undecided.
(But for the data volumes I’m looking at for my current use case it might be feasible…)

I tried to simulate / trigger a ‘full refresh’ manually, but didn’t succeed:

db.getCollection("mycollection").aggregate([{ $out: 'mycollection_copy' }]);
db.getCollection("mycollection").drop();
db.getCollection("mycollection_copy").aggregate([{ $out: 'mycollection' }]);

Both with and without the ‘drop’ in between, the documents are actually copied / replaced by the `$out` stage. But the change stream does not seem to be triggered, so it’s not working for my purpose. (As far as I understand, `$out` writes to a temporary collection and then renames it over the target, and a rename/drop invalidates a change stream instead of emitting per-document insert events.)
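For what it’s worth, a workaround sketch (untested here; the `_touched` field name is made up for illustration): instead of rewriting the collection with `$out`, touch every document with an update. Unlike the rename performed by `$out`, each update does emit a change event that a watching connector can pick up:

```javascript
// Hypothetical "full refresh" trigger: touch every document in place.
// Each modified document produces an update event on the change stream,
// so a source connector watching the collection should see all of them.
// The "_touched" marker field is an assumption, not from the original post.
db.getCollection("mycollection").updateMany(
  {},
  { $set: { _touched: new Date() } }
);
```

Note this changes the documents (it adds/overwrites the marker field) and, depending on the connector’s `change.stream.full.document` setting, you may need `fullDocument: "updateLookup"` semantics to get the whole document rather than just the delta.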

Any tips or ideas?

The `copy.existing` parameter might help you:

https://docs.mongodb.com/kafka-connector/master/kafka-source/#source-connector-configuration-properties
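For reference, a minimal source-connector config sketch with `copy.existing` enabled; the connection URI, database, and collection names are placeholders, not values from the original post:

```properties
name=mongo-source
connector.class=com.mongodb.kafka.connect.MongoSourceConnector
connection.uri=mongodb://localhost:27017
database=mydb
collection=mycollection
# Copy the existing documents into the topic first,
# then continue streaming change events from the change stream.
copy.existing=true
```

With this set, the connector snapshots the current contents of the collection into the Kafka topic before switching over to change-stream events, which is exactly the “load the entire collection on creation” behavior asked about above.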
