Pattern for safely updating/migrating schema and documents in production

Hi all!

I am currently trying to decide on a strategy for schema migration and document updates in my production database.

My setup: the DB in Atlas is accessed via a Realm App, with the schema defined via JSON Schema. That schema is then used to generate a GraphQL endpoint, with its own GraphQL schema, which is the interface my project uses to access the database.
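For illustration, here is a minimal sketch of the kind of per-collection JSON Schema I mean (the title and field names are made up):

```json
{
  "title": "Task",
  "bsonType": "object",
  "required": ["_id", "name"],
  "properties": {
    "_id": { "bsonType": "objectId" },
    "name": { "bsonType": "string" },
    "status": { "bsonType": "string" }
  }
}
```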

My objective is to create a process that lets me update both the schema and the values of particular fields in a way that is version controlled, transparent enough that I can confirm there was no unwanted corruption or data loss, and easy to roll back in case I make a mistake. Bonus points for simplicity, which would minimize human error.

The Schema Versioning pattern (Building with Patterns: The Schema Versioning Pattern | MongoDB Blog) doesn't seem to be a good fit, since I need the same schema across all documents to be able to describe them in JSON Schema and thus generate the GraphQL schema.

The pattern I considered (rough sketches of steps 1-4 follow after the list):

  1. Back up with mongoexport
  2. realm-cli export and realm-cli import for JSON Schema migrations - version controlled
  3. migrate-mongo for updating documents in bulk - version controlled
  4. mongoexport again after updating, then compare with the backup to confirm that the changes are what I expected
  5. Validate the JSON Schema against the data in the database, using the validator provided by Realm
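To make steps 1 and 2 concrete, here is roughly what I have in mind. The URI, app ID, collection name, and paths are all placeholders, and I am going from the legacy realm-cli, so the exact flags may differ for your CLI version:

```bash
# Step 1 (sketch): back up the data with mongoexport, one file per collection.
mongoexport --uri="$ATLAS_URI" --collection=tasks --out=backup/tasks.json

# Step 2 (sketch): pull the Realm app config (including the JSON Schema)
# into a version-controlled directory, edit the schema, commit, push it back.
realm-cli export --app-id="$REALM_APP_ID" --output=realm-app/
git -C realm-app add -A
git -C realm-app commit -m "schema: describe the change"
realm-cli import --app-id="$REALM_APP_ID" --path=realm-app/ --strategy=merge
```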
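For step 3, a minimal migrate-mongo sketch (collection and field names are made up, and it assumes migrate-mongo is installed with migrate-mongo-config.js pointing at the Atlas cluster):

```bash
# Step 3 (sketch): version-controlled bulk updates with migrate-mongo.
npx migrate-mongo create rename-status-to-state  # generates a timestamped JS file

# The generated file gets an up/down pair, e.g. (illustrative):
#   module.exports = {
#     async up(db) {
#       await db.collection('tasks').updateMany({}, { $rename: { status: 'state' } });
#     },
#     async down(db) {
#       await db.collection('tasks').updateMany({}, { $rename: { state: 'status' } });
#     }
#   };

npx migrate-mongo up       # apply pending migrations
npx migrate-mongo status   # confirm what has been applied
# npx migrate-mongo down   # roll back the most recent migration if needed
```

The down() function is what gives me the rollback story, so I would treat writing a faithful down() as mandatory for every migration.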
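And for step 4, a sketch of the comparison. mongoexport does not guarantee document order, so I would sort both exports by _id before diffing (jq is assumed to be available; file names are illustrative):

```bash
# Step 4 (sketch): re-export after the migration and diff against the backup.
mongoexport --uri="$ATLAS_URI" --collection=tasks --out=after/tasks.json

# mongoexport writes one extended-JSON document per line; slurp, sort, re-emit.
jq -s -c 'sort_by(._id)[]' backup/tasks.json > before.sorted.jsonl
jq -s -c 'sort_by(._id)[]' after/tasks.json  > after.sorted.jsonl

diff before.sorted.jsonl after.sorted.jsonl | less
```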

This seems like a fairly straightforward solution, but since I am not an experienced database developer, I am wondering whether I am missing anything. It also feels a bit over-complicated, and I would assume that the MongoDB Atlas/Realm ecosystem has some easy answer to such a standard problem. I have not found one, though, so I hope that some more experienced members of this community can share their thoughts and ideas.