Trigger in Realm Timeout

Hi,

I have a trigger on Realm that calls an aggregation pipeline in a function.
It is meant to be a batch job & takes 3 to 5 minutes.

However, the function seems to time out at 90 seconds.
Is there any hint I can give the function to allow it more time before timing out?

Thanks,
Jan

Hi @Jan_De_Vylder,

Trigger functions are limited to a maximum execution time of 90 seconds.

What I have done in some processes to overcome this problem is use a "follow-up trigger" to continue the work in batches until completion, using a workflow collection and listening to its documents.

The main idea is I had:

  1. Trigger A, fired by the business requirement (database, scheduled, or authentication trigger), processes whatever it can in about 80 seconds and then writes a document to the workflow collection.
  2. Trigger B, a database trigger on the workflow collection, runs over and over until the task is completed, passing information between runs through the changeEvent documents.

This way I was able to perform a task of 5-10 min.
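
Roughly, trigger B could look like the sketch below. This is only a sketch: the database, collection, and field names are just examples, and you would tune the batch size so a single run stays well under the 90-second limit.

// Trigger B: a database trigger on the "workflow" collection (names are examples).
// Each run processes one batch, then updates the workflow document; that update
// fires this trigger again, so the job continues until it is marked done.
exports = async function (changeEvent) {
  const job = changeEvent.fullDocument;
  if (job.status === "done") return; // final update, nothing left to do

  const mongodb = context.services.get("mongodb-atlas");
  const items = mongodb.db("myDatabase").collection("items");
  const workflow = mongodb.db("myDatabase").collection("workflow");

  // Sized so one batch comfortably finishes inside the 90s limit
  const BATCH_SIZE = 1000;
  const batch = await items.find({ processed: false }).limit(BATCH_SIZE).toArray();

  for (const item of batch) {
    // ... the actual per-item work goes here ...
    await items.updateOne({ _id: item._id }, { $set: { processed: true } });
  }

  // Updating the workflow document re-fires this trigger for the next batch
  await workflow.updateOne(
    { _id: job._id },
    {
      $set: { status: batch.length < BATCH_SIZE ? "done" : "in_progress" },
      $inc: { batchesRun: 1 }
    }
  );
};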

Let me know if you have questions

Pavel

Dear Pavel,

This seems like an excellent idea!
I will work that out and let you know the results.

Thanks,
Jan

Hey @Pavel_Duchovny and @Jan_De_Vylder,

Thanks for sharing how to work around Realm trigger timeouts. I like the idea of a follow-up trigger, but I am struggling to wrap my head around implementing the solution. Would either of you be willing to elaborate with some details?

For example, I was reading the article below on using preimage triggers to cascade delete throughout my database. In the article, all relevant quest documents are deleted via the map method.

  // Await all the deletes so the function does not return before they finish
  await Promise.all(deletedDocument.quests.map(async (quest) => {
    await quests.deleteOne({ _id: quest.questId });
  }));

I am specifically curious what I would need to do if the map call were going to take over 90 seconds and require a follow-up trigger.

Cheers!

If you’re using another cloud service like AWS, you could look at utilising something like Amazon EventBridge (Serverless Event Router – Amazon EventBridge – Amazon Web Services & https://docs.atlas.mongodb.com/triggers/eventbridge/) or Step Functions (Serverless Workflow Orchestration – AWS Step Functions – Amazon Web Services).

We have done something like this to kick-start lots of machine learning containers.

We essentially do the following…

  • A MongoDB scheduled trigger starts
  • A MongoDB Realm function gets all the data the job needs from the MongoDB database
  • The Realm function calls a Lambda function with the job info
  • The Lambda function calls X Step Functions
  • The Step Functions call AWS Batch to do lots of grunt work and spin up containers
  • At the end of the Step Function, another Lambda is called to write the results back to MongoDB

All of this uses just one MongoDB Realm function, which starts the whole chain of events off.
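
For reference, the Realm function that starts the chain could look roughly like the sketch below. It is only a sketch: it assumes the aws-sdk (v2) package has been added as a dependency in the Realm app, that the AWS credentials are stored as Realm values/secrets, and the Lambda and collection names are made up.

exports = async function () {
  // Assumes "aws-sdk" (v2) is added as a dependency in the Realm app
  const AWS = require("aws-sdk");
  const lambda = new AWS.Lambda({
    region: "us-east-1", // illustrative region
    accessKeyId: context.values.get("awsAccessKeyId"),
    secretAccessKey: context.values.get("awsSecretAccessKey")
  });

  // Gather the data the job needs from Atlas
  const jobs = context.services.get("mongodb-atlas").db("myDatabase").collection("jobs");
  const job = await jobs.findOne({ status: "pending" });
  if (!job) return;

  // Hand the job off to AWS; the Lambda kicks off the Step Functions
  const result = await lambda.invoke({
    FunctionName: "startJobStepFunctions", // hypothetical Lambda name
    Payload: JSON.stringify(job)
  }).promise();

  return JSON.parse(String(result.Payload));
};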

Thanks

cheers, @Adam_Holt :clap:

Had to learn how to use and integrate AWS EventBridge (thanks for the motivation), but ultimately this works like a charm!

Good man!

I haven’t done it myself yet, so any tips would be great.

I just knew it was probably the right tool for the job!

Absolutely! Details on how to integrate are below…

Step 1) Follow the documentation for an EventBridge trigger. For the Realm trigger, I would make sure that the AWS Region is consistent between MongoDB and AWS. The documentation is pretty clear on setting up an Event Bus.

Step 2) Establish your EventBridge rule in AWS.

  • Select Event Bus: aws.partner/mongodb.com/stitch.trigger/[your_new_event_bus_id]
  • Create a new rule
  • Define Pattern:
    – Type: Event Pattern
    – Event Matching Pattern: Pre-Defined Pattern by Service
    – Service Provider: Service Partners → MongoDB (the event pattern to the right of your screen should display your account ID as follows: { "account": ["Account ID, same as the EventBridge trigger AWS ID"] })
  • Select Event Bus
    – Custom or partner event bus: aws.partner/mongodb.com/stitch.trigger/[your_new_event_bus_id]
    – Confirm that 'enable the rule on the selected event bus' is checked
  • Select Targets
    – Lambda function
    – yourLambdaFunction (setup described below; you don't need to configure anything else in this section)
Step 3) Lambda Function Setup
  • I found this example helpful for getting my bearings (it should only take 5-10 minutes to learn the basics)
  • After following the tutorial above, make sure you upload a .zip of your files (node_modules, index.js, package-lock.json, and package.json).
  • Update index.js to work the way you need it to. If possible, I would build a trigger function in Realm, make sure it works, and then port it to the EventBridge Lambda; I found it easier to troubleshoot this way. As a baseline, you will need something like the following to interact with MongoDB:
const MongoClient = require("mongodb").MongoClient;
// The URI comes from the "Connect your application" section in Atlas;
// update the password and database name before deploying.
const MONGODB_URI =
  "URI IS FROM 'CONNECT YOUR APPLICATION' SECTION IN ATLAS, YOU NEED TO UPDATE PASSWORD AND YOUR DATABASE NAME";

// Cache the connection so warm Lambda invocations can reuse it
let cachedDb = null;
async function connectToDatabase() {
  if (cachedDb) {
    return cachedDb;
  }
  const client = await MongoClient.connect(MONGODB_URI);
  const db = client.db("databaseName"); // client.db() is synchronous, no await needed
  cachedDb = db;
  return db;
}

// This is important if you are working with ObjectIds (BSON.ObjectId() does not work here)
const ObjectId = require("mongodb").ObjectID;

// Note that ObjectIds do not appear to be passed through to Lambda, so they need to be converted here
exports.handler = async (event, context) => {
  // By default, the callback waits until the runtime event loop is empty before
  // freezing the process and returning the results to the caller. Setting this
  // property to false requests that AWS Lambda freeze the process soon after the
  // callback is invoked, even if there are events in the event loop. Any remaining
  // events in the event loop are processed when the Lambda function is next
  // invoked, if AWS Lambda chooses to reuse the frozen process.
  context.callbackWaitsForEmptyEventLoop = false;

  // Get an instance of our database
  const db = await connectToDatabase();
  const fullDocument = event.detail.fullDocument;

  // Get access to collections the same way
  const myCollection = db.collection("myCollection");
  // Notice that you may need to convert an ID to ObjectId even if it is stored as one in Atlas
  const currentCollectionId = await myCollection.findOne({ "_id": ObjectId(fullDocument.userId) });
  ...
};

If you integrate the MongoDB driver correctly, you should be able to access all the typical functionality you would expect in a Realm function, whether that is find, insert, update, delete, etc.
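
For instance, something like this works inside the handler above (the "results" and "jobs" collections here are made up):

// Illustrative only; uses the db, fullDocument, and ObjectId from the handler above
await db.collection("results").insertOne({ userId: fullDocument.userId, status: "complete" });
await db.collection("jobs").updateOne(
  { _id: ObjectId(fullDocument.userId) },
  { $set: { processedAt: new Date() } }
);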

One last comment: AWS CloudWatch is your friend here. If you get stuck at any point, I would recommend just using console.log at various points in the Lambda function to see what is going wrong. In CloudWatch → Logs → Log Groups you can find the logs for your Lambda function. This helped me a lot.

Let me know if you need me to elaborate further!
