
Change Streams & Triggers with Node.js Tutorial

Updated: Jul 14, 2020 | Published: Jan 08, 2020

  • MongoDB
  • Atlas
  • JavaScript

By Lauren Schaefer


Sometimes you need to react immediately to changes in your database. Perhaps you want to place an order with a distributor whenever an item's inventory drops below a given threshold. Or perhaps you want to send an email notification whenever the status of an order changes. Regardless of your particular use case, whenever you want to react immediately to changes in your MongoDB database, change streams and triggers are fantastic options.

If you're just joining us in this Quick Start with MongoDB and Node.js series, welcome! We began by walking through how to connect to MongoDB and perform each of the CRUD (create, read, update, and delete) Operations. Then we jumped into more advanced topics like the aggregation framework and transactions. The code we write today will use the same structure as the code we built in the first post in the series, so, if you have any questions about how to get started or how the code is structured, head back to that post.

And, with that, let's dive into change streams and triggers!

Get started with an M0 cluster on Atlas today. It's free forever, and it's the easiest way to try out the steps in this blog series.

#What are Change Streams?

Change streams allow you to receive notifications about changes made to your MongoDB databases and collections. When you use change streams, you can choose to program actions that will be automatically taken whenever a change event occurs.

Change streams utilize the aggregation framework, so you can choose to filter for specific change events or transform the change event documents.

For example, let's say I want to be notified whenever a new listing in the Sydney, Australia market is added to the listingsAndReviews collection. I could create a change stream that monitors the listingsAndReviews collection and use an aggregation pipeline to match on the listings I'm interested in.

Let's take a look at three different ways to implement this change stream.

#Set Up

As with all posts in this MongoDB and Node.js Quick Start series, you'll need to ensure you've completed the prerequisite steps outlined in the Set up section of the first post in this series.

I find it helpful to have a script that will generate sample data when I'm testing change streams. To help you quickly generate sample data, I wrote changeStreamsTestData.js. Download a copy of the file, update the uri constant to reflect your Atlas connection info, and run it by executing node changeStreamsTestData.js. The script will do the following:

  1. Create 3 new listings (Opera House Views, Private room in London, and Beautiful Beach House)
  2. Update 2 of those listings (Opera House Views and Beautiful Beach House)
  3. Create 2 more listings (Italian Villa and Sydney Harbour Home)
  4. Delete a listing (Sydney Harbour Home)

#Create a Change Stream

Now that we're set up, let's explore three different ways to work with a change stream in Node.js.

#Get a Copy of the Node.js Template

To make following along with this blog post easier, I've created a starter template for a Node.js script that accesses an Atlas cluster.

  1. Download a copy of template.js.
  2. Open template.js in your favorite code editor.
  3. Update the Connection URI to point to your Atlas cluster. If you're not sure how to do that, refer back to the first post in this series.
  4. Save the file as changeStreams.js.

You can run this file by executing node changeStreams.js in your shell. At this point, the file simply opens and closes a connection to your Atlas cluster, so no output is expected. If you see DeprecationWarnings, you can ignore them for the purposes of this post.

#Create a Helper Function to Close the Change Stream

Regardless of how we monitor changes in our change stream, we will want to close the change stream after a certain amount of time. Let's create a helper function to do just that.

  1. Paste the following function in changeStreams.js.

    function closeChangeStream(timeInMs = 60000, changeStream) {
        return new Promise((resolve) => {
            setTimeout(() => {
                console.log("Closing the change stream");
                changeStream.close();
                resolve();
            }, timeInMs);
        });
    }

#Monitor Change Stream using EventEmitter's on()

The MongoDB Node.js Driver's ChangeStream class inherits from Node's built-in EventEmitter class. As a result, we can use EventEmitter's on() function to add a listener function that will be called whenever a change occurs in the change stream.

#Create the Function

Let's create a function that will monitor changes in the change stream using EventEmitter's on().

  1. Continuing to work in changeStreams.js, create an asynchronous function named monitorListingsUsingEventEmitter. The function should have the following parameters: a connected MongoClient, a time in ms that indicates how long the change stream should be monitored, and an aggregation pipeline that the change stream will use.

    async function monitorListingsUsingEventEmitter(client, timeInMs = 60000, pipeline = []) {
    }
  2. Now we need to access the collection we will monitor for changes. Add the following code to monitorListingsUsingEventEmitter().

    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
  3. Now we are ready to create our change stream. We can do so by using Collection's watch(). Add the following line beneath the existing code in monitorListingsUsingEventEmitter().

    const changeStream = collection.watch(pipeline);
  4. Once we have our change stream, we can add a listener to it. Let's log each change event in the console. Add the following line beneath the existing code in monitorListingsUsingEventEmitter().

    changeStream.on('change', (next) => {
        console.log(next);
    });
  5. We could choose to leave the change stream open indefinitely. Instead, let's call our helper function to set a timer and close the change stream. Add the following line beneath the existing code in monitorListingsUsingEventEmitter().

    await closeChangeStream(timeInMs, changeStream);
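Assembled from the steps above, the complete function should look roughly like this (the closeChangeStream helper from earlier is repeated here so the sketch is self-contained):

```javascript
// Helper from earlier: close the change stream after timeInMs milliseconds.
function closeChangeStream(timeInMs = 60000, changeStream) {
    return new Promise((resolve) => {
        setTimeout(() => {
            console.log("Closing the change stream");
            changeStream.close();
            resolve();
        }, timeInMs);
    });
}

async function monitorListingsUsingEventEmitter(client, timeInMs = 60000, pipeline = []) {
    const collection = client.db("sample_airbnb").collection("listingsAndReviews");

    // Open the change stream, optionally filtered/transformed by the pipeline.
    const changeStream = collection.watch(pipeline);

    // Log every change event document as it arrives.
    changeStream.on('change', (next) => {
        console.log(next);
    });

    // Close the change stream after the timer expires.
    await closeChangeStream(timeInMs, changeStream);
}
```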

#Call the Function

Now that we've implemented our function, let's call it!

  1. Inside of main() beneath the comment that says Make the appropriate DB calls, call your monitorListingsUsingEventEmitter() function:

    await monitorListingsUsingEventEmitter(client);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 60 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Output similar to the following will be displayed in your first shell where you are running changeStreams.js.

    { _id: { _data: '825DE67A42000000012B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7640004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 1, high_: 1575385666 }, fullDocument: { _id: 5de67a42113ea7de6472e764, name: 'Opera House Views', summary: 'Beautiful apartment with views of the iconic Sydney Opera House', property_type: 'Apartment', bedrooms: 1, bathrooms: 1, beds: 1, address: { market: 'Sydney', country: 'Australia' } }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e764 } }

    { _id: { _data: '825DE67A42000000022B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7650004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 2, high_: 1575385666 }, fullDocument: { _id: 5de67a42113ea7de6472e765, name: 'Private room in London', property_type: 'Apartment', bedrooms: 1, bathroom: 1 }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e765 } }

    { _id: { _data: '825DE67A42000000032B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7660004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 3, high_: 1575385666 }, fullDocument: { _id: 5de67a42113ea7de6472e766, name: 'Beautiful Beach House', summary: 'Enjoy relaxed beach living in this house with a private beach', bedrooms: 4, bathrooms: 2.5, beds: 7, last_review: 2019-12-03T15:07:46.730Z }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e766 } }

    { _id: { _data: '825DE67A42000000042B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7640004' }, operationType: 'update', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 4, high_: 1575385666 }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e764 }, updateDescription: { updatedFields: { beds: 2 }, removedFields: [] } }

    { _id: { _data: '825DE67A42000000052B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7660004' }, operationType: 'update', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 5, high_: 1575385666 }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e766 }, updateDescription: { updatedFields: { address: [Object] }, removedFields: [] } }

    { _id: { _data: '825DE67A42000000062B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7670004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 6, high_: 1575385666 }, fullDocument: { _id: 5de67a42113ea7de6472e767, name: 'Italian Villa', property_type: 'Entire home/apt', bedrooms: 6, bathrooms: 4, address: { market: 'Cinque Terre', country: 'Italy' } }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e767 } }

    { _id: { _data: '825DE67A42000000072B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7680004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 7, high_: 1575385666 }, fullDocument: { _id: 5de67a42113ea7de6472e768, name: 'Sydney Harbour Home', bedrooms: 4, bathrooms: 2.5, address: { market: 'Sydney', country: 'Australia' } }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e768 } }

    { _id: { _data: '825DE67A42000000082B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67A42113EA7DE6472E7680004' }, operationType: 'delete', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 8, high_: 1575385666 }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67a42113ea7de6472e768 } }

    If you run node changeStreamsTestData.js again before the 60 second timer has completed, you will see similar output.

    After 60 seconds, the following will be displayed:

    Closing the change stream

#Call the Function with an Aggregation Pipeline

In some cases, you will not care about all change events that occur in a collection. Instead, you will want to limit what changes you are monitoring. You can use an aggregation pipeline to filter the changes or transform the change stream event documents.

In our case, we only care about new listings in the Sydney, Australia market. Let's create an aggregation pipeline to filter for only those changes in the listingsAndReviews collection.

To learn more about what aggregation pipeline stages can be used with change streams, see the official change streams documentation.

  1. Inside of main() and above your existing call to monitorListingsUsingEventEmitter(), create an aggregation pipeline:

    const pipeline = [
        {
            '$match': {
                'operationType': 'insert',
                'fullDocument.address.country': 'Australia',
                'fullDocument.address.market': 'Sydney'
            }
        }
    ];
  2. Let's use this pipeline to filter the changes in our change stream. Update your existing call to monitorListingsUsingEventEmitter() to only leave the change stream open for 30 seconds and use the pipeline.

    await monitorListingsUsingEventEmitter(client, 30000, pipeline);
  3. Save your file.
  4. Run your script by executing node changeStreams.js in your shell. The change stream will open for 30 seconds.
  5. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Because the change stream is using the pipeline you just created, only documents inserted into the listingsAndReviews collection that are in the Sydney, Australia market will be in the change stream. Output similar to the following will be displayed in your first shell where you are running changeStreams.js.

    { _id: { _data: '825DE67CED000000012B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67CED150EA2DF172344370004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 1, high_: 1575386349 }, fullDocument: { _id: 5de67ced150ea2df17234437, name: 'Opera House Views', summary: 'Beautiful apartment with views of the iconic Sydney Opera House', property_type: 'Apartment', bedrooms: 1, bathrooms: 1, beds: 1, address: { market: 'Sydney', country: 'Australia' } }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67ced150ea2df17234437 } }

    { _id: { _data: '825DE67CEE000000032B022C0100296E5A10046BBC1C6A9CBB4B6E9CA9447925E693EF46645F696400645DE67CEE150EA2DF1723443B0004' }, operationType: 'insert', clusterTime: Timestamp { _bsontype: 'Timestamp', low_: 3, high_: 1575386350 }, fullDocument: { _id: 5de67cee150ea2df1723443b, name: 'Sydney Harbour Home', bedrooms: 4, bathrooms: 2.5, address: { market: 'Sydney', country: 'Australia' } }, ns: { db: 'sample_airbnb', coll: 'listingsAndReviews' }, documentKey: { _id: 5de67cee150ea2df1723443b } }

    After 30 seconds, the following will be displayed:

    Closing the change stream

#Monitor Change Stream using ChangeStream's hasNext()

In the section above, we used EventEmitter's on() to monitor the change stream. Alternatively, we can create a while loop that waits for the next element in the change stream by using hasNext() from the MongoDB Node.js Driver's ChangeStream class.

#Create the Function

Let's create a function that will monitor changes in the change stream using ChangeStream's hasNext().

  1. Continuing to work in changeStreams.js, create an asynchronous function named monitorListingsUsingHasNext. The function should have the following parameters: a connected MongoClient, a time in ms that indicates how long the change stream should be monitored, and an aggregation pipeline that the change stream will use.

    async function monitorListingsUsingHasNext(client, timeInMs = 60000, pipeline = []) {
    }
  2. Now we need to access the collection we will monitor for changes. Add the following code to monitorListingsUsingHasNext().

    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
  3. Now we are ready to create our change stream. We can do so by using Collection's watch(). Add the following line beneath the existing code in monitorListingsUsingHasNext().

    const changeStream = collection.watch(pipeline);
  4. We could choose to leave the change stream open indefinitely. Instead, let's call our helper function that will set a timer and close the change stream. Add the following line beneath the existing code in monitorListingsUsingHasNext().

    closeChangeStream(timeInMs, changeStream);
  5. Now let's create a while loop that will wait for new changes in the change stream. We can use ChangeStream's hasNext() inside of the while loop. hasNext() will return false as soon as the change stream is closed. hasNext() will wait to return true until a new change arrives in the change stream. Add the following line beneath the existing code in monitorListingsUsingHasNext().

    while (await changeStream.hasNext()) {
        console.log(await changeStream.next());
    }
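Assembled from the steps above, the complete function should look roughly like this (a sketch that assumes the closeChangeStream helper is already defined earlier in changeStreams.js):

```javascript
// Sketch assembled from the steps above; assumes closeChangeStream()
// is already defined earlier in changeStreams.js.
async function monitorListingsUsingHasNext(client, timeInMs = 60000, pipeline = []) {
    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
    const changeStream = collection.watch(pipeline);

    // Deliberately not awaited: the timer runs in the background
    // while the loop below blocks on hasNext().
    closeChangeStream(timeInMs, changeStream);

    // hasNext() waits until a new change arrives (resolving true),
    // and resolves false once the change stream is closed.
    while (await changeStream.hasNext()) {
        console.log(await changeStream.next());
    }
}
```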

#Call the Function

Now that we've implemented our function, let's call it!

  1. Inside of main(), replace your existing call to monitorListingsUsingEventEmitter() with a call to your new monitorListingsUsingHasNext():

    await monitorListingsUsingHasNext(client);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 60 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Output similar to what we saw earlier will be displayed in your first shell where you are running changeStreams.js. If you run node changeStreamsTestData.js again before the 60 second timer has completed, you will see similar output again. After 60 seconds, the following will be displayed:

    Closing the change stream

#Call the Function with an Aggregation Pipeline

As we discussed earlier, sometimes you will want to use an aggregation pipeline to filter the changes in your change stream or transform the change stream event documents. Let's pass the aggregation pipeline we created in an earlier section to our new function.

  1. Update your existing call to monitorListingsUsingHasNext() to only leave the change stream open for 30 seconds and use the aggregation pipeline.

    await monitorListingsUsingHasNext(client, 30000, pipeline);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 30 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Because the change stream is using the pipeline you just created, only documents inserted into the listingsAndReviews collection that are in the Sydney, Australia market will be in the change stream. Output similar to what we saw earlier while using a change stream with an aggregation pipeline will be displayed in your first shell where you are running changeStreams.js. After 30 seconds, the following will be displayed:

    Closing the change stream

#Monitor Change Stream using the Stream API

In the previous two sections, we used EventEmitter's on() and ChangeStream's hasNext() to monitor changes. Let's examine a third way to monitor a change stream: using Node's Stream API.

#Load the Stream Module

In order to use the Stream module, we will need to load it.

  1. Continuing to work in changeStreams.js, load the Stream module at the top of the file.

    const stream = require('stream');

#Create the Function

Let's create a function that will monitor changes in the change stream using the Stream API.

  1. Continuing to work in changeStreams.js, create an asynchronous function named monitorListingsUsingStreamAPI. The function should have the following parameters: a connected MongoClient, a time in ms that indicates how long the change stream should be monitored, and an aggregation pipeline that the change stream will use.

    async function monitorListingsUsingStreamAPI(client, timeInMs = 60000, pipeline = []) {
    }
  2. Now we need to access the collection we will monitor for changes. Add the following code to monitorListingsUsingStreamAPI().

    const collection = client.db("sample_airbnb").collection("listingsAndReviews");
  3. Now we are ready to create our change stream. We can do so by using Collection's watch(). Add the following line beneath the existing code in monitorListingsUsingStreamAPI().

    const changeStream = collection.watch(pipeline);
  4. Now we're ready to monitor our change stream. We'll use ChangeStream's pipe() to pull the data out of a readable stream and write it to the console.

    changeStream.pipe(
        new stream.Writable({
            objectMode: true,
            write: function (doc, _, cb) {
                console.log(doc);
                cb();
            }
        })
    );
  5. We could choose to leave the change stream open indefinitely. Instead, let's call our helper function that will set a timer and close the change stream. Add the following line beneath the existing code in monitorListingsUsingStreamAPI().

    await closeChangeStream(timeInMs, changeStream);

#Call the Function

Now that we've implemented our function, let's call it!

  1. Inside of main(), replace your existing call to monitorListingsUsingHasNext() with a call to your new monitorListingsUsingStreamAPI():

    await monitorListingsUsingStreamAPI(client);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 60 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Output similar to what we saw earlier will be displayed in your first shell where you are running changeStreams.js. If you run node changeStreamsTestData.js again before the 60 second timer has completed, you will see similar output again. After 60 seconds, the following will be displayed:

    Closing the change stream

#Call the Function with an Aggregation Pipeline

As we discussed earlier, sometimes you will want to use an aggregation pipeline to filter the changes in your change stream or transform the change stream event documents. Let's pass the aggregation pipeline we created in an earlier section to our new function.

  1. Update your existing call to monitorListingsUsingStreamAPI() to only leave the change stream open for 30 seconds and use the aggregation pipeline.

    await monitorListingsUsingStreamAPI(client, 30000, pipeline);
  2. Save your file.
  3. Run your script by executing node changeStreams.js in your shell. The change stream will open for 30 seconds.
  4. Create and update sample data by executing node changeStreamsTestData.js in a new shell. Because the change stream is using the pipeline you just created, only documents inserted into the listingsAndReviews collection that are in the Sydney, Australia market will be in the change stream. Output similar to what we saw earlier while using a change stream with an aggregation pipeline will be displayed in your first shell where you are running changeStreams.js. After 30 seconds, the following will be displayed:

    Closing the change stream

#Resume a Change Stream

At some point, your application will likely lose the connection to the change stream. Perhaps a network error will occur and a connection between the application and the database will be dropped. Or perhaps your application will crash and need to be restarted (but you're a 10x developer and that would never happen to you, right?).

In those cases, you may want to resume the change stream where you previously left off so you don't lose any of the change events.

Each change stream event document contains a resume token. The Node.js driver automatically stores the resume token in the _id of the change event document.

The application can pass the resume token when creating a new change stream. The change stream will include all events that happened after the event associated with the given resume token.

The MongoDB Node.js driver will automatically attempt to reestablish connections in the event of transient network errors or elections. In those cases, the driver will use its cached copy of the most recent resume token so that no change stream events are lost.

In the event of an application failure or restart, the application will need to pass the resume token when creating the change stream in order to ensure no change stream events are lost. Keep in mind that the driver will lose its cached copy of the most recent resume token when the application restarts, so your application should store the resume token.
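As an illustration of that flow, here's a minimal, hypothetical sketch. The loadToken and saveToken persistence helpers are placeholders you'd implement yourself (backed by a file, a collection, etc.); the resumeAfter option is what tells watch() where to pick up:

```javascript
// Hypothetical sketch: persist each resume token, and pass the last
// saved token back to watch() after a restart. loadToken()/saveToken()
// are placeholder persistence helpers, not part of the driver.
async function monitorWithResume(client, loadToken, saveToken) {
    const collection = client.db("sample_airbnb").collection("listingsAndReviews");

    // On a fresh start loadToken() returns undefined; after a restart
    // it returns the last token we saved.
    const resumeToken = await loadToken();
    const options = resumeToken ? { resumeAfter: resumeToken } : {};

    const changeStream = collection.watch([], options);
    changeStream.on('change', async (next) => {
        console.log(next);
        await saveToken(next._id); // the resume token is stored in _id
    });
    return changeStream;
}
```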

For more information and sample code for resuming change streams, see the official documentation.

#What are MongoDB Atlas Triggers?

Change streams allow you to react immediately to changes in your database. However, if you want to monitor your database constantly, you have to ensure that the application running your change stream is always up and never misses an event. That's possible, but it can be challenging. This is where MongoDB Atlas Triggers come in.

MongoDB supports triggers in Atlas. Atlas Triggers allow you to execute functions in real time based on database events (just like change streams) or on scheduled intervals (like a cron job). Atlas Triggers have a few big advantages:

  • You don't have to worry about programming the change stream. You simply program the function that will be executed when the database event is fired.
  • You don't have to worry about managing the server where your change stream code is running. Atlas takes care of the server management for you.
  • You get a handy UI to configure your trigger, which means you have less code to write.

Atlas Triggers do have a few constraints. The biggest constraint I hit in the past was that functions did not support module imports (i.e. import and require). Recently, module import capabilities were added in beta. To learn more about functions and their constraints, see the official Stitch Function documentation.

#Create a MongoDB Atlas Trigger

Just as we did in earlier sections, let's look for new listings in the Sydney, Australia market. Instead of working locally in a code editor to create and monitor a change stream, we'll create a trigger in the Atlas web UI.

#Create a Trigger

Let's create an Atlas Trigger that will monitor the listingsAndReviews collection and call a function whenever a new listing is added in the Sydney, Australia market.

  1. Navigate to your project in Atlas.
  2. In the Services section of the left navigation pane, click Triggers.
  3. Click Add Trigger. The Add Trigger wizard will appear.
  4. In the Link Cluster(s) selection box, select your cluster that contains the sample_airbnb database and click Link. The changes will be deployed. The deployment may take a minute or two. Scroll to the top of the page to see the status.
  5. In the Select a cluster... selection box, select your cluster that contains the sample_airbnb database.
  6. In the Select a database name... selection box, select sample_airbnb.
  7. In the Select a collection name... selection box, select listingsAndReviews.
  8. In the Operation Type section, check the box beside Insert.
  9. In the Function code box, replace the commented code with a call to log the change event. The code should now look like the following:

    exports = function(changeEvent) {
        console.log(JSON.stringify(changeEvent.fullDocument));
    };
  10. We can create a $match statement to filter our change events just as we did earlier with the aggregation pipeline we passed to the change stream in our Node.js script. Expand the ADVANCED (OPTIONAL) section at the bottom of the page and paste the following in the Match Expression code box.

    {
        "fullDocument.address.country": "Australia",
        "fullDocument.address.market": "Sydney"
    }
  11. Click Save. The Trigger will be enabled. From that point on, the function to log the change event will be called whenever a new document in the Sydney, Australia market is inserted in the listingsAndReviews collection.

#Fire the Trigger

Now that we have the Trigger configured, let's create sample data that will fire the trigger.

  1. Return to the shell on your local machine.
  2. Create and update sample data by executing node changeStreamsTestData.js in a new shell.

#View the Trigger Results

When you created the Trigger, MongoDB Atlas automatically created a Stitch application for you named Triggers_StitchApp.

The function associated with your trigger doesn't currently do much. It simply prints the change event document. Let's view the results in the logs of the Stitch app associated with your trigger.

  1. Return to your browser where you are viewing your trigger in Atlas.
  2. In the Services section of the left navigation pane, click Stitch.
  3. In the Stitch Applications pane, click Triggers_StitchApp. The Triggers_StitchApp Stitch application will open.
  4. In the Manage section of the left navigation pane, click Logs. Two entries will be displayed in the Logs pane—one for each of the listings in the Sydney, Australia market that was inserted into the collection.
  5. Click the arrow at the beginning of each row in the Logs pane to expand the log entry. Here you can see the full document that was inserted.

If you insert more listings in the Sydney, Australia market, you can refresh the Logs page to see the change events.

#Wrapping Up

Today we explored four different ways to accomplish the same task of reacting immediately to changes in the database. We began by writing a Node.js script that monitored a change stream using Node.js's Built-in EventEmitter class. Next we updated the Node.js script to monitor a change stream using the MongoDB Node.js Driver's ChangeStream class. Then we updated the Node.js script to monitor a change stream using the Stream API. Finally, we created an Atlas trigger to monitor changes. In all four cases, we were able to use $match to filter the change stream events.

This post included many code snippets that built on code written in the first post of this MongoDB and Node.js Quick Start series. To get a full copy of the code used in today's post, visit the Node.js Quick Start GitHub Repo.

The examples we explored today all did relatively simple things whenever an event was fired: they logged the change events. Change streams and triggers become really powerful when you start doing more in response to change events. For example, you might want to fire alarms, send emails, place orders, update other systems, or do other amazing things.

This is the final post in the Node.js and MongoDB Quick Start Series (at least for now!). I hope you've enjoyed it! If you have ideas for other topics you'd like to see covered, drop a comment below.

#Additional Resources

#Series Versions

This blog post was created with the following application versions:

  • MongoDB: 4.0
  • MongoDB Node.js Driver: 3.3
  • Node.js: 10.16.3

More from this series

NodeJS Tutorials
  • Connect to a MongoDB Database Using Node.js
  • MongoDB and Node.js Tutorial - CRUD Operations
  • Aggregation Framework with Node.js Tutorial
  • How to Use MongoDB Transactions in Node.js
  • Change Streams & Triggers with Node.js Tutorial