Ticket: Migration

Hello, I am unable to pass the migration test even though nothing seems wrong with the code.

The code:

const MongoClient = require("mongodb").MongoClient
const ObjectId = require("mongodb").ObjectId
const MongoError = require("mongodb").MongoError
require("dotenv").config()

/**
 * Ticket: Migration
 *
 * Update all the documents in the `movies` collection, such that the
 * "lastupdated" field is stored as an ISODate() rather than a string.
 *
 * The Date.parse() method built into JavaScript will prove very useful here!
 * Refer to http://mongodb.github.io/node-mongodb-native/3.1/tutorials/crud/#bulkwrite
 */

// This leading semicolon (;) is to signify to the parser that this is a new expression. This expression is an
// Immediately Invoked Function Expression (IIFE). It's being used to wrap this logic in an asynchronous function
// so we can use await within.
// To read more about this type of expression, refer to https://developer.mozilla.org/en-US/docs/Glossary/IIFE
;(async () => {
  try {
    const host = process.env.MFLIX_DB_URI
    const client = await MongoClient.connect(host, { useNewUrlParser: true })
    const mflix = client.db(process.env.MFLIX_NS)

    // TODO: Create the proper predicate and projection
    // add a predicate that checks that the `lastupdated` field exists, and then
    // check that its type is a string
    // a projection is not required, but may help reduce the amount of data sent
    // over the wire!
    const predicate = { lastupdated: { $exists: true } }
    const projection = {}
    const movies = await mflix
      .collection("movies")
      .find(predicate, projection)
      .toArray()
    const moviesToMigrate = movies.map(({ _id, lastupdated }) => ({
      updateOne: {
        filter: { _id: ObjectId(_id) },
        update: {
          $set: { lastupdated: new Date(Date.parse(lastupdated)) },
        },
      },
    }))
    console.log(
      "\x1b[32m",
      `Found ${moviesToMigrate.length} documents to update`,
    )
    // TODO: Complete the BulkWrite statement below
    const { modifiedCount } = await mflix
      .collection("movies")
      .bulkWrite(moviesToMigrate)

    console.log("\x1b[32m", `${modifiedCount} documents updated`)
    client.close()
    process.exit(0)
  } catch (e) {
    if (
      e instanceof MongoError &&
      e.message.slice(0, "Invalid Operation".length) === "Invalid Operation"
    ) {
      console.log("\x1b[32m", "No documents to update")
    } else {
      console.error("\x1b[31m", `Error during migration, ${e}`)
    }
    process.exit(1)
  }
})()
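For reference, the string-to-Date conversion used in the $set above can be checked on its own (a minimal sketch; the sample value below is made up, though mflix's actual lastupdated strings look similar):

```javascript
// The mflix "lastupdated" field is stored as a plain string. Date.parse()
// converts it to milliseconds since the epoch, and new Date() wraps that so
// the driver serializes it as a BSON date (shown as ISODate() in the shell).
const lastupdated = "2015-09-04T00:00:00.340Z" // hypothetical sample value
const asDate = new Date(Date.parse(lastupdated))
console.log(asDate instanceof Date) // true
console.log(asDate.toISOString())   // "2015-09-04T00:00:00.340Z"
```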

The output: (screenshot omitted)

First, you are not supposed to post code that is a potential answer.

You are missing at least the following:

 const predicate = { lastupdated: { $exists: true, $type: "string" } }

This does not work and gives the same error as above.

Tried this solution (Ticket: Migration - bulkWrite) but still doesn’t work.
The output of node movie-last-updated-migration.js: (screenshot omitted)

Hi @Yi_32599,

The code seems right if you have added the predicate. :slight_smile: Can you check whether the host URI is being read correctly from the config file?
You can add a console.log inside and let's see if we are getting it right.

Kanika

Hello @kanikasingla,

I have tried 2 different ways of connecting to my cluster. The one mentioned above uses method 2.
I have also tried reinstalling node_modules as suggested by StackOverflow but to no avail.

1. Default code:
const host = process.env.MFLIX_DB_URI
const client = await MongoClient.connect(host, { useNewUrlParser: true })
const mflix = client.db(process.env.MFLIX_NS)

console.log(host)

Node output:
Error during migration, MongoParseError: URI malformed, cannot be parsed

npm test output: (screenshot omitted)

2. Using the connection string directly in the client:
const host = process.env.MFLIX_DB_URI
const client = await MongoClient.connect("mongodb+srv://m220student:m220password@mflix-dij8z.mongodb.net/test?retryWrites=true&w=majority", { useNewUrlParser: true })
const mflix = client.db("mflix")

console.log(client)

Node output: (screenshot omitted)

npm test output: (screenshot omitted)

The database name here should be sample_mflix, not mflix. Make sure you are using the latest dataset and handouts; otherwise you will get different results.

Kanika