I only have the mongo shell installed, not the full Enterprise package, so to import the data from people.json I wrote a small Python script:
#!/usr/bin/env python3
import argparse
from bson import json_util
import pymongo

parser = argparse.ArgumentParser()
parser.add_argument("uri", type=str)
parser.add_argument("database", type=str)
parser.add_argument("collection", type=str)
parser.add_argument("file", type=argparse.FileType("rb"))
parser.add_argument("--clean", action="store_true")
args = parser.parse_args()

client = pymongo.MongoClient(args.uri)
collection = client[args.database][args.collection]

# With --clean, wipe the collection before importing.
if args.clean:
    res = collection.delete_many({})
    print("Deleted:", res.deleted_count)

# Read one Extended JSON document per line and insert in batches of 100.
docs = []
for line in args.file:
    data = json_util.loads(line, json_options=json_util.RELAXED_JSON_OPTIONS)
    docs.append(data)
    if len(docs) >= 100:
        collection.insert_many(docs, ordered=False)
        docs = []
        print("1", end="", flush=True)  # progress marker, one per batch

# Insert whatever is left over from the last partial batch.
if docs:
    collection.insert_many(docs, ordered=False)
    print("1", end="", flush=True)

print("\nSUCCESS")
To run it you need pymongo and dnspython installed:
$ pip install dnspython pymongo
Then you can use it:
$ python3 importer.py --clean "mongodb+srv://<username>:<password>@<cluster>.mongodb.net/admin" ppl people people.json
- ppl is the database to import into
- people is the collection
- people.json is the file with the data
It inserts the documents in batches of 100 to avoid a timeout on large files.
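Since the script reads the file line by line, people.json is expected to be newline-delimited Extended JSON with one document per line (the format mongoexport produces). A minimal example of what such a file might look like (the field names and values here are made up for illustration):

{ "_id": { "$oid": "5f1a2b3c4d5e6f7a8b9c0d1e" }, "name": "Alice", "age": 30 }
{ "_id": { "$oid": "5f1a2b3c4d5e6f7a8b9c0d1f" }, "name": "Bob", "age": 25 }

A file containing one large JSON array instead of one document per line would need to be parsed differently (e.g. loaded as a whole with json_util.loads).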