Importing JSON with a few large documents

I have millions of documents in JSON, some of which are over 16 MB, and I want to import this JSON into MongoDB.

However, the large documents (over 16 MB) are not imported.

Is there a way to import those documents?
Is the only option to use “mongofiles”?

If you are importing JSON documents over 16 MB, the only option is to use GridFS and mongofiles. If your JSONs are this big because of media content such as images, you can reshape your data to store that media on another service like S3 or another object store, and keep only references to the media in your JSONs, which brings the document size down dramatically.
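For reference, here is a minimal sketch of putting an oversized JSON file into GridFS with PyMongo; the connection string, database name, and file name are placeholders, not anything from your setup.

```python
import json
import gridfs
from pymongo import MongoClient

# Placeholder connection string and database name -- adjust for your deployment.
client = MongoClient("mongodb://localhost:27017")
db = client["factory_metadata"]
fs = gridfs.GridFS(db)

# Load one oversized JSON document from disk (hypothetical file name).
with open("big_document.json", "rb") as f:
    raw = f.read()

# GridFS splits the payload into 255 kB chunks, so the 16 MB BSON limit does not apply.
file_id = fs.put(raw, filename="big_document.json", contentType="application/json")
print("Stored in GridFS with _id:", file_id)

# Reading it back later:
stored = json.loads(fs.get(file_id).read())
```

The rough equivalent with the mongofiles CLI would be something like `mongofiles --db factory_metadata put big_document.json`. Note that GridFS stores the JSON as an opaque blob, so you cannot query fields inside it the way you can with a normal document.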

If your JSONs are large because they contain too much array data, you need to model your documents properly; arrays that grow without bound indefinitely are an anti-pattern, as shown in the sketch below.
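As an illustration of what such remodeling could look like (the collection names, fields, and values here are assumptions, not taken from your data): instead of one document with an ever-growing array of readings, each reading becomes its own small document that references its parent.

```python
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["factory_metadata"]

# Anti-pattern: one document whose "readings" array grows without bound
# and can eventually exceed the 16 MB document limit:
# db.machines.update_one({"_id": machine_id}, {"$push": {"readings": reading}})

# Remodeled: each reading is a separate, small document that references its machine.
machine_id = db.machines.insert_one({"name": "press-01", "site": "plant-A"}).inserted_id

readings = [
    {"machine_id": machine_id, "ts": "2021-06-01T00:00:00Z", "temperature": 71.2},
    {"machine_id": machine_id, "ts": "2021-06-01T00:01:00Z", "temperature": 71.5},
]
db.readings.insert_many(readings)

# Query a machine's readings without ever loading one giant document.
for doc in db.readings.find({"machine_id": machine_id}).sort("ts", 1):
    print(doc["ts"], doc["temperature"])
```

With this shape, individual documents stay far below 16 MB and the data remains queryable and indexable, unlike a GridFS blob.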

Can you please mention why these JSONs are so big? What do they contain?


OMG… I’ll have to remodel my data.
OK, thank you. :smiley:

They are just factory-generated metadata.
