Is there any way to update docs without delete-and-insert, or without updating/replacing them one at a time?

Hello everyone. I have a set of data (around 100 docs every second).
I use the PHP driver and Mongo 3.2.
So when I needed to update the data, I was using:

    foreach ($dataset as $key => $data) {
        $filter = array('id' => $data['id']);
        $options = array('upsert' => true);
        $mongo->collection->replaceOne($filter, $data, $options); // upsert each doc one by one
    }

But when I have around 200 docs per second, I see that my Mongo is loaded a lot. So I changed it to:

    $array_data = getData();     // here we collect the updated data docs
    $array_data_ids = getIds();  // here we collect the data ids for the filter
    $filter['id'] = array('$in' => $array_data_ids);
    $mongo->collection->deleteMany($filter);     // delete the old docs with the ids of the new data
    $mongo->collection->insertMany($array_data); // insert the new data

And I see that this method works better than the previous one and does not overload Mongo.
But sometimes I get a duplicate key error, because another process uses the same collection for the same data.

The question is:
is there any way to update docs without delete-and-insert, or without updating/replacing them one at a time?

Hey @1114

It seems that you are looking for updateMany in the PHP driver.
There is also updateMany in the mongo shell.

Maybe I don't understand clearly how I can use it.
But as far as I know, updateMany "updates all documents that match the specified filter for a collection."
And there is no single filter for the whole collection here.
We are talking about each individual document in the collection.
I'm looking for something like a replaceMany function with upsert:true. When the same unique key already exists in the collection, the function should update the data; when it doesn't, it should just execute a simple insertOne.
Something like that, I think.
Does anybody know of something like this?

Then maybe do a bulk write with an array of updateOne operations.
That way each operation can use the _id as its query condition and has access to an upsert flag.
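A minimal sketch of this approach, assuming the mongodb/mongodb library's `MongoDB\Collection::bulkWrite` and a custom `id` field as in the original code (the `buildBulkOps` helper name is made up for illustration). Each operation carries its own filter and its own `upsert` flag, and the whole batch is sent in a single call, so there is no delete/insert window for another process to race into:

```php
<?php
// Hypothetical helper: turn a batch of docs into bulkWrite operations.
// Each entry is one updateOne op: [filter, update, options].
function buildBulkOps(array $dataset): array
{
    $ops = [];
    foreach ($dataset as $data) {
        $ops[] = [
            'updateOne' => [
                ['id' => $data['id']],  // filter: match on our own id field
                ['$set' => $data],      // update: overwrite the stored fields
                ['upsert' => true],     // insert the doc if no match exists
            ],
        ];
    }
    return $ops;
}

// Executed in one round trip against a live collection, e.g.:
// $result = $collection->bulkWrite(buildBulkOps($dataset), ['ordered' => false]);
```

Passing `'ordered' => false` lets the server keep processing the remaining operations even if one of them fails, which fits a steady stream of independent per-document upserts.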


Thank you very much. It was exactly what I was looking for. I ran some tests to see how much time it takes: about the same as deleteMany plus insertMany for 1000 records.
Thanks a lot!
