Creating a unique constraint with partial filter expressions

I am trying to create a unique constraint with a partialFilterExpression, but the partial filter expression only applies the last condition. Is there a better way to handle multiple conditions when creating an index with a partial filter expression? It should create the index where type equals 'ROLE' or 'RANK', but only the last condition is applied.

Welcome to the community, @Michael_Sor!

Notice that you use the same key twice in the filter object:

{ $eq: 'RANK', $eq: 'ROLE' }

A MongoDB query object cannot have two keys with the same name (in your case, $eq). If you declare two keys with the same name in an object, the last one (here, $eq: 'ROLE') overwrites all the previous ones.

You cannot use the $or operator, as it is not supported by partialFilterExpression.

Alternatively, at the application level, you can add a field 'useByPartialIndex' and set it to true whenever 'type' is either 'RANK' or 'ROLE', and then include that field in your partialFilterExpression.

Hi @Michael_Sor,

I would suggest testing the following partial expression:

{ "type" : [ "ROLE" , "RANK"] }

MongoDB can compare a scalar against an array of values, similar to an $in expression.

Let us know if that works.

Best regards
Pavel

Thanks for the prompt response. When I tried the array method, it still allowed duplicate entries.

Hi @Michael_Sor,

Not sure what you mean by duplicate entries. Can you show the documents that have a duplicate name and type with one of the specified types?

Hey @Pavel_Duchovny,

The combination of name and type should be unique, but it allows me to add a duplicate entry with name: 'E-2' and type: 'RANK'. I expected MongoDB to complain that 'E-2' of type 'RANK' already exists.

Mike

Oh, I see…

The unique constraint looks at the literal shape of the filter document rather than evaluating it as a query, so it will enforce uniqueness only for documents whose type is exactly that array.

Yes, in this case you need a computed field, or you need to move the uniqueness enforcement into a separate collection.

Hi @Michael_Sor,

Actually, I have a nice workaround for you, suggested by a colleague of mine.

You can work around uniqueness across several types by creating a partial index for each type. The trick is to add a dummy field, which will never be written in your documents, to the keys object, so that multiple indexes can be created on "name" and "type":

db.unqTest.createIndex( { "name" : 1, "type" : 1, "role_unq" : 1}, {"unique" : true, "partialFilterExpression" : { "type" : "ROLE" } });
db.unqTest.createIndex( { "name" : 1, "type" : 1, "rank_unq" : 1}, {"unique" : true, "partialFilterExpression" : { "type" : "RANK" } });
db.unqTest.createIndex( { "name" : 1, "type" : 1, "class_unq" : 1}, {"unique" : true, "partialFilterExpression" : { "type" : "CLASSIFICATION" } });

Now only one document with a given name and type can be inserted for each of those types:

> db.unqTest.insert({name : "E-2" , "type" : "ROLE"});
WriteResult({ "nInserted" : 1 })
> db.unqTest.insert({name : "E-2" , "type" : "ROLE"});
WriteResult({
	"nInserted" : 0,
	"writeError" : {
		"code" : 11000,
		"errmsg" : "E11000 duplicate key error collection: test.unqTest index: name_1_type_1_role_unq_1 dup key: { name: \"E-2\", type: \"ROLE\", role_unq: null }"
	}
})
> db.unqTest.insert({name : "E-2" , "type" : "CLASSIFICATION"});
WriteResult({ "nInserted" : 1 })
> db.unqTest.insert({name : "E-2" , "type" : "CLASSIFICATION"});
WriteResult({
	"nInserted" : 0,
	"writeError" : {
		"code" : 11000,
		"errmsg" : "E11000 duplicate key error collection: test.unqTest index: name_1_type_1_class_unq_1 dup key: { name: \"E-2\", type: \"CLASSIFICATION\", class_unq: null }"
	}
})

While other types still allow duplicates:

> db.unqTest.insert({name : "E-2" , "type" : "DUMMY"});
WriteResult({ "nInserted" : 1 })
> db.unqTest.insert({name : "E-2" , "type" : "DUMMY"});
WriteResult({ "nInserted" : 1 })
>

Please let me know if you have any additional questions.

Best regards,
Pavel


Thanks, Pavel. I will look into this to see if this is how we want to proceed.