Spark plugin - too many open cursors

I am getting this error:

Caused by: com.mongodb.MongoCommandException: Command failed with error 2: 'Cannot open a new cursor since too many cursors are already opened' on server server_dns:27017. The full response is {"ok": 0.0, "errmsg": "Cannot open a new cursor since too many cursors are already opened", "code": 2}

I think the plugin is opening too many connections to the database. I have tried appending

&MaxPoolSize=1

to the MongoDB URI (sketch below), but I found nothing about how to limit the number of open cursors. I also cannot increase the number of allowed open cursors in the database configuration itself.
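For context, this is roughly how the read is configured (the host, database, and collection names are placeholders, and the format name assumes the v2.x connector):

```python
# Sketch of my Spark read configuration. maxPoolSize caps connections in
# the driver's pool but does not bound open cursors.
# server_dns, mydb and mycoll are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongo-read")
    .config("spark.mongodb.input.uri",
            "mongodb://server_dns:27017/mydb.mycoll?maxPoolSize=1")
    .getOrCreate()
)

df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
```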

How do I limit/control the number of open cursors used by the Apache Spark plugin for MongoDB?

Hi @Dan_S,

Welcome to the MongoDB community!

maxPoolSize limits the number of connections, but not cursors. Cursors are controlled by the application, which can set a timeout for them (idle cursors should also be timed out on the server after 10 minutes).

You should be able to kill open cursors manually with the killCursors database command.
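Here is a sketch of doing that from pymongo, assuming MongoDB 4.2+ (for $currentOp's idleCursors option) and sufficient privileges; the connection string is a placeholder:

```python
# Hedged sketch: list idle cursors via $currentOp and kill them with the
# killCursors command. Requires MongoDB 4.2+ (idleCursors option) and
# privileges to run $currentOp; the connection string is a placeholder.
from bson.int64 import Int64
from pymongo import MongoClient

client = MongoClient("mongodb://server_dns:27017")

# $currentOp must be the first stage of an aggregation on the admin db;
# idleCursors: True also reports cursors that are open but not running.
idle = client.admin.aggregate([
    {"$currentOp": {"allUsers": True, "idleCursors": True}},
    {"$match": {"type": "idleCursor"}},
])

for op in idle:
    db_name, coll_name = op["ns"].split(".", 1)   # ns is "db.collection"
    cursor_id = Int64(op["cursor"]["cursorId"])   # killCursors wants int64
    client[db_name].command({"killCursors": coll_name, "cursors": [cursor_id]})
```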

Also check, on the Spark side, whether you are opening cursors with the noCursorTimeout flag, or simply running many queries simultaneously.
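For reference, this is what a no-timeout cursor looks like in application code (pymongo shown for illustration; the connection string and namespace are placeholders). The server never reaps such a cursor on its own:

```python
# Illustration (pymongo): a cursor opened with no_cursor_timeout=True is
# never reaped by the server's 10-minute idle timeout, so it must be
# closed explicitly. mydb/mycoll are placeholders.
from pymongo import MongoClient

client = MongoClient("mongodb://server_dns:27017")
cursor = client["mydb"]["mycoll"].find({}, no_cursor_timeout=True)
try:
    for doc in cursor:
        pass  # process documents
finally:
    cursor.close()  # without this, the cursor can leak on the server
```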

Best
Pavel

Hi Pavel,

Thanks for your response.

The problem is that in Spark I am using the MongoDB plugin for Spark, which provides a much higher-level interface to the database than the pymongo module. Judging by its behaviour, I assume the plugin is aggressively opening multiple cursors to the database, and I do not believe this behaviour can be directly controlled by the plugin’s user.
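The only indirect control I can see is the partitioner configuration, which sets the read parallelism and therefore how many cursors run at once; a sketch, assuming v2.x option names and placeholder host/namespace:

```python
# Hedged sketch: forcing a single read partition so only one query cursor
# is open at a time. Option names assume MongoDB Spark connector v2.x;
# the host, database, and collection are placeholders.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongo-single-partition-read")
    .config("spark.mongodb.input.uri",
            "mongodb://server_dns:27017/mydb.mycoll")
    .config("spark.mongodb.input.partitioner",
            "MongoPaginateByCountPartitioner")
    .config("spark.mongodb.input.partitionerOptions.numberOfPartitions", "1")
    .getOrCreate()
)

df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
```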

Hi @Dan_S,

I can see if our Spark team could help us more. Can you share your MongoDB version, topology, and Spark connector version?

Best
Pavel