Does the Spark connector support "INSERT IGNORE"?

I'm looking for an "INSERT IGNORE"-like feature in mongodb-spark-connector.
There's a unique key spanning multiple columns in MongoDB, and I wrote a daily batch job running on Spark. The batch should be retriable, i.e. idempotent, so when writing to the db I want to ignore duplicate-key errors.
I've looked at the SaveMode.Overwrite implementation, but it just drops the collection, which is not what I'm looking for.

Is there a way to insert while ignoring duplicate-key errors?
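The closest workaround I've found so far is the replace/upsert mode in connector 10.x, sketched below (database, collection, path, and field names are all illustrative, not my real schema). As I understand it, `operationType=replace` with `idFieldList` upserts on the given key instead of failing on duplicates:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("daily-batch")
  .getOrCreate()

// Hypothetical daily input; in my case the unique key spans several columns.
val df = spark.read.parquet("/path/to/daily/batch")

df.write
  .format("mongodb")          // connector 10.x data source name
  .mode("append")
  .option("database", "mydb")
  .option("collection", "daily_stats")
  // "replace" matches existing documents on idFieldList and replaces them,
  // so a rerun overwrites yesterday's rows instead of raising duplicate-key errors
  .option("operationType", "replace")
  .option("idFieldList", "day,userId") // the columns of the unique key
  .save()
```

This is idempotent, but it's "last write wins" rather than a true insert-and-ignore, which is why I'm asking whether the latter exists.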