Kafka connector: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig

Dear experts,
I am running Kafka 2.7.0 by means of the Strimzi operator 0.22.1 and facing an issue with the MongoDB Source Connector (the MongoDB Sink Connector, by the way, is working fine), with both Confluent MongoDB Connector 1.5.0 and 1.4.0.
Thanks for your support.
Best regards.

2021-04-12 14:48:34,634 WARN /connectors/mongo-source/config (org.eclipse.jetty.server.HttpChannel) [qtp1386677799-23] 
javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig 
	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:410) 
	at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:346) 
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:366) 
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:319) 
	at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:205) 
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:763) 
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:563) 
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) 
	at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:1612) 
	at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233) 
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1434) 
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188) 
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:501) 
	at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:1582) 
	at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186) 
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1349) 
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141) 
	at org.eclipse.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234) 
	at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:179) 
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127) 
	at org.eclipse.jetty.server.Server.handle(Server.java:516) 
	at org.eclipse.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:383) 
	at org.eclipse.jetty.server.HttpChannel.dispatch(HttpChannel.java:556) 
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:375) 
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:273) 
	at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311) 
	at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:105) 
	at org.eclipse.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104) 
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:336) 
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:313) 
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:171) 
	at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:129) 
	at org.eclipse.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:375) 
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:773) 
	at org.eclipse.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:905) 
	at java.base/java.lang.Thread.run(Thread.java:834) 
Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig 
	at org.glassfish.jersey.servlet.internal.ResponseWriter.rethrow(ResponseWriter.java:254) 
	at org.glassfish.jersey.servlet.internal.ResponseWriter.failure(ResponseWriter.java:236) 
	at org.glassfish.jersey.server.ServerRuntime$Responder.process(ServerRuntime.java:436) 
	at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:261) 
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:248) 
	at org.glassfish.jersey.internal.Errors$1.call(Errors.java:244) 
	at org.glassfish.jersey.internal.Errors.process(Errors.java:292) 
	at org.glassfish.jersey.internal.Errors.process(Errors.java:274) 
	at org.glassfish.jersey.internal.Errors.process(Errors.java:244) 
	at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:265) 
	at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:232) 
	at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:680) 
	at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:394) 
	... 35 more 
Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig

I get a similar reply from the Kafka Connect REST interface. The connection.uri is not valid yet, since my MongoDB setup is not ready. Could that be the root cause of this error? Or is there an error in the uploaded jar file?
Thanks for your help.

[kafka@data-pulse-connect-7896656445-6hkhd kafka]$ curl -X PUT http://localhost:8083/connector-plugins/MongoSourceConnector/config/validate -H "Content-Type: application/json" -d '{
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "tasks.max": "1",
    "topics": "test-topic"
}'
{"error_code":500,"message":"java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig"}

Once logged into the Kafka Connect pod, I can find the class com.mongodb.kafka.connect.source.MongoSourceConfig from unzip -v /opt/kafka/plugin/mongodb-kafka-connect-mongodb-1.5.0/mongo-kafka-1.5.0-all.jar:

  946  Defl:N    449  53% 03-30-2021 11:16 08037467  com/mongodb/kafka/connect/source/json/formatter/DefaultJson.class
  449  Defl:N    236  47% 03-30-2021 11:16 abcf0628  com/mongodb/kafka/connect/source/Configurable.class
27178  Defl:N  10773  60% 03-30-2021 11:16 3a0b8370  com/mongodb/kafka/connect/source/MongoSourceTask.class
12486  Defl:N   5268  58% 03-30-2021 11:16 3b557e42  com/mongodb/kafka/connect/source/MongoCopyDataManager.class
29429  Defl:N  10384  65% 03-30-2021 11:16 4b1378af  com/mongodb/kafka/connect/source/MongoSourceConfig.class
    0  Defl:N      2   0% 03-30-2021 11:16 00000000  com/mongodb/kafka/connect/source/topic/
    0  Defl:N      2   0% 03-30-2021 11:16 00000000  com/mongodb/kafka/connect/source/topic/mapping/
  268  Defl:N    191  29% 03-30-2021 11:16 479d034a  com/mongodb/kafka/connect/source/topic/mapping/TopicMapper.class
 4987  Defl:N   2241  55% 03-30-2021 11:16 c88397f0  com/mongodb/kafka/connect/source/topic/mapping/DefaultTopicMapper.class
 3564  Defl:N   1436  60% 03-30-2021 11:16 9e0601d8  com/mongodb/kafka/connect/source/MongoSourceConfig$1.class
 1356  Defl:N    613  55% 03-30-2021 11:16 d0f294a9  com/mongodb/kafka/connect/source/MongoSourceConfig$OutputFormat.class
  415  Defl:N    287  31% 03-30-2021 11:16 70a7057b  com/mongodb/kafka/connect/Versions.class

Here https://mvnrepository.com/artifact/org.mongodb.kafka/mongo-kafka-connect/1.5.0 shows that it needs the avro and mongodb-driver-sync dependencies.
I do not see any org.apache.avro classes in mongo-kafka-1.5.0-all.jar … are you sure all of them are in the -all.jar?
There is no output for the command jar tvf mongo-kafka-1.5.0-all.jar | grep avro

Here is one line from the error message output:
Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

I would like to inform you that I successfully set up my MongoDB Source Kafka Connector once I added the avro and mongodb-driver-sync jar files … and also after lots of tuning.

I get the same errors.

Avro not found

java.lang.NoClassDefFoundError: org/apache/avro/Schema
Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

MongoSourceConfig not found

Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig
Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig

How have you built the KafkaConnect? Have you created your own image?

I have done it with this Dockerfile.

cat Dockerfile
FROM quay.io/strimzi/kafka:0.22.1-kafka-2.7.0
USER root:root
RUN mkdir -p /opt/kafka/plugins
COPY ./mongo-plugins/ /opt/kafka/plugins
USER 1001

At first I only had mongo-kafka-1.5.0-all.jar, and then I added these versions of avro and mongodb-driver-sync, and they keep giving me the same errors.

ls -lrt mongo-plugins/
total 2972
-rw-r--r-- 1 users 2310134 May 5 16:49 mongo-kafka-1.5.0-all.jar
-rw-r--r-- 1 users  137343 May 5 17:09 mongodb-driver-sync-4.2.1.jar
-rw-r--r-- 1 users  590599 May 5 17:10 avro-1.10.2.jar

Could you tell me how you have built the image and what additional tuning you have done?

Dear Alberto,
I guess you have the correct jar files. Did you check the content of the target directory /opt/kafka/plugins/ in the built image? Either by means of the Dockerfile command RUN ls -lR /opt/kafka/plugins, or by connecting to the pod/container built from your image?
Best regards. Richard.

The .jar libraries are in the container in that path. I connected and they are OK.
The path is /opt/kafka/plugins, right?
Is there a jar file missing?

[kafka@mongodb-connect-cluster-dual-connect-79c96cd5f-p64mn kafka]$ ls -lrt /opt/kafka/plugins/
total 2972
-rw-r--r-- 1 root root 2310134 May  5 14:49 mongo-kafka-1.5.0-all.jar
-rw-r--r-- 1 root root  137343 May  5 15:09 mongodb-driver-sync-4.2.1.jar
-rw-r--r-- 1 root root  590599 May  5 15:10 avro-1.10.2.jar 

The KafkaConnect cluster is created OK and it goes to the Ready state.

NAME                        DESIRED REPLICAS   READY
mongodb-connect-community   1                  True 
mongodb-connect-community-connect-7cf7f798d-n7c2q   1/1     Running   0          31m

The definition of my KafkaConnect is:

kind: KafkaConnect
metadata:
  name: mongodb-connect-community
  annotations:
    strimzi.io/use-connector-resources: "true"
spec:
  image: xxxxxxx
  replicas: 1
  bootstrapServers: mycluster-kafka-bootstrap:9093
  tls:
    trustedCertificates:
      - secretName: mycluster-cluster-ca-cert
        certificate: ca.crt
  config:
    config.storage.replication.factor: 1
    offset.storage.replication.factor: 1
    status.storage.replication.factor: 1
    config.providers: file
    config.providers.file.class: org.apache.kafka.common.config.provider.FileConfigProvider

The definition of my KafkaConnector is:

apiVersion: kafka.strimzi.io/v1alpha1
kind: KafkaConnector
metadata:
  name: mongodb-source-connector-community
  labels:
    strimzi.io/cluster: mongodb-connect-community
spec:
  class: com.mongodb.kafka.connect.MongoSourceConnector
  tasksMax: 1
  config:
    connection.uri: xxxxxxx
    topic.prefix: mongo
    database: test
    collection: ships
    copy.existing: true
    key.converter: org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable: false
    value.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: false
    publish.full.document.only: true
    pipeline: >
      [{"$match":{"operationType":{"$in":["insert","update","replace"]}}},{"$project":{"_id":1,"fullDocument":1,"ns":1,"documentKey":1}}]

KafkaConnector is not Ready.

NAME                                 CLUSTER                     CONNECTOR CLASS                                  MAX TASKS   READY
mongodb-source-connector-community   mongodb-connect-community   com.mongodb.kafka.connect.MongoSourceConnector   1 

I get error messages saying that the MongoSourceConfig and avro classes cannot be found:

Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig
javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig
Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig
Caused by: java.lang.NoClassDefFoundError: Could not initialize class com.mongodb.kafka.connect.source.MongoSourceConfig

javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

Caused by: java.lang.NoClassDefFoundError: org/apache/avro/Schema

Caused by: java.lang.ClassNotFoundException: org.apache.avro.Schema

javax.servlet.ServletException: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

Caused by: org.glassfish.jersey.server.ContainerException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

What am I doing wrong in the procedure?

Best Regards
Alberto

Could you actually check for the missing class in your avro jar file from the command line: jar tvf avro-1.10.2.jar | grep Schema.class

Here is my output:
 1995 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$ArraySchema.class
  498 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$BooleanSchema.class
  490 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$BytesSchema.class
  494 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$DoubleSchema.class
 5075 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$EnumSchema.class
 2869 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$FixedSchema.class
  490 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$FloatSchema.class
  482 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$IntSchema.class
  486 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$LongSchema.class
 1982 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$MapSchema.class
 4401 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$NamedSchema.class
  486 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$NullSchema.class
 8106 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$RecordSchema.class
 1124 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$SerializableSchema.class
  494 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$StringSchema.class
 4136 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema$UnionSchema.class
33910 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/Schema.class
  510 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/reflect/AvroSchema.class
510 Tue Mar 09 17:44:24 CET 2021 org/apache/avro/reflect/AvroSchema.class

My output is exactly the same:

     1995  03-09-2021 17:44   org/apache/avro/Schema$ArraySchema.class
      498  03-09-2021 17:44   org/apache/avro/Schema$BooleanSchema.class
      490  03-09-2021 17:44   org/apache/avro/Schema$BytesSchema.class
      494  03-09-2021 17:44   org/apache/avro/Schema$DoubleSchema.class
     5075  03-09-2021 17:44   org/apache/avro/Schema$EnumSchema.class
     2869  03-09-2021 17:44   org/apache/avro/Schema$FixedSchema.class
      490  03-09-2021 17:44   org/apache/avro/Schema$FloatSchema.class
      482  03-09-2021 17:44   org/apache/avro/Schema$IntSchema.class
      486  03-09-2021 17:44   org/apache/avro/Schema$LongSchema.class
     1982  03-09-2021 17:44   org/apache/avro/Schema$MapSchema.class
     4401  03-09-2021 17:44   org/apache/avro/Schema$NamedSchema.class
      486  03-09-2021 17:44   org/apache/avro/Schema$NullSchema.class
     8106  03-09-2021 17:44   org/apache/avro/Schema$RecordSchema.class
     1124  03-09-2021 17:44   org/apache/avro/Schema$SerializableSchema.class
      494  03-09-2021 17:44   org/apache/avro/Schema$StringSchema.class
     4136  03-09-2021 17:44   org/apache/avro/Schema$UnionSchema.class
    33910  03-09-2021 17:44   org/apache/avro/Schema.class
      510  03-09-2021 17:44   org/apache/avro/reflect/AvroSchema.class

What image do you use in the Dockerfile?

I use Quay

Best Regards
Alberto

Same as yours:
FROM quay.io/strimzi/kafka:0.22.1-kafka-2.7.0

I have found the fault in my procedure: it was the path where the plugins are placed.
It was not /opt/kafka/plugins directly, but, as you indicated, /opt/kafka/plugins/mongodb-kafka-connect-mongodb-1.5.0.
Once this is done, the connector works OK and goes to the Ready state, and the messages about the avro and mongodb source jar libraries no longer appear.
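For anyone hitting the same error, a minimal sketch of the build-context layout that made it work (jar names are the ones from this thread; empty placeholder files stand in for the real jars just to show the tree):

```shell
# Each connector plugin goes in its own subdirectory of the plugin path,
# together with all of its dependency jars (avro, mongodb-driver-sync).
mkdir -p mongo-plugins/mongodb-kafka-connect-mongodb-1.5.0
touch mongo-plugins/mongodb-kafka-connect-mongodb-1.5.0/mongo-kafka-1.5.0-all.jar \
      mongo-plugins/mongodb-kafka-connect-mongodb-1.5.0/mongodb-driver-sync-4.2.1.jar \
      mongo-plugins/mongodb-kafka-connect-mongodb-1.5.0/avro-1.10.2.jar
# The Dockerfile's COPY ./mongo-plugins/ /opt/kafka/plugins then produces
# /opt/kafka/plugins/mongodb-kafka-connect-mongodb-1.5.0/*.jar in the image.
ls mongo-plugins/mongodb-kafka-connect-mongodb-1.5.0
```

With this layout, Kafka Connect loads the subdirectory as one plugin, so the connector jar can see the avro and driver jars sitting next to it.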

mongodb-source-connector-community   mongodb-connect-community   com.mongodb.kafka.connect.MongoSourceConnector   1           True

Best Regards
Alberto

Once the KafkaConnector works, I have seen in the KafkaConnect logs that it connects to the MongoDB database, but it gives this "WARN No topic set. Could not publish the message" error message.

2021-05-06 15:05:19,134 INFO WorkerSourceTask{id=mongodb-source-connector-community-0} Source task finished initialization and start (org.apache.kafka.connect.runtime.WorkerSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,172 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3dd3115520492995fa"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3dd3115520492995fa"}}, "fullDocument": {"_id": {"$oid": "6093af3dd3115520492995fa"}, "name": "USS Enterprise-D", "operator": "Starfleet", "type": "Explorer", "class": "Galaxy", "crew": 750.0, "codes": [10.0, 11.0, 12.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,175 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3dd3115520492995fb"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3dd3115520492995fb"}}, "fullDocument": {"_id": {"$oid": "6093af3dd3115520492995fb"}, "name": "USS Prometheus", "operator": "Starfleet", "class": "Prometheus", "crew": 4.0, "codes": [1.0, 14.0, 17.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,177 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3dd3115520492995fc"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3dd3115520492995fc"}}, "fullDocument": {"_id": {"$oid": "6093af3dd3115520492995fc"}, "name": "USS Defiant", "operator": "Starfleet", "class": "Defiant", "crew": 50.0, "codes": [10.0, 17.0, 19.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,178 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3dd3115520492995fd"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3dd3115520492995fd"}}, "fullDocument": {"_id": {"$oid": "6093af3dd3115520492995fd"}, "name": "IKS Buruk", "operator": " Klingon Empire", "class": "Warship", "crew": 40.0, "codes": [100.0, 110.0, 120.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,183 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3dd3115520492995fe"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3dd3115520492995fe"}}, "fullDocument": {"_id": {"$oid": "6093af3dd3115520492995fe"}, "name": "IKS Somraw", "operator": " Klingon Empire", "class": "Raptor", "crew": 50.0, "codes": [101.0, 111.0, 120.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,185 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3dd3115520492995ff"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3dd3115520492995ff"}}, "fullDocument": {"_id": {"$oid": "6093af3dd3115520492995ff"}, "name": "Scimitar", "operator": "Romulan Star Empire", "type": "Warbird", "class": "Warbird", "crew": 25.0, "codes": [201.0, 211.0, 220.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,188 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093af3ed311552049299600"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093af3ed311552049299600"}}, "fullDocument": {"_id": {"$oid": "6093af3ed311552049299600"}, "name": "Narada", "operator": "Romulan Star Empire", "type": "Warbird", "class": "Warbird", "crew": 65.0, "codes": [251.0, 251.0, 220.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]
2021-05-06 15:05:19,189 WARN No topic set. Could not publish the message: {"_id": {"_id": {"$oid": "6093bf4dd311552049299601"}, "copyingData": true}, "documentKey": {"_id": {"$oid": "6093bf4dd311552049299601"}}, "fullDocument": {"_id": {"$oid": "6093bf4dd311552049299601"}, "name": "USS Enterprise-D", "operator": "Starfleet", "type": "Explorer", "class": "Galaxy", "crew": 750.0, "codes": [10.0, 11.0, 12.0]}} (com.mongodb.kafka.connect.source.MongoSourceTask) [task-thread-mongodb-source-connector-community-0]

It seems that it connects to the MongoDB database, but then it cannot copy the data to a topic because the topic is not defined.

The topic in Kafka is supposed to be named topic.prefix.database.collection, which in my case is mongo.test.ships.
Is this topic created on the fly, or is it necessary to create it beforehand?
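The naming rule can be sketched as follows (values taken from the connector config in this thread). Whether the topic must exist beforehand depends on the broker's auto.create.topics.enable setting, so pre-creating it with kafka-topics.sh is the safe option:

```shell
# The source connector publishes to <topic.prefix>.<database>.<collection>.
topic_prefix=mongo
database=test
collection=ships
topic="${topic_prefix}.${database}.${collection}"
echo "$topic"   # mongo.test.ships
# To pre-create it (broker address is a placeholder):
#   bin/kafka-topics.sh --bootstrap-server localhost:9092 --create --topic mongo.test.ships
```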

Did you have this error?

Am I missing something in the KafkaConnector yaml?

kind: KafkaConnector
metadata:
  name: mongodb-source-connector-community
  labels:
    strimzi.io/cluster: mongodb-connect-community
spec:
  class: com.mongodb.kafka.connect.MongoSourceConnector
  tasksMax: 1
  config:
    connection.uri: xxxxxxxxxxxx
    topic.prefix: mongo
    database: test
    collection: ships
    copy.existing: true
    key.converter: org.apache.kafka.connect.json.JsonConverter
    key.converter.schemas.enable: false
    value.converter: org.apache.kafka.connect.json.JsonConverter
    value.converter.schemas.enable: false
    publish.full.document.only: true
    pipeline: >
      [{"$match":{"operationType":{"$in":["insert","update","replace"]}}},{"$project":{"_id":1,"fullDocument":1,"ns":1,"documentKey":1}}]

Best regards
Thanks
Alberto

Sorry, I have checked the operation again and everything works OK. Both the source and sink connectors work OK.
Thank you

I’m able to start Kafka Connect without the MongoDB source connector, but when I try to start it with the source connector as well, I get the following error:

java.util.concurrent.ExecutionException: java.lang.NoClassDefFoundError: org/apache/avro/Schema

my worker properties look like:

bootstrap.servers=localhost:9092

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

offset.storage.file.filename=/tmp/connect.offsets
offset.flush.interval.ms=10000

plugin.path=/opt/homebrew/Cellar/kafka/3.6.1/connectorLib

my mongo source connector properties look like:

connector.class=com.mongodb.kafka.connect.MongoSourceConnector
connection.uri=mongodb://localhost:27017/changeStream?replicaSet=rs0
database=changeStream
collection=author

output.format.key=json
output.format.value=json

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false

value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

pipeline=[{"$match": {"operationType": "insert"}}, {"$project": { "fullDocument._id": 0} } ]

What am I doing wrong here? I start Kafka Connect like:

bin/connect-standalone connectorConfig/demoMongoConfig/worker.properties connectorConfig/demoMongoConfig/source.properties

Please help.
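Based on the fix found earlier in this thread, one likely cause is that the jars sit loose in connectorLib instead of in a per-plugin subdirectory. A hedged sketch of the layout to try (the subdirectory name is just a label; placeholder files stand in for the real jars, and the exact jar versions for Kafka 3.6.1 may differ):

```shell
# Kafka Connect treats each immediate subdirectory of plugin.path as one
# isolated plugin, so the connector jar and its dependencies (e.g. avro)
# must live together in one subdirectory, not loose in the plugin.path root.
PLUGIN_PATH=connectorLib   # stands in for /opt/homebrew/Cellar/kafka/3.6.1/connectorLib
mkdir -p "$PLUGIN_PATH/mongo-kafka"
# A real setup would copy the actual jars here instead of touching placeholders:
touch "$PLUGIN_PATH/mongo-kafka/mongo-kafka-connect-all.jar" \
      "$PLUGIN_PATH/mongo-kafka/avro-1.10.2.jar" \
      "$PLUGIN_PATH/mongo-kafka/mongodb-driver-sync-4.2.1.jar"
ls "$PLUGIN_PATH/mongo-kafka"
```

After rearranging the jars this way, restart connect-standalone so the plugin path is rescanned.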