Spark MongoDB Connectivity Issue

I am following your lessons and courses for Spark Java development. Hope you are doing well during this Covid situation.

Sorry for the trouble, but I have been stuck on this for almost 2-3 days and need a push or some assistance to move forward.

I am using the sample code from the Git repo and trying to connect to MongoDB from Spark.

I am facing an issue when trying to read data from MongoDB.

Here are my Spark details:

I have Spark 1.6.3, which is built against Scala 2.10.5.

I am using the MongoDB Spark Connector version 1.1.0 for Scala 2.10.

Following is the dependency I used in my Maven pom.xml:

    <dependency>
        <groupId>org.mongodb.spark</groupId>
        <artifactId>mongo-spark-connector_2.10</artifactId>
        <version>1.1.0</version>
    </dependency>

I am getting the following error:

    Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/spark/MongoSpark
        at com.virtualpairprogrammers.JavaIntroduction.main(JavaIntroduction.java:27)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:750)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.ClassNotFoundException: com.mongodb.spark.MongoSpark
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 10 more

Below is the simple code I am running:

    import org.apache.hadoop.io.compress.GzipCodec;
    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.bson.Document;
    import com.mongodb.spark.MongoSpark;
    import com.mongodb.spark.rdd.api.java.JavaMongoRDD;

    SparkConf conf = new SparkConf()
            .setAppName("Load Data from Mongo DB")
            .set("spark.app.id", "MongoSparkConnectorTour")
            .set("spark.mongodb.input.uri", "mongodb://uname:prod@Host:PortNo/DB.CollectionName")
            .set("spark.mongodb.output.uri", "mongodb://uname:prod@Host:PortNo/DB.CollectionName");
    JavaSparkContext sc = new JavaSparkContext(conf);

    // Load the collection configured in spark.mongodb.input.uri as an RDD
    JavaMongoRDD<Document> customRdd = MongoSpark.load(sc);
    System.out.println("Download Completed");
    System.out.println("Count of Data downloaded " + customRdd.count());
    customRdd.saveAsTextFile("/bn_data/Testing/mongoDBData/", GzipCodec.class);

If you can give me a path forward on this, it would be very helpful. I have also tried running with the --jars flag and other options, but it still throws the same error.
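For reference, I have been trying something along these lines; the application jar path here is a placeholder from my setup, and --packages is supposed to pull the connector and its dependencies from Maven Central:

    spark-submit \
      --class com.virtualpairprogrammers.JavaIntroduction \
      --packages org.mongodb.spark:mongo-spark-connector_2.10:1.1.0 \
      /path/to/my-application.jar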

Also, I have not downloaded the MongoDB connector package into my Spark installation (I am not sure if the issue is happening because of that).

Thanks in advance.

I was able to fix this issue at last.
I built the application as a fat jar, and it is working for me without any issues.
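For anyone else who hits this, a minimal sketch of the maven-shade-plugin setup I mean (the plugin version here is just an example; pick one that suits your build):

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.4.3</version>
                <executions>
                    <execution>
                        <!-- Bundle all dependencies into the jar during mvn package -->
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

With this, mvn package produces a jar that bundles the MongoDB connector classes, so spark-submit no longer has to locate them separately.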