Is there a way to specify more than one MongoDB host in SparkConf? The examples in the docs (https://docs.mongodb.com/spark-connector/v1.1/configuration/) seem to suggest not.
spark.mongodb.output.uri=mongodb://127.0.0.1/
spark.mongodb.output.database=test
spark.mongodb.output.collection=myCollection
The spark.mongodb.input.uri and spark.mongodb.output.uri settings accept the MongoDB Connection URI format.
The connection URI format should work across all supported MongoDB drivers, including the MongoDB Spark Connector (which is built on the Scala driver). For example, to connect to a replica set you can specify:
mongodb://db1.example.net,db2.example.net:2500/?replicaSet=myReplica
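As a minimal sketch in Scala, you can put that multi-host URI straight into SparkConf; the hostnames, port, and the test.myCollection namespace below are placeholders, so substitute your own replica-set members:

```scala
import org.apache.spark.SparkConf

object MongoReplicaSetConf {
  def main(args: Array[String]): Unit = {
    // Multiple hosts are listed directly in the connection URI, comma separated.
    // Hostnames, port, and the "test.myCollection" namespace are placeholders.
    val uri = "mongodb://db1.example.net,db2.example.net:2500/test.myCollection?replicaSet=myReplica"

    val conf = new SparkConf()
      .setAppName("mongo-replica-set-example")
      .set("spark.mongodb.input.uri", uri)
      .set("spark.mongodb.output.uri", uri)

    // Pass `conf` to your SparkContext / SparkSession as usual.
    // The database and collection can also be set separately via
    // spark.mongodb.output.database / spark.mongodb.output.collection
    // instead of embedding them in the URI.
    println(conf.toDebugString)
  }
}
```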