Library version conflicts when using azure-cosmosdb-spark


I am trying to write a Spark application that connects to Cosmos DB using azure-cosmosdb-spark. However, even though I am using matching versions of Spark and Scala, I keep running into dependency conflicts. I was able to use the connector on an Azure Databricks cluster running the same Spark version, so I am somewhat lost as to what the problem is here.

I have already read the related posts on developing a Scala Spark application that connects to Azure Cosmos DB with the Spark library and on Cosmos DB library conflicts, but I still cannot solve my problem.

Here is the relevant part of my SBT configuration where I try to use the connector:

sparkVersion in ThisBuild := "2.2.0" // I also tried "2.2.1"
sparkComponents in ThisBuild += "mllib"
spIgnoreProvided in ThisBuild := true

scalaVersion in ThisBuild := "2.11.12"
parallelExecution in ThisBuild := false
scalacOptions in Compile ++= Seq("-implicits", "-feature")

lazy val root = (project in file("."))
  .aggregate(shaker, ...)
  .settings(Publish.notPublished: _*)

lazy val shaker = project
  .settings(name := "project-name")
  .settings(libraryDependencies += "com.github.pureconfig" %% "pureconfig" % "0.9.0")
  .settings(libraryDependencies += "com.github.scopt" %% "scopt" % "3.7.0")
  .settings(libraryDependencies += "com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.1.1")
  .settings(scalacOptions += "-Xmacro-settings:materialize-derivations")
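For context, my understanding is that the sbt-spark-package settings above (sparkVersion, sparkComponents, spIgnoreProvided) roughly amount to declaring the Spark artifacts as provided dependencies, something like the sketch below. This is just how I read the plugin, not a snippet from my actual build:

// Sketch only: my reading of what sparkVersion/sparkComponents resolve to.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"  % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-mllib" % "2.2.0" % "provided"
)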

When I run SBT, I get the following error:

[error] (shaker/*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.json4s:json4s-ast, org.apache.spark:spark-network-shuffle, com.twitter:chill, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-core, org.apache.spark:spark-network-common
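In case it is relevant, this is the kind of exclusion I was considering as a workaround, assuming the conflicting cross-version suffixes come from the connector's transitive Spark/json4s/chill dependencies (which I have not confirmed). I would rather understand the actual cause than exclude things blindly:

// Hypothetical workaround, not verified: strip the connector's transitive
// Spark/json4s/chill artifacts so only the versions pulled in by
// sbt-spark-package remain on the classpath.
libraryDependencies += ("com.microsoft.azure" % "azure-cosmosdb-spark_2.2.0_2.11" % "1.1.1")
  .excludeAll(
    ExclusionRule(organization = "org.apache.spark"),
    ExclusionRule(organization = "org.json4s"),
    ExclusionRule(organization = "com.twitter")
  )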

Thanks for your help!