Warnings when building a Scala/Spark project with SBT

Ser*_*nov 31 scala intellij-idea sbt apache-spark

I am trying to build a Scala/Spark project in IntelliJ IDEA with the following build.sbt:

name := "try"

version := "1.0"

scalaVersion := "2.11.8"

val sparkVersion = "2.2.0"

resolvers ++= Seq(
  "apache-snapshots" at "http://repository.apache.org/snapshots/"
)

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-hive" % sparkVersion
)

and get a bunch of warnings:

8/6/17
1:29 PM SBT project import
                [warn] Found version conflict(s) in library dependencies; some are suspected to be binary incompatible:
                [warn]  * io.netty:netty:3.9.9.Final is selected over {3.6.2.Final, 3.7.0.Final}
                [warn]      +- org.apache.spark:spark-core_2.11:2.2.0             (depends on 3.9.9.Final)
                [warn]      +- org.apache.zookeeper:zookeeper:3.4.6               (depends on 3.6.2.Final)
                [warn]      +- org.apache.hadoop:hadoop-hdfs:2.6.5                (depends on 3.6.2.Final)
                [warn]  * commons-net:commons-net:2.2 is selected over 3.1
                [warn]      +- org.apache.spark:spark-core_2.11:2.2.0             (depends on 2.2)
                [warn]      +- org.apache.hadoop:hadoop-common:2.6.5              (depends on 3.1)
                [warn]  * com.google.guava:guava:11.0.2 is selected over {12.0.1, 16.0.1}
                [warn]      +- org.apache.hadoop:hadoop-yarn-client:2.6.5         (depends on 11.0.2)
                [warn]      +- org.apache.hadoop:hadoop-yarn-api:2.6.5            (depends on 11.0.2)
                [warn]      +- org.apache.hadoop:hadoop-yarn-common:2.6.5 

I have a couple of, perhaps silly, questions:

  1. Is there a better way to structure build.sbt (adding other resolvers, for example?) so that I can get rid of the warnings?
  2. Should I care about the warnings?

Eug*_*Loy 24

Is there a better way to structure build.sbt (adding other resolvers, for example?) so that I can get rid of the warnings?

One way is to manually tell sbt which dependency versions you prefer, in your case:

dependencyOverrides ++= Set(
  "io.netty" % "netty" % "3.9.9.Final",
  "commons-net" % "commons-net" % "2.2",
  "com.google.guava" % "guava" % "11.0.2"
)
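If you are on sbt 1.x, note that dependencyOverrides is a Seq[ModuleID] rather than a Set, so the same override would look roughly like this:

// sbt 1.x sketch: dependencyOverrides is a Seq[ModuleID] there, not a Set
dependencyOverrides ++= Seq(
  "io.netty"         % "netty"       % "3.9.9.Final",
  "commons-net"      % "commons-net" % "2.2",
  "com.google.guava" % "guava"       % "11.0.2"
)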

I'd also recommend reading about conflict management in sbt.
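If you would rather have the build fail loudly on such conflicts instead of silently picking a winner, a minimal sketch using sbt's built-in conflictManager setting would be:

// Fail resolution on any version conflict instead of evicting the older version;
// every conflict must then be resolved explicitly (e.g. via dependencyOverrides).
conflictManager := ConflictManager.strict

With the strict manager, the netty/guava/commons-net conflicts above turn from warnings into resolution errors until you override them.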

Should I care about the warnings?

In your case - no, since your conflicts stem solely from using Spark-related artifacts released under the same version. Spark is a project with a large user base, and the chance of jar hell introduced through transitive dependencies is rather low (though technically not guaranteed).

In general - maybe. It is usually fine, but occasionally there are problems that need careful manual dependency resolution (if that is possible at all). In those cases it is hard to tell whether anything is wrong before you run the application and hit something like a missing class or method, a mismatched method signature, or some reflection-related issue.
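To see in advance which versions were actually picked and which were evicted, sbt's built-in evicted task is enough; for a full transitive tree you could add the sbt-dependency-graph plugin (the version below is an assumption, check the plugin's page for the one matching your sbt version):

// project/plugins.sbt -- optional, only needed for the dependencyTree task
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

Then `sbt evicted` lists every dependency that lost a version conflict, and `sbt dependencyTree` shows which artifact pulls in which netty/guava/commons-net version.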

  • @SergeyBushmanov: Use `"io.netty" % "netty" % "3.9.9.Final"`, then run `reload` and `update` in the `sbt` console. (4 upvotes)