Posted by Pav*_*lov

"Unresolved dependencies" for Spark 2.1.0 with SBT

version := "1.0"
scalaVersion := "2.11.8"
ivyScala := ivyScala.value map { _.copy(overrideScalaVersion = true) }
libraryDependencies += "org.apache.spark" %% "spark-core" % " 2.1.0"

I am trying to set up a Spark development environment. When I assemble the jar with sbt, the build fails and sbt prints the following [error]:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.11;2.1.0: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]
[warn]  Note: Unresolved dependencies path:
[warn]      org.apache.spark:spark-core_2.11:2.1.0  (D:\MyDocument\IDEA\Scala\model\build.sbt#L9-10)
[warn]        +- org.apache.spark:spark-catalyst_2.11:2.1.0
[warn]        +- org.apache.spark:spark-sql_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L15-16)
[warn]        +- org.apache.spark:spark-hive_2.11:2.1.0 (D:\MyDocument\IDEA\Scala\model\build.sbt#L11-12)
[warn]        +- default:producttagmodel_2.11:1.0
[trace] Stack trace suppressed: run 'last *:update' for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: …
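A likely cause, though not confirmed in the question itself: the version string passed to the spark-core dependency is `" 2.1.0"` with a leading space, so Ivy looks up an artifact whose version literally begins with a space and reports it as not found. A minimal sketch of a corrected `build.sbt`, assuming that is the problem:

```scala
// Sketch of a corrected build.sbt (assumption: the stray leading space
// in the version string was the cause of the resolution failure).
version := "1.0"

scalaVersion := "2.11.8"

// No whitespace inside the version string: "2.1.0", not " 2.1.0".
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
```

If the version string is already clean, the same error can also come from a blocked or misconfigured repository proxy, in which case `last *:update` (as the trace suggests) shows which resolvers were tried.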

scala intellij-idea sbt

5 votes · 1 answer · 4185 views