
Unresolved dependency error when trying to build a jar

I am trying to build a Scala jar file to run in Spark.
I am following this tutorial.
When I try to build the jar file with SBT, I run into the following error:

[info] Resolving org.apache.spark#spark-core_2.10.4;1.0.2 ...
[warn]  module not found: org.apache.spark#spark-core_2.10.4;1.0.2
[warn] ==== local: tried
[warn]   /home/hduser/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.0.2/ivys/ivy.xml
[warn] ==== Akka Repository: tried
[warn]   http://repo.akka.io/releases/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn] ==== public: tried
[warn]   http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.0.2/spark-core_2.10.4-1.0.2.pom
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::          UNRESOLVED DEPENDENCIES         ::
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  :: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[error] {file:/home/prithvi/scala/asd/}default-d57abf/*:update: sbt.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.10.4;1.0.2: not found
[error] Total time: 2 s, completed 13 Aug, 2014 5:24:24 PM

What is the problem and how can I fix it?
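Judging from the log, the dependency was declared with the full Scala version in the artifact name (spark-core_2.10.4), but Spark artifacts are published against the Scala binary version (spark-core_2.10). A minimal build.sbt sketch of the likely fix, letting sbt's %% operator append the binary version; the project name and the Akka resolver below are taken from the log and the linked tutorial, not from the asker's actual file:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

// %% appends the Scala *binary* version (2.10), so this resolves spark-core_2.10;1.0.2
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.0.2"

// repository that appears in the resolution log
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"

Equivalently, the artifact can be pinned explicitly as "org.apache.spark" % "spark-core_2.10" % "1.0.2".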


The dependency issue has been resolved, thanks to "om-nom-nom",
but a new error appeared:

[warn]  ::::::::::::::::::::::::::::::::::::::::::::::
[warn]  ::              FAILED DOWNLOADS …

scala sbt apache-spark

