I am setting up my Spark build by following http://spark.apache.org/docs/latest/building-spark.html#spark-tests-in-maven. But when I run the command "mvn -Pyarn -Phadoop-2.3 -DskipTests -Phive -Phive-thriftserver clean package", I get the following errors:
[error] bad symbolic reference. A signature in WebUI.class refers to term eclipse
[error] in package org which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling WebUI.class.
[error] bad symbolic reference. A signature in WebUI.class refers to term jetty
[error] in value org.eclipse which is not available.
[error] It may be completely missing from the current classpath, or the version on
[error] the classpath might be incompatible with the version used when compiling WebUI.class.
[error]
[error] while compiling: /download_wlh/spark-1.6.0/sql/core/src/main/scala/org/apache/spark/sql/SQLContext.scala
[error] during phase: erasure
[error] library version: version 2.10.5
[error] compiler version: version 2.10.5
Someone has asked a similar question, referring to the same kind of message: "Strange error message: bad symbolic reference. A signature in package.class refers to term apache in package org which is not available", but no solution was given there.
I ran into the same problem while trying to build a Spark distribution for Hadoop 2.4 with Java 8 and Scala 2.11.
My initial attempt was: ./make-distribution.sh --name hadoop-2.4-custom --tgz -Phadoop-2.4 -Dscala-2.11
This produced the error described in "Spark SQL 1.5 build failure".
Following the advice in that post, I ran ./dev/change-version-to-2.11.sh and then, as one of the comments suggested, left out -Dscala-2.11. That produced your exact error. When I added the scala flag back, the build passed. In short, this is everything I did to fix it:
./dev/change-version-to-2.11.sh
./make-distribution.sh --name hadoop-2.4-custom --tgz -Phadoop-2.4 -Dscala-2.11
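
If you want to apply the same fix to the Maven build from the question, the analogous sequence would presumably look like the two lines below. This is only a sketch based on the answer above: the -Dscala-2.11 flag makes sense only if you have actually switched the poms to Scala 2.11 with the script first (on some Spark versions the script is named dev/change-scala-version.sh 2.11 instead, so check what is in your dev/ directory).

./dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.3 -Dscala-2.11 -Phive -Phive-thriftserver -DskipTests clean package

The point of the answer is that the version-change script and the -Dscala-2.11 flag go together; running only one of them leaves the poms and the compiler disagreeing about the Scala version, which appears to be what produces the bad symbolic reference errors.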