Ban*_*low 10 scala sbt apache-spark
I am a beginner with Spark. I set up an environment with "Linux + IDEA + sbt", and when I tried Spark's Quick Start I ran into this problem:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
at test$.main(test.scala:11)
at test.main(test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 7 more
The versions on my machine:
sbt = 0.13.11
jdk = 1.8
scala = 2.10
idea = 2016
My directory structure:
test/
  idea/
  out/
  project/
    build.properties
    plugins.sbt
  src/
    main/
      java/
      resources/
      scala/
      scala-2.10/
        test.scala
  target/
  assembly.sbt
  build.sbt
In build.properties:
sbt.version = 0.13.8
In plugins.sbt:
logLevel := Level.Warn
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
In build.sbt:
import sbt._
import Keys._
import sbtassembly.Plugin._
import AssemblyKeys._
name := "test"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.6.1" % "provided"
In assembly.sbt:
import AssemblyKeys._ // put this at the top of the file
assemblySettings
In test.scala:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object test {
  def main(args: Array[String]) {
    val logFile = "/opt/spark-1.6.1-bin-hadoop2.6/README.md" // Should be some file on your system
    val conf = new SparkConf().setAppName("Test Application")
    val sc = new SparkContext(conf)
    val logData = sc.textFile(logFile, 2).cache()
    val numAs = logData.filter(line => line.contains("a")).count()
    val numBs = logData.filter(line => line.contains("b")).count()
    println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
  }
}
How can I solve this problem?
Ser*_*gey 15
具有"provided"范围的依赖关系仅在编译和测试期间可用,并且在运行时或打包时不可用.因此,不应该test使用a 创建一个对象,而main应该将它放在一个实际的测试套件中src/test/scala(如果您不熟悉Scala中的单元测试,我建议使用ScalaTest,例如.首先添加依赖项它在你的build.sbt中:libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % Test然后转到这个快速入门教程来实现一个简单的规范).
Another option, which in my opinion is rather hacky (but does the trick nonetheless), is to remove the provided scope from your spark-core dependency in certain configurations only; this is described in the accepted answer to this question.
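For reference, the form of that trick usually quoted for sbt 0.13 re-defines run so that it uses the full compile classpath, provided dependencies included. Treat this as a sketch to adapt, not a guaranteed drop-in:

// build.sbt: let `sbt run` see "provided" dependencies as well
run in Compile := Defaults.runTask(
  fullClasspath in Compile,
  mainClass in (Compile, run),
  runner in (Compile, run)
).evaluated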
I ran into the same error this morning with "provided". I removed "provided" and ran sbt clean, reload, compile, package, run; I also tested with spark-submit from the command line. But I think "provided" is still worth it: it is a little extra overhead in the workflow, and the jar is smaller.
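For completeness, with "provided" kept in build.sbt, the spark-submit route would look roughly like this. The jar name below is what sbt derives from name := "test" and version := "1.0" above, and the local[2] master is an illustrative choice:

sbt package
/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit \
  --class test \
  --master "local[2]" \
  target/scala-2.10/test_2.10-1.0.jar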