Bha*_*Das 5 scala sbt apache-spark
I want to build and submit a Spark program with sbt, but it throws an error.
Code:
package in.goai.spark

import org.apache.spark.{SparkContext, SparkConf}

object SparkMeApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("First Spark")
    val sc = new SparkContext(conf)
    val fileName = args(0)
    val lines = sc.textFile(fileName).cache()
    val c = lines.count()
    println(s"There are $c lines in $fileName")
  }
}
build.sbt
name := "First Spark"
version := "1.0"
organization := "in.goai"
scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
resolvers += Resolver.mavenLocal
In the first/project directory:
build.properties
sbt.version=0.13.9
When I try to run sbt package, it throws the error given below.
[root@hadoop first]# sbt package
[info] Loading project definition from /home/training/workspace_spark/first/project
[info] Set current project to First Spark (in build file:/home/training/workspace_spark/first/)
[info] Compiling 1 Scala source to /home/training/workspace_spark/first/target/scala-2.11/classes...
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:3: object apache is not a member of package org
[error] import org.apache.spark.{SparkContext, SparkConf}
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:9: not found: type SparkConf
[error] val conf = new SparkConf().setAppName("First Spark")
[error] ^
[error] /home/training/workspace_spark/first/src/main/scala/LineCount.scala:11: not found: type SparkContext
[error] val sc = new SparkContext(conf)
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 4 s, completed May 10, 2018 4:05:10 PM
I also tried with extends App, but nothing changed.
Please remove resolvers += Resolver.mavenLocal from build.sbt. Since spark-core is available on Maven Central, we don't need the local resolver.
After that, try sbt clean package.
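With that change applied, the build.sbt would look like this (a minimal sketch, using only the coordinates already given in the question):

```scala
name := "First Spark"

version := "1.0"

organization := "in.goai"

scalaVersion := "2.11.8"

// %% appends the Scala binary version, so this resolves
// spark-core_2.11 1.6.1 from Maven Central; no extra resolver is needed.
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.1"
```

If the compile errors persist after sbt clean package, it is worth checking that build.sbt sits at the project root (next to src/, not inside project/), since "object apache is not a member of package org" usually means the spark-core dependency never reached the compile classpath. Also note that project/build.properties should read sbt.version=0.13.9, not bt.version.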