Ama*_*man · 6 · scala, sbt, apache-spark, apache-spark-sql
I am writing a Scala program, built with sbt, that creates a SQLContext. Here is my build.sbt:
name := "sampleScalaProject"
version := "1.0"
scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"
Here is the test program:
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
object SqlContextSparkScala {
def main (args: Array[String]) {
val sc = SparkContext
val sqlcontext = new SQLContext(sc)
}
}
I get the following error:
Error:(8, 26) overloaded method constructor SQLContext with alternatives:
(sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and>
(sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
cannot be applied to (org.apache.spark.SparkContext.type)
val sqlcontexttest = new SQLContext(sc)
Can anyone tell me what the problem is? I am new to Scala and Spark programming.
Sha*_*ica 10
For newer versions of Spark (2.0+), use SparkSession:
val spark = SparkSession.builder.getOrCreate()
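In a standalone application the builder usually also needs an application name and a master URL; without them, `getOrCreate()` fails outside environments like `spark-shell` that preconfigure them. A minimal sketch, where the app name and the `local[*]` master are assumptions for a local test run:

```scala
import org.apache.spark.sql.SparkSession

object SparkSessionExample {
  def main(args: Array[String]): Unit = {
    // appName and local[*] master are assumed here for a local test run
    val spark = SparkSession.builder
      .appName("sampleScalaProject")
      .master("local[*]")
      .getOrCreate()

    // ... use spark here ...

    spark.stop()
  }
}
```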
SparkSession can do everything SQLContext can; if you still need a SQLContext, it can be accessed as follows:
val sqlContext = spark.sqlContext
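For the Spark 1.5.2 setup in the question, the compile error itself comes from `val sc = SparkContext`, which refers to the `SparkContext` companion object (type `SparkContext.type`) rather than an instance, so neither `SQLContext` constructor overload matches. A sketch of the corrected program, with the app name and `local[*]` master assumed for illustration:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {
  def main(args: Array[String]): Unit = {
    // Construct an actual SparkContext instance from a SparkConf;
    // `val sc = SparkContext` only names the companion object.
    val conf = new SparkConf()
      .setAppName("sampleScalaProject")
      .setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlcontext = new SQLContext(sc)

    // ... use sqlcontext here ...

    sc.stop()
  }
}
```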