hadoop apache-spark apache-spark-sql
I am trying to query a file stored in HDFS using the SQLContext provided by Apache Spark with the code below, but I get a NoSuchMethodError:
package SQL

import org.apache.spark.SparkContext
import org.apache.spark.sql._
import org.apache.spark.sql.types.{StructField, StructType, StringType}

object SparSQLCSV {
  def main(args: Array[String]) {
    val sc = new SparkContext("local[*]", "home")
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

    val people = sc.textFile("/home/devan/Documents/dataset/peoplesTest.csv")
    val delimiter = ","
    val schemaString = "a,b".split(delimiter) // csv header

    // Automated schema creation: one nullable String column per header field
    val schema = StructType(schemaString.map(fieldName => StructField(fieldName, StringType, true)))

    // Split the input into lines and turn each line into a Row of column values
    val peopleLines = people.flatMap(x => x.split("\n"))
    val rowRDD = peopleLines.map(p => Row.fromSeq(p.split(delimiter)))

    // Apply the schema to the RDD[Row] and register it as a temporary table
    val peopleSchemaRDD = sqlContext.applySchema(rowRDD, schema)
    peopleSchemaRDD.registerTempTable("people")

    sqlContext.sql("SELECT b FROM people").foreach(println)
  }
}
线程"main"中的异常java.lang.NoSuchMethodError:org.apache.spark.sql.SQLContext.applySchema(Lorg/apache/spark/rdd/RDD; Lorg/apache/spark/sql/types/StructType;)Lorg/apache /火花/ SQL /数据帧; at scalding.Main_Obj $ .main(Main_Obj.scala:34)at scalding.Main_Obj.main(Main_Obj.scala)at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java: 57)在org.apache.spark.deploy.SparkSubmit $ .launch(SparkSubmit.)的java.lang.reflect.Method.invoke(Method.java:606)的sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43). scala:358)org.apache.spark.deploy.SparkSubmit $ .main(SparkSubmit.scala:75)at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I have tried the same commands in the Spark shell and they work fine, but when I create a Scala project and try to run it, I get the above error. What am I doing wrong?
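For context, a NoSuchMethodError with this signature usually points to a mismatch between the Spark version the project is compiled against and the one used to run it (applySchema only returns a DataFrame from Spark 1.3 onward). Below is a minimal sketch of a build.sbt that pins the Spark dependencies; the version number and Scala version shown are assumptions and would need to match the Spark distribution actually used with spark-submit.

// build.sbt -- minimal sketch; the versions are assumptions and must match
// the Spark installation used to launch the job
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "1.3.0" % "provided"
)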