Error when using SparkSession or SQLContext

Asked by Muk*_*d S — tags: scala, apache-spark, apache-spark-sql, spark-dataframe

I am new to Spark. I am simply trying to parse a JSON file using SparkSession or SQLContext, but whenever I run it I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.internal.config.package$.CATALOG_IMPLEMENTATION()Lorg/apache/spark/internal/config/ConfigEntry;
    at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$sessionStateClassName(SparkSession.scala:930)
    at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:112)
    at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:110)
    at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:535)
    at org.apache.spark.sql.SparkSession.read(SparkSession.scala:595)
    at org.apache.spark.sql.SQLContext.read(SQLContext.scala:504)
    at joinAssetsAndAd$.main(joinAssetsAndAd.scala:21)
    at joinAssetsAndAd.main(joinAssetsAndAd.scala)
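Presumably the code around joinAssetsAndAd.scala:21 looks something like the sketch below. This is a hypothetical reconstruction: the stack trace only shows that line 21 calls SQLContext.read, which delegates to SparkSession.read; the object name comes from the trace, the file path is a placeholder.

import org.apache.spark.sql.SparkSession

// Hypothetical sketch of joinAssetsAndAd.scala, inferred from the stack trace.
object joinAssetsAndAd {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("joinAssetsAndAd")
      .master("local[*]")
      .getOrCreate()
    val sqlContext = spark.sqlContext

    // The NoSuchMethodError fires here: .read constructs a DataFrameReader,
    // which forces SparkSession.sessionState; with mismatched spark-core and
    // spark-sql jars on the classpath, that lookup fails to find the
    // CATALOG_IMPLEMENTATION config entry and throws.
    val assets = sqlContext.read.json("path/to/input.json")  // placeholder path
    assets.printSchema()

    spark.stop()
  }
}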

So far, I have created a Scala project in the Eclipse IDE, configured it as a Maven project, and added the Spark core and Spark SQL dependencies.

My dependencies:

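(The original dependency listing did not survive in the post; the block below is a plausible reconstruction based on the answer, which implies spark-core was on 2.1.0 while spark-sql lagged on an older 2.0.x release. The _2.11 Scala suffix and the 2.0.0 version are assumptions.)

<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.1.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <!-- assumed: an older release than spark-core, per the answer below -->
    <version>2.0.0</version>
  </dependency>
</dependencies>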

Could you explain why I am getting this error and how to fix it?

Answered by L. *_*CWI:

Try using the same version for spark-core and spark-sql. Change the spark-sql version to 2.1.0 so that it matches spark-core.
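Assuming a pom like the reconstruction above, the fix is simply to align the two versions (the _2.11 suffix remains an assumption):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.1.0</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-sql_2.11</artifactId>
  <!-- now the same version as spark-core -->
  <version>2.1.0</version>
</dependency>

Mixing Spark module versions on one classpath is what produces NoSuchMethodError here: the spark-sql jar calls an internal spark-core method that no longer exists in the newer spark-core jar.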