Post by Sau*_*abh

The spark-shell command throws this error: SparkContext: Error initializing SparkContext

Spark version: 3.2.0, Java version: 8, Python version: 3.7.3, Scala: sbt-1.5.5.msi

I followed all the steps from this guide: https://phoenixnap.com/kb/install-spark-on-windows-10
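For reference, a minimal Scala sketch (assuming the guide sets JAVA_HOME, HADOOP_HOME and SPARK_HOME, as typical Windows Spark setup guides do; that variable list is an assumption, not something stated above) to confirm those variables are actually visible to the JVM:

// EnvCheck.scala -- prints the environment variables a Windows Spark
// install usually depends on; "<not set>" means the variable is missing.
object EnvCheck extends App {
  Seq("JAVA_HOME", "HADOOP_HOME", "SPARK_HOME").foreach { name =>
    println(s"$name = ${sys.env.getOrElse(name, "<not set>")}")
  }
}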

When I run the spark-shell command from cmd, it throws the following error:

C:\spark-3.2.0-bin-hadoop3.2\bin>spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
21/11/11 00:14:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/11/11 00:14:26 ERROR SparkContext: Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
        at java.lang.reflect.Constructor.newInstance(Unknown Source)
        at org.apache.spark.executor.Executor.addReplClassLoaderIfNeeded(Executor.scala:909)
        at org.apache.spark.executor.Executor.<init>(Executor.scala:160)
        at org.apache.spark.scheduler.local.LocalEndpoint.<init>(LocalSchedulerBackend.scala:64)
        at org.apache.spark.scheduler.local.LocalSchedulerBackend.start(LocalSchedulerBackend.scala:132)
        at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:220) …
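The trace points at Executor.addReplClassLoaderIfNeeded, i.e. the REPL class loader setup that spark-shell performs. As a reference point, here is a minimal sketch (assuming spark-core 3.2.0 for Scala 2.12 on the classpath; the object and app names are placeholders) that tries to create a SparkContext outside the REPL, in local mode:

// ContextCheck.scala -- creates a SparkContext directly, bypassing spark-shell.
import org.apache.spark.{SparkConf, SparkContext}

object ContextCheck {
  def main(args: Array[String]): Unit = {
    // Run everything inside the local JVM; "context-check" is a placeholder app name.
    val conf = new SparkConf().setAppName("context-check").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // If this prints a version, SparkContext itself initializes fine and the
    // failure above is specific to spark-shell's class loader wiring.
    println(s"Spark version: ${sc.version}")
    sc.stop()
  }
}

If this runs and prints the version, the problem is not SparkContext initialization in general but the way spark-shell sets up its class loader.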

Tags: apache-spark

6 votes · 1 answer · 20k views