Post by Rah*_*ath

Spark on Windows error: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils

I downloaded the latest Apache Spark 3.2.0 along with the Hadoop binaries, and I also installed Java SE Development Kit 17.0.1.

I can't even initialize a SparkSession.

Input:

import pyspark
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
df = spark.sql('''select 'spark' as hello ''')
df.show()

Output:

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.storage.StorageUtils$
at org.apache.spark.storage.BlockManagerMasterEndpoint.<init>(BlockManagerMasterEndpoint.scala:110)
at org.apache.spark.SparkEnv$.$anonfun$create$9(SparkEnv.scala:348)
at org.apache.spark.SparkEnv$.registerOrLookupEndpoint$1(SparkEnv.scala:287)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:336)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:191)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
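For context: Spark 3.2.0 documents Java 8 and Java 11 as its supported runtimes, while Java 17 support only arrived in Spark 3.3.0, which is consistent with the `StorageUtils$` initialization failure above. Below is a small diagnostic sketch (my own helper, not part of Spark or PySpark) that checks whether a `java -version` banner falls inside Spark 3.2.0's supported range:

```python
import re

def spark_320_supported(java_version_line: str) -> bool:
    # Spark 3.2.0 documents support for Java 8 and Java 11 only;
    # Java 17 support was added later, in Spark 3.3.0.
    m = re.search(r'version "(\d+)(?:\.(\d+))?', java_version_line)
    if not m:
        return False
    major = int(m.group(1))
    if major == 1 and m.group(2):  # legacy "1.8.0_292"-style version strings
        major = int(m.group(2))
    return major in (8, 11)

print(spark_320_supported('java version "17.0.1" 2021-10-19 LTS'))   # False
print(spark_320_supported('openjdk version "11.0.13" 2021-10-19'))   # True
```

Running `java -version` in the same shell that launches PySpark and passing the first output line to this helper would show whether the JVM being picked up is one Spark 3.2.0 can run on.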

Tags: pyspark ×1
Score: 6 · Answers: 1 · Views: 10k