Post by dat*_*chu

Class org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider not found when trying to write data to an S3 bucket from Spark

I am trying to write data to an S3 bucket from my local machine:

from pyspark.sql import SparkSession

# Build the session with static S3 credentials and the S3A filesystem
spark = SparkSession.builder \
    .appName('application') \
    .config("spark.hadoop.fs.s3a.access.key", configuration.AWS_ACCESS_KEY_ID) \
    .config("spark.hadoop.fs.s3a.secret.key", configuration.AWS_ACCESS_SECRET_KEY) \
    .config("spark.hadoop.fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem") \
    .getOrCreate()

# Read the Kafka topic as an unbounded streaming DataFrame
lines = spark.readStream \
    .format('kafka') \
    .option('kafka.bootstrap.servers', kafka_server) \
    .option('subscribe', kafka_topic) \
    .option("startingOffsets", "earliest") \
    .load()

# Write the raw Kafka records to S3 as parquet files
streaming_query = lines.writeStream \
                    .format('parquet') \
                    .outputMode('append') \
                    .option('path', configuration.S3_PATH) \
                    .start()

streaming_query.awaitTermination()
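A side note on the sink itself: Spark's file sinks, parquet included, will not start without a checkpoint location unless spark.sql.streaming.checkpointLocation is set globally. A minimal sketch of the same query with one supplied; the s3a://some-bucket/checkpoints/application/ prefix is a hypothetical placeholder:

streaming_query = lines.writeStream \
    .format('parquet') \
    .outputMode('append') \
    .option('path', configuration.S3_PATH) \
    .option('checkpointLocation', 's3a://some-bucket/checkpoints/application/') \
    .start()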

Hadoop version: 3.2.1, Spark version: 3.2.1

I have added the dependency jars to pyspark's jars directory:

spark-sql-kafka-0-10_2.12:3.2.1, aws-java-sdk-s3:1.11.375, hadoop-aws:3.2.1
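As an aside, the same dependencies can be resolved from Maven at session startup via spark.jars.packages instead of being copied into pyspark's jars directory. A sketch with the coordinates above; note that S3A is normally paired with the full aws-java-sdk-bundle (1.11.375 being the version hadoop-aws 3.2.1 was built against) rather than the standalone aws-java-sdk-s3 artifact:

from pyspark.sql import SparkSession

# Sketch: let Spark fetch matching artifacts from Maven Central at startup.
# The versions assume the runtime Hadoop really is 3.2.1; adjust otherwise.
spark = SparkSession.builder \
    .appName('application') \
    .config('spark.jars.packages',
            'org.apache.spark:spark-sql-kafka-0-10_2.12:3.2.1,'
            'org.apache.hadoop:hadoop-aws:3.2.1,'
            'com.amazonaws:aws-java-sdk-bundle:1.11.375') \
    .getOrCreate()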

Executing this raises the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling o68.start.
: java.io.IOException: From option fs.s3a.aws.credentials.provider 
java.lang.ClassNotFoundException: Class 
org.apache.hadoop.fs.s3a.auth.IAMInstanceCredentialsProvider not found
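This kind of ClassNotFoundException usually points to a Hadoop version mismatch rather than a missing download: IAMInstanceCredentialsProvider only appeared in hadoop-aws 3.3.0, and it is named in the default fs.s3a.aws.credentials.provider chain of Hadoop 3.3.x, while pip-installed pyspark 3.2.1 bundles Hadoop 3.3.x client jars. A quick diagnostic sketch to confirm which Hadoop version the session actually runs on:

# Prints the Hadoop version on the JVM classpath. If it reports 3.3.x,
# hadoop-aws and the AWS SDK bundle must match 3.3.x, not 3.2.1.
print(spark.sparkContext._jvm.org.apache.hadoop.util.VersionInfo.getVersion())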
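If aligning the jar versions is not immediately possible, one hedged workaround is to pin the credentials provider explicitly so that the default chain, and with it the missing class, is never consulted. SimpleAWSCredentialsProvider exists in hadoop-aws 3.2.1 and uses exactly the static keys configured above; mixing 3.2.1 and 3.3.x jars may still break elsewhere, so matching the versions remains the cleaner fix:

# Workaround sketch: name only a provider that exists in hadoop-aws 3.2.1,
# bypassing the default chain that references IAMInstanceCredentialsProvider.
spark = SparkSession.builder \
    .appName('application') \
    .config("spark.hadoop.fs.s3a.access.key", configuration.AWS_ACCESS_KEY_ID) \
    .config("spark.hadoop.fs.s3a.secret.key", configuration.AWS_ACCESS_SECRET_KEY) \
    .config("spark.hadoop.fs.s3a.aws.credentials.provider",
            "org.apache.hadoop.fs.s3a.SimpleAWSCredentialsProvider") \
    .getOrCreate()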

hadoop amazon-s3 apache-spark spark-streaming pyspark

11 votes · 1 answer · 20k views