I want to read data from PostgreSQL over JDBC and store it in a PySpark DataFrame. When I try to preview the data with methods such as df.show() or df.take(), they fail with an error whose cause is: java.lang.ClassNotFoundException: org.postgresql.Driver. However, df.printSchema() returns the table's schema without any problem. Here is my code:
from pyspark.sql import SparkSession
spark = (
    SparkSession.builder.master("spark://spark-master:7077")
    .appName("read-postgres-jdbc")
    .config("spark.driver.extraClassPath", "/opt/workspace/postgresql-42.2.18.jar")
    .config("spark.executor.memory", "1g")
    .getOrCreate()
)
sc = spark.sparkContext
df = (
    spark.read.format("jdbc")
    .option("driver", "org.postgresql.Driver")
    .option("url", "jdbc:postgresql://postgres/postgres")
    .option("table", 'public."ASSET_DATA"')
    .option("dbtable", _select_sql)
    .option("user", "airflow")
    .option("password", "airflow")
    .load()
)
df.show(1)
Error log:
Py4JJavaError: An error occurred while calling o44.showString.
: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 …
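For context on why the symptoms differ: df.printSchema() only needs the JDBC driver on the Spark driver's classpath (which spark.driver.extraClassPath provides), while df.show() launches tasks on the executors, which never receive the jar in this setup. A minimal configuration sketch that would also make the jar available to executors, assuming either that Spark may distribute it via spark.jars or that the jar exists at the same path on every worker node:

```
# spark-defaults.conf (sketch) — make the Postgres JDBC driver visible to executors too.
# spark.jars ships the listed jars from the driver to the executors at job start:
spark.jars                     /opt/workspace/postgresql-42.2.18.jar
# Alternatively, if the jar is already present at this path on every worker node:
spark.executor.extraClassPath  /opt/workspace/postgresql-42.2.18.jar
```

The same keys can be passed as .config(...) calls on the SparkSession builder instead of editing spark-defaults.conf.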