How to write JSON column type to Postgres with PySpark?

Ric*_*ang 5 postgresql jdbc pyspark pyspark-sql

I have a Postgresql table that has a column with data type JSONB.

How do I insert DataFrame to the Postgresql table via JDBC?

If I use a UDF to convert the body column to Postgres's JSONB type, which pyspark.sql.types type should I use?

Postgresql Table with a JSONB column:

CREATE TABLE dummy (
  id bigint,
  body JSONB
);
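Since Spark has no native JSON/JSONB column type, one common approach (a sketch, not from the original post — the `rows` and `serialized` names are illustrative) is to keep `body` as a plain string column and serialize each payload with `json.dumps` before building the DataFrame:

```python
import json

# Rows destined for the "dummy" table: each body payload is an ordinary
# Python dict, serialized to a JSON string because Spark has no JSON
# type -- the column travels to JDBC as StringType.
rows = [
    (1, {"name": "alice", "tags": ["a", "b"]}),
    (2, {"name": "bob", "score": 3.5}),
]

serialized = [(i, json.dumps(body)) for i, body in rows]

# With a Spark session available, this list could then become a
# DataFrame via something like:
#   df = spark.createDataFrame(serialized, schema="id LONG, body STRING")
# before writing it out with df.write.jdbc(...). Shown as a comment so
# the sketch stays runnable without Spark installed.
```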

Thanks!

Ric*_*ang 6

It turns out that if I set `"stringtype": "unspecified"` as a JDBC property, Postgres will cast the string automatically:

    properties = {
        "user": "***",
        "password": "***",
        "stringtype": "unspecified"
    }
    df.write.jdbc(url=url, table="dummy", properties=properties)
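As a side note, the PostgreSQL JDBC driver also accepts connection parameters in the URL query string, so `stringtype` can equivalently be appended to the JDBC URL itself. A minimal sketch (the host, port, database name, and credentials below are placeholders, and the `driver` property is an assumption for a typical PySpark JDBC setup):

```python
# stringtype passed directly in the JDBC URL (placeholder host/db):
url = "jdbc:postgresql://localhost:5432/mydb?stringtype=unspecified"

# ...or as a connection property, as in the answer above:
properties = {
    "user": "***",
    "password": "***",
    "stringtype": "unspecified",
    "driver": "org.postgresql.Driver",  # assumed; often needed by Spark
}
# With a Spark session and DataFrame df in scope:
# df.write.jdbc(url=url, table="dummy", mode="append", properties=properties)
```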

  • I tried this but still got an error, though a slightly different one (complaining that the type is "character" rather than "character varying"). It seems to be caused by the presence of NULL values. (2 upvotes)