How to import AnalysisException in PySpark

Myk*_*tko 10 python exception try-catch apache-spark pyspark

I can't figure out how to import AnalysisException in PySpark so that I can catch it. For example:

df = spark.createDataFrame([[1, 2], [1, 2]], ['A', 'A'])

try:
  df.select('A')
except AnalysisException as e:
  print(e)

Error message:

NameError: name 'AnalysisException' is not defined

mck*_*mck 14

You can import it from here:

from pyspark.sql.utils import AnalysisException

This path is also shown in the error traceback, for example:

Traceback (most recent call last):
  ...
  File "<string>", line 3, in raise_from
pyspark.sql.utils.AnalysisException: cannot resolve ...
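
A minimal, self-contained sketch of putting the import together with the question's try/except (assuming an active SparkSession; the exact exception message varies by Spark version):

from pyspark.sql import SparkSession
from pyspark.sql.utils import AnalysisException

spark = SparkSession.builder.getOrCreate()

# Two columns share the name 'A', so selecting 'A' is ambiguous
df = spark.createDataFrame([[1, 2], [1, 2]], ['A', 'A'])

try:
    df.select('A')
except AnalysisException as e:
    # With the import in place, the exception is caught here
    # instead of raising NameError on the except clause
    print(e)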