A previous question suggests sc.applicationId, but it is not available in PySpark, only in Scala.
So, how can I figure out the application ID (for yarn) of my PySpark process?
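
For reference, a minimal sketch of what should work, assuming Spark 1.5 or later, where applicationId is exposed on the PySpark SparkContext as well; on older versions the underlying Java SparkContext can be reached through py4j (sc._jsc is an internal attribute, so treat that path as a workaround rather than a public API):

    from pyspark import SparkConf, SparkContext

    # "app-id-demo" is a hypothetical application name for this example.
    conf = SparkConf().setAppName("app-id-demo")
    sc = SparkContext(conf=conf)

    # Spark 1.5+: applicationId is available directly on the PySpark context.
    # On YARN it looks like application_1433865536131_34483.
    print(sc.applicationId)

    # Older PySpark versions: go through py4j to the Java SparkContext.
    print(sc._jsc.sc().applicationId())
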
hadoop-yarn apache-spark pyspark