Confusion between SparkSession and SparkContext

Kra*_*tos 5 python save apache-spark apache-spark-mllib

I have a pyspark 2.0.0 script that defines the following session:

spark = SparkSession \
    .builder \
    .appName("Python Spark") \
    .master("local[*]")\
    .config("spark.some.config.option", "some-value") \
    .getOrCreate()

I trained a random forest model that I want to save, so I call the following method:

model_rf.save( spark, "/home/Desktop")

but it throws the following error:

TypeError: sc should be a SparkContext, got type <class 'pyspark.sql.session.SparkSession'>

When I instead define a SparkContext, like this:

from pyspark import SparkContext
sc = SparkContext()
model_rf.save( sc, "/home/Desktop")

I get the following error:

Cannot run multiple SparkContexts at once; existing SparkContext(app=Python Spark, master=local[*]) created by getOrCreate at <ipython-input-1-c5f83810f880>:24 

mrs*_*vas 6

Use spark.sparkContext: the SparkSession object already carries a SparkContext, exposed as its sparkContext attribute.

model_rf.save(spark.sparkContext, "/home/Desktop")
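
There is no need for a second SparkContext at all: the session created with getOrCreate() already owns one. As a small illustrative sketch (not from the question), SparkContext.getOrCreate() would also hand back that same existing context instead of raising the "multiple SparkContexts" error:

from pyspark import SparkContext
from pyspark.sql import SparkSession

spark = SparkSession \
    .builder \
    .appName("Python Spark") \
    .master("local[*]") \
    .getOrCreate()

# The session already wraps a context; reuse it rather than building a new one.
sc = spark.sparkContext

# getOrCreate() returns the active context instead of failing with
# "Cannot run multiple SparkContexts at once".
print(sc is SparkContext.getOrCreate())  # True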
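
For completeness, the question never shows how model_rf is built. A minimal end-to-end sketch, assuming an RDD-based pyspark.mllib random forest (which is what the TypeError points to) and a placeholder output path /tmp/model_rf, might look like this:

from pyspark.sql import SparkSession
from pyspark.mllib.regression import LabeledPoint
from pyspark.mllib.tree import RandomForest, RandomForestModel

spark = SparkSession \
    .builder \
    .appName("Python Spark") \
    .master("local[*]") \
    .getOrCreate()
sc = spark.sparkContext

# Toy training data; replace with the real dataset.
data = sc.parallelize([
    LabeledPoint(0.0, [0.0, 1.0]),
    LabeledPoint(0.0, [0.5, 1.5]),
    LabeledPoint(1.0, [1.0, 0.0]),
    LabeledPoint(1.0, [1.5, 0.5]),
])

model_rf = RandomForest.trainClassifier(
    data, numClasses=2, categoricalFeaturesInfo={}, numTrees=3, seed=42)

# mllib models are saved with (SparkContext, path) and loaded the same way.
model_rf.save(sc, "/tmp/model_rf")
loaded = RandomForestModel.load(sc, "/tmp/model_rf")

One more thing to watch: save() writes a directory at the given path and fails if that directory already exists, so point it at a fresh location rather than an existing folder.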