Post by Rod*_*tos

IllegalArgumentException when using Spark collect() on Jupyter

I have a setup with Jupyter 4.3.0, Python 3.6.3 (Anaconda), and PySpark 2.2.1.

The following example fails when run through Jupyter:

from pyspark import SparkContext

sc = SparkContext.getOrCreate()

rdd = sc.parallelize(['A', 'B', 'C'])
rdd.collect()

Below is the full stack trace:

---------------------------------------------------------------------------
Py4JJavaError                             Traceback (most recent call last)
<ipython-input-35-0d4a2ca9edf4> in <module>()
      2 
      3 rdd = sc.parallelize(['A','B','C'])
----> 4 rdd.collect()

/usr/local/Cellar/apache-spark/2.2.1/libexec/python/pyspark/rdd.py in collect(self)
    807         """
    808         with SCCallSiteSync(self.context) as css:
--> 809             port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
    810         return list(_load_from_socket(port, self._jrdd_deserializer))
    811 

/usr/local/Cellar/apache-spark/2.2.1/libexec/python/lib/py4j-0.10.4-src.zip/py4j/java_gateway.py in __call__(self, *args)
   1131         answer = self.gateway_client.send_command(command)
   1132         return_value = get_return_value(
-> 1133             answer, self.gateway_client, self.target_id, self.name)
   1134 
   1135         for temp_arg in temp_args:

/usr/local/Cellar/apache-spark/2.2.1/libexec/python/pyspark/sql/utils.py in deco(*a, **kw)
     61 …
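For context, one frequent cause of this kind of Py4JJavaError from `collect()` on Homebrew installs of Spark 2.2.x is running it on Java 9 or newer, which Spark 2.2 does not support (it requires Java 8). This is an assumption, since the truncated trace does not show the underlying Java exception. A minimal sketch of pinning Java 8 before launching Jupyter, using the macOS `java_home` helper (the path resolution is an assumption; adjust for your machine):

```shell
# Spark 2.2.x supports only Java 8. If `java -version` reports 9 or newer,
# point JAVA_HOME at a Java 8 install before starting Jupyter.
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"   # macOS-specific helper
export PATH="$JAVA_HOME/bin:$PATH"
jupyter notebook
```

If the exception persists under Java 8, re-running the snippet in a plain `pyspark` shell helps isolate whether the problem is Jupyter-specific.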

pyspark jupyter python-3.6

Score: 4 · 1 answer · 4,932 views
