When I try to collect my RDD, I get the error below. It started after I installed Java 10.1, so naturally I removed it and reinstalled it; same error. I then installed Java 9.04: same error. I then ripped out Python 2.7.14, Apache Spark 2.3.0, and Hadoop 2.7 and reinstalled them: same error. Does anyone know of any other reason why I keep getting this error?
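For reference, here is a minimal sketch (not from the original post) of how to print the interpreter and JVM settings PySpark will actually use; `JAVA_HOME` determines which JVM the py4j gateway launches, so confirming it points where you expect can help narrow down which install is being picked up:

```python
import os
import sys

def spark_env_report():
    """Return the Python version and JVM-related env vars PySpark relies on.

    JAVA_HOME decides which JVM the py4j gateway starts; SPARK_HOME
    decides which Spark install the pyspark package binds to. Either
    being unset or pointing at an unexpected install is a common source
    of collect() failures.
    """
    return {
        "python": sys.version.split()[0],
        "java_home": os.environ.get("JAVA_HOME", "<not set>"),
        "spark_home": os.environ.get("SPARK_HOME", "<not set>"),
    }

print(spark_env_report())
```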
>>> from operator import add
>>> from pyspark import SparkConf, SparkContext
>>> import string
>>> import sys
>>> import re
>>>
>>> sc = SparkContext(appName="NEW")
2018-04-21 22:28:45 WARN Utils:66 - Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
>>> rdd = sc.parallelize(xrange(1,10))
>>> new = rdd.collect()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\pyspark\rdd.py", line 824, in collect
port = self.ctx._jvm.PythonRDD.collectAndServe(self._jrdd.rdd())
File "C:\spark\spark-2.3.0-bin-hadoop2.7\python\lib\py4j-0.10.6-src.zip\py4j\java_gateway.py", line 1160, in …