conda create -y -n py38 python=3.8
conda activate py38
pip install pyspark
# Successfully installed py4j-0.10.7 pyspark-2.4.5
python -c "import pyspark"
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/Users/dmitrii_deriabin/anaconda3/envs/py38/lib/python3.8/site-packages/pyspark/__init__.py", line 51, in <module>
from pyspark.context import SparkContext
File "/Users/dmitrii_deriabin/anaconda3/envs/py38/lib/python3.8/site-packages/pyspark/context.py", line 31, in <module>
from pyspark import accumulators
File "/Users/dmitrii_deriabin/anaconda3/envs/py38/lib/python3.8/site-packages/pyspark/accumulators.py", line 97, in <module>
from pyspark.serializers import read_int, PickleSerializer
File "/Users/dmitrii_deriabin/anaconda3/envs/py38/lib/python3.8/site-packages/pyspark/serializers.py", line 72, in <module>
from pyspark import cloudpickle
File "/Users/dmitrii_deriabin/anaconda3/envs/py38/lib/python3.8/site-packages/pyspark/cloudpickle.py", line 145, in <module>
_cell_set_template_code = _make_cell_set_template_code()
File "/Users/dmitrii_deriabin/anaconda3/envs/py38/lib/python3.8/site-packages/pyspark/cloudpickle.py", line 126, in _make_cell_set_template_code
return types.CodeType(
TypeError: an integer is required (got type bytes)
It seems that PySpark ships with a pre-packaged copy of the cloudpickle package that has problems on Python 3.8. The issue has since been fixed in the standalone pip release of cloudpickle (at least as of version 1.3.0), but the copy bundled with PySpark is still affected. Has anyone run into the same problem / had any luck resolving it?
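For context, here is a minimal sketch (the helper name is made up for illustration, not pyspark's actual code) of the kind of types.CodeType reconstruction the bundled cloudpickle performs in _make_cell_set_template_code, and why the 3.7-era positional call breaks on Python 3.8, where the constructor gained a posonlyargcount parameter:

import sys
import types

def rebuild_code_with_name(fn, new_name):
    # Hypothetical helper: rebuild a function's code object with a new co_name,
    # mirroring the CodeType reconstruction done by the bundled cloudpickle.
    code = fn.__code__
    if sys.version_info >= (3, 8):
        # Python 3.8 inserted posonlyargcount after argcount, so the old
        # 3.7-style positional call shifts every later argument by one and
        # co_code (bytes) ends up where flags (an int) is expected -- hence
        # "TypeError: an integer is required (got type bytes)".
        # CodeType.replace() (new in 3.8) avoids the constructor entirely.
        return code.replace(co_name=new_name)
    return types.CodeType(
        code.co_argcount, code.co_kwonlyargcount, code.co_nlocals,
        code.co_stacksize, code.co_flags, code.co_code, code.co_consts,
        code.co_names, code.co_varnames, code.co_filename, new_name,
        code.co_firstlineno, code.co_lnotab, code.co_freevars,
        code.co_cellvars,
    )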