Den*_*lll 53 · tags: python, attributeerror, pandas, apache-spark, pyspark
I am running pyspark on AWS EMR (4 r5.xlarge instances as 4 workers, each worker with 1 executor and 4 cores), and I get AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'. Below is the code snippet that raises this error:
import sqlite3

import numpy as np
import pandas as pd
from pyspark.sql.functions import col, udf
from uszipcode import SearchEngine  # assumed: SearchEngine comes from the uszipcode package

# Make sure the zipcode database exists under /tmp/db, then load the needed columns into pandas
search = SearchEngine(db_file_dir="/tmp/db")
conn = sqlite3.connect("/tmp/db/simple_db.sqlite")
pdf_ = pd.read_sql_query('''select zipcode, lat, lng,
    bounds_west, bounds_east, bounds_north, bounds_south from
    simple_zipcode''', conn)
# Broadcast the pandas DataFrame to the executors
brd_pdf = spark.sparkContext.broadcast(pdf_)
conn.close()

@udf('string')
def get_zip_b(lat, lng):
    # .value unpickles the broadcast DataFrame on the executor (line 102 in the traceback below)
    pdf = brd_pdf.value
    out = pdf[(np.array(pdf["bounds_north"]) >= lat) &
              (np.array(pdf["bounds_south"]) <= lat) &
              (np.array(pdf['bounds_west']) <= lng) &
              (np.array(pdf['bounds_east']) >= lng)]
    if len(out):
        # Pick the zipcode whose centroid is closest to the point
        min_index = np.argmin((np.array(out["lat"]) - lat)**2 + (np.array(out["lng"]) - lng)**2)
        zip_ = str(out["zipcode"].iloc[min_index])
    else:
        zip_ = 'bad'
    return zip_

df = df.withColumn('zipcode', get_zip_b(col("latitude"), col("longitude")))
Below is the traceback, where line 102 in get_zip_b refers to pdf = brd_pdf.value:
21/08/02 06:18:19 WARN TaskSetManager: Lost task 12.0 in stage 7.0 (TID 1814, ip-10-22-17-94.pclc0.merkle.local, executor 6): org.apache.spark.api.python.PythonException: Traceback (most recent call last):
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 605, in main
process()
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 597, in process
serializer.dump_stream(out_iter, outfile)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/serializers.py", line 223, in dump_stream
self.serializer.dump_stream(self._batched(iterator), stream)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/serializers.py", line 141, in dump_stream
for obj in iterator:
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/serializers.py", line 212, in _batched
for item in iterator:
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 450, in mapper
result = tuple(f(*[a[o] for o in arg_offsets]) for (arg_offsets, f) in udfs)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 450, in <genexpr>
result = tuple(f(*[a[o] for o in arg_offsets]) for (arg_offsets, f) in udfs)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/worker.py", line 90, in <lambda>
return lambda *a: f(*a)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/util.py", line 121, in wrapper
return f(*args, **kwargs)
File "/mnt/var/lib/hadoop/steps/s-1IBFS0SYWA19Z/Mobile_ID_process_center.py", line 102, in get_zip_b
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/broadcast.py", line 146, in value
self._value = self.load_from_path(self._path)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/broadcast.py", line 123, in load_from_path
return self.load(f)
File "/mnt/yarn/usercache/hadoop/appcache/application_1627867699893_0001/container_1627867699893_0001_01_000009/pyspark.zip/pyspark/broadcast.py", line 129, in load
return pickle.load(file)
AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks' from '/mnt/miniconda/lib/python3.9/site-packages/pandas/core/internals/blocks.py'>
Some observations and my thought process:

1. From searching online, the AttributeError in pyspark appears to be caused by mismatched pandas versions between the driver and the workers? (A quick way to check this is sketched after this list.)
2. However, I ran the same code on two different datasets: one job finished without any error while the other did not, which seems very strange and non-deterministic, and suggests the error may not be caused by mismatched pandas versions; otherwise neither dataset should have succeeded.
3. I then ran the same code again on the dataset that had succeeded, this time with a different Spark configuration: spark.driver.memory raised from 2048M to 4192M, and it threw the AttributeError.
4. Putting this together, I think the AttributeError is related to the driver, but I cannot tell from the error message how they are related or how to fix it: AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks'.
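To check point 1 directly, here is a minimal sketch that prints the pandas version seen by the driver and the versions reported by the executor Python processes; it assumes an active SparkSession named spark, as in the snippet above:

import pandas as pd

# pandas version seen by the driver process
print("driver pandas:", pd.__version__)

# pandas versions seen by the executor Python processes
def worker_pandas_version(_):
    import pandas  # imported inside the task so it resolves on the executor
    return pandas.__version__

executor_versions = (
    spark.sparkContext
         .parallelize(range(100), numSlices=8)  # spread tasks across executors
         .map(worker_pandas_version)
         .distinct()
         .collect()
)
print("executor pandas:", executor_versions)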
Sai*_*ibō 77
The pandas version used to dump the pickle (dump_version, probably 1.3.x) is not compatible with the pandas version used to load the pickle (load_version, probably 1.2.x). To solve the problem, either upgrade the pandas version in the loading environment (load_version) to 1.3.x and load the pickle again, or downgrade the pandas version used for dumping (dump_version) to 1.2.x and re-dump a new pickle; the new pickle can then be loaded with pandas 1.2.x.
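In the PySpark setup from the question, this means making the driver and the executors use the same pandas version. Separately from aligning versions, one workaround is to broadcast plain numpy arrays instead of the pandas DataFrame, so the broadcast pickle no longer references pandas internals. A minimal sketch, assuming the same pdf_, spark, and UDF logic as in the question:

import numpy as np
from pyspark.sql.functions import udf

# Broadcast a dict of plain numpy arrays instead of the DataFrame itself;
# unpickling these on the executors does not touch pandas.core.internals,
# so driver/executor pandas versions no longer have to match for the broadcast.
cols = ["zipcode", "lat", "lng",
        "bounds_west", "bounds_east", "bounds_north", "bounds_south"]
brd_arrays = spark.sparkContext.broadcast({c: pdf_[c].to_numpy() for c in cols})

@udf('string')
def get_zip_b(lat, lng):
    d = brd_arrays.value
    mask = ((d["bounds_north"] >= lat) & (d["bounds_south"] <= lat) &
            (d["bounds_west"] <= lng) & (d["bounds_east"] >= lng))
    if not mask.any():
        return 'bad'
    idx = np.flatnonzero(mask)
    best = idx[np.argmin((d["lat"][idx] - lat)**2 + (d["lng"][idx] - lng)**2)]
    return str(d["zipcode"][best])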
This has nothing to do with PySpark itself.

The issue is about compatibility between pandas versions 1.2.x and 1.3.x. In version 1.2.5 and before, pandas used the variable name new_blocks in the module pandas.core.internals.blocks (cf. source code v1.2.5). On 2 July 2021, pandas released version 1.3.0. In this update, pandas changed the API: the variable name new_blocks in the module pandas.core.internals.blocks was changed to new_block (cf. source code v1.3.0).
This change to the API causes two incompatibility errors:

AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks' from '.../site-packages/pandas/core/internals/blocks.py'>

Python throws this error to complain that it cannot find the attribute new_block on the current pandas.core.internals.blocks, because pickle must use exactly the same class to load an object as was used to dump it.
This is exactly your case: the pickle was dumped with pandas v1.3.x and you are trying to load it with pandas v1.2.x. The error can be reproduced as follows: first dump a pickle of a DataFrame with pandas 1.3.4, then try to load it with pandas 1.2.5.
# Environment that dumps the pickle
pip install --upgrade pandas==1.3.4

import pickle
import numpy as np
import pandas as pd

# Dump a DataFrame with pandas 1.3.4
df = pd.DataFrame(np.random.rand(3, 6))
with open("dump_from_v1.3.4.pickle", "wb") as f:
    pickle.dump(df, f)
quit()
# Environment that loads the pickle
pip install --upgrade pandas==1.2.5

import pickle
# Loading the 1.3.x pickle with pandas 1.2.5 fails
with open("dump_from_v1.3.4.pickle", "rb") as f:
    df = pickle.load(f)
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-2-ff5c218eca92> in <module>
1 with open("dump_from_v1.3.4.pickle", "rb") as f:
----> 2 df = pickle.load(f)
3
AttributeError: Can't get attribute 'new_block' on <module 'pandas.core.internals.blocks' from '/opt/anaconda3/lib/python3.7/site-packages/pandas/core/internals/blocks.py'>
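To see where the dependency on new_block comes from, you can disassemble the pickle byte stream with the standard pickletools module; it only reads the bytes and does not import pandas. A small sketch, assuming the dump_from_v1.3.4.pickle file created above:

import pickletools

# The STACK_GLOBAL / GLOBAL opcodes list the module attributes that
# pickle.load() will look up at load time; for a DataFrame dumped with
# pandas 1.3.x they include 'pandas.core.internals.blocks' 'new_block'.
with open("dump_from_v1.3.4.pickle", "rb") as f:
    pickletools.dis(f.read())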
小智 7
I ran into the same AttributeError in a situation like the following: