hun*_*nch 6 python arrays parallel-processing numpy
I define a function in Python; the program file itself is named abc_d.py. I don't understand whether I can import the same file into itself again.
import numpy as np
import matplotlib.pyplot as plt
import sys
import multiprocessing

num_processor = 4
pool = multiprocessing.Pool(num_processor)

def abc(data):
    w = np.dot(data.reshape(25, 1), data.reshape(1, 25))
    return w

data_final = np.array(range(100))
n = 100
error = []
k_list = [50, 100, 500, 1000, 2000]
for k in k_list:
    dict_data = {}
    for d_set in range(num_processor):
        dict_data[d_set] = data_final[int(d_set*n/4):int((d_set+1)*n/4)]
        if d_set == num_processor - 1:
            dict_data[d_set] = data_final[int(d_set*n/4):]
    tasks = dict_data
    results_w = [pool.apply_async(abc, dict_data[t]) for t in range(num_processor)]
    w_f = []
    for result in results_w:
        w_s = result.get()
        w_f.append(w_s.tolist())
    w_f = np.array(w_f)
    print(w_f)
Here tasks is a dictionary of arrays.
Can anyone explain this error? I am still fairly new to Python. The error is:
Process ForkPoolWorker-1:
Process ForkPoolWorker-2:
Process ForkPoolWorker-3:
Process ForkPoolWorker-4:
Traceback (most recent call last):
Traceback (most recent call last):
File "/home/anaconda3/lib/python3.5/multiprocessing/process.py", line 254, in _bootstrap
self.run()
File "/home/anaconda3/lib/python3.5/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
File "/home/anaconda3/lib/python3.5/multiprocessing/pool.py", line 108, in worker
task = get()
File "/home/anaconda3/lib/python3.5/multiprocessing/queues.py", line 345, in get
return ForkingPickler.loads(res)
File "/home/anaconda3/lib/python3.5/multiprocessing/process.py", line 254, in _bootstrap
self.run()
File "/home/anaconda3/lib/python3.5/multiprocessing/process.py", line 93, in run
self._target(*self._args, **self._kwargs)
AttributeError: Can't get attribute 'abc' on <module '__main__' from 'abc_d.py'>
mnk*_*00n 12
This error is thrown when you create the pool before defining the function you want to run in parallel. Reverse the order and the error goes away. There is also a bug in the way you call apply_async: you pass dict_data[t] directly as the argument sequence, which unpacks the 25-element array into 25 separate positional arguments. Wrap it in a list so that abc receives the whole chunk as a single argument. With those two changes the code returns results.
import numpy as np
import matplotlib.pyplot as plt
import sys
import multiprocessing

num_processor = 4

def abc(data):
    w = np.dot(data.reshape(25, 1), data.reshape(1, 25))
    return w

# Create the pool only after abc is defined, so the worker processes can find it.
pool = multiprocessing.Pool(num_processor)

data_final = np.array(range(100))
n = 100
error = []
k_list = [50, 100, 500, 1000, 2000]
for k in k_list:
    dict_data = {}
    for d_set in range(num_processor):
        dict_data[d_set] = data_final[int(d_set*n/4):int((d_set+1)*n/4)]
        if d_set == num_processor - 1:
            dict_data[d_set] = data_final[int(d_set*n/4):]
    tasks = dict_data
    # Wrap the chunk in a list so apply_async passes it to abc as one argument.
    results_w = [pool.apply_async(abc, [dict_data[t]]) for t in range(num_processor)]
    w_f = []
    for result in results_w:
        w_s = result.get()
        w_f.append(w_s.tolist())
    w_f = np.array(w_f)
    print(w_f)
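For completeness, here is a minimal sketch of an alternative pattern, assuming the same 100-element array split into four chunks of 25: define abc at module level, create the pool inside an if __name__ == '__main__': guard, and let pool.map hand one chunk to each worker. The guard also keeps the code working on platforms that start workers with spawn rather than fork, and the with block closes and joins the pool automatically.

import multiprocessing

import numpy as np

num_processor = 4
n = 100

def abc(data):
    # Outer product of a 25-element chunk with itself -> 25x25 matrix
    return np.dot(data.reshape(25, 1), data.reshape(1, 25))

if __name__ == '__main__':
    data_final = np.array(range(100))
    # Split data_final into num_processor chunks of n / num_processor elements each
    chunks = [data_final[int(i * n / num_processor):int((i + 1) * n / num_processor)]
              for i in range(num_processor)]
    # The context manager closes and joins the pool when the block exits
    with multiprocessing.Pool(num_processor) as pool:
        w_f = np.array(pool.map(abc, chunks))
    print(w_f.shape)  # (4, 25, 25)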