Nis*_*ant 20 python multiprocessing
In Python, I have seen many examples of multiprocessing, but the target only ever prints something. I have a scenario where the target function returns two variables that I need to use later. For example:
def foo(some_args):
    a = someObject
    b = someObject
    return a, b

p1 = multiprocessing.Process(target=foo, args=(some_args,))
p2 = multiprocessing.Process(target=foo, args=(some_args,))
p3 = multiprocessing.Process(target=foo, args=(some_args,))
Now what? I can .start() and .join() them, but how do I retrieve the individual results? I need to capture the returned a, b for every job I run, and then carry on working with them.
Mik*_*rns 27
You are looking to do some embarrassingly parallel work across multiple processes... so why not use a Pool? A Pool will take care of starting the processes, retrieving the results, and returning them to you. Here I use pathos, which has a fork of multiprocessing, because it provides much better serialization than the version in the standard library.
from pathos.multiprocessing import ProcessingPool as Pool

def foo(obj1, obj2):
    a = obj1.x**2
    b = obj2.x**2
    return a, b

class Bar(object):
    def __init__(self, x):
        self.x = x

res = Pool().map(foo, [Bar(1), Bar(2), Bar(3)], [Bar(4), Bar(5), Bar(6)])
You can see that foo takes two arguments and returns a tuple of two objects. The map method of Pool submits foo to the underlying processes and returns the result as res.
You can get pathos here: https://github.com/uqfoundation
Eli*_*sky 20
Yes, of course - there are several ways to do it. One of the easiest is a shared Queue. See an example here: http://eli.thegreenplace.net/2012/01/16/python-parallelizing-cpu-bound-tasks-with-multiprocessing/
I am copying this example straight from the docs because I can't link to it directly. Note that it prints the results from done_queue, but you can do whatever you like with them.
#
# Simple example which uses a pool of workers to carry out some tasks.
#
# Notice that the results will probably not come out of the output
# queue in the same in the same order as the corresponding tasks were
# put on the input queue. If it is important to get the results back
# in the original order then consider using `Pool.map()` or
# `Pool.imap()` (which will save on the amount of code needed anyway).
#
# Copyright (c) 2006-2008, R Oudkerk
# All rights reserved.
#
import time
import random
from multiprocessing import Process, Queue, current_process, freeze_support
#
# Function run by worker processes
#
def worker(input, output):
    for func, args in iter(input.get, 'STOP'):
        result = calculate(func, args)
        output.put(result)

#
# Function used to calculate result
#
def calculate(func, args):
    result = func(*args)
    return '%s says that %s%s = %s' % \
        (current_process().name, func.__name__, args, result)

#
# Functions referenced by tasks
#
def mul(a, b):
    time.sleep(0.5*random.random())
    return a * b

def plus(a, b):
    time.sleep(0.5*random.random())
    return a + b

#
#
#
def test():
    NUMBER_OF_PROCESSES = 4
    TASKS1 = [(mul, (i, 7)) for i in range(20)]
    TASKS2 = [(plus, (i, 8)) for i in range(10)]

    # Create queues
    task_queue = Queue()
    done_queue = Queue()

    # Submit tasks
    for task in TASKS1:
        task_queue.put(task)

    # Start worker processes
    for i in range(NUMBER_OF_PROCESSES):
        Process(target=worker, args=(task_queue, done_queue)).start()

    # Get and print results
    print('Unordered results:')
    for i in range(len(TASKS1)):
        print('\t', done_queue.get())

    # Add more tasks using `put()`
    for task in TASKS2:
        task_queue.put(task)

    # Get and print some more results
    for i in range(len(TASKS2)):
        print('\t', done_queue.get())

    # Tell child processes to stop
    for i in range(NUMBER_OF_PROCESSES):
        task_queue.put('STOP')

if __name__ == '__main__':
    freeze_support()
    test()
It originally comes from the multiprocessing module docs.
Why does nobody use the callback of multiprocessing.Pool?
Example:
import time

from multiprocessing import Pool
from contextlib import contextmanager
from pprint import pprint
from requests import get as get_page

WORKERS = 4                    # example value; pick your own pool size
URLS = ['http://example.com']  # example value; your list of pages

@contextmanager
def _terminating(thing):
    try:
        yield thing
    finally:
        thing.terminate()

def _callback(*args, **kwargs):
    print("CALLBACK")
    pprint(args)
    pprint(kwargs)

print("Processing...")
with _terminating(Pool(processes=WORKERS)) as pool:
    results = pool.map_async(get_page, URLS, callback=_callback)

    start_time = time.time()
    results.wait()
    end_time = time.time()

print("Time for Processing: %ssecs" % (end_time - start_time))
Here I printed the args and kwargs. But you can replace the callback with, for example:
def _callback2(responses):
    for r in responses:
        print(r.status_code)  # or do whatever with the responses...