Posted by Mad*_*olf

Using multiprocessing and multithreading to speed up asyncio's "send 100,000 requests"

First, I want to send many requests over a single connection as fast as possible. The code below already runs well and fast, but I would like to push it beyond plain asyncio. So my question: is it possible to run this in parallel using multithreading or multiprocessing? I have heard that ThreadPoolExecutor or ProcessPoolExecutor can be used for this.

import random
import asyncio
from aiohttp import ClientSession
import time
from concurrent.futures import ProcessPoolExecutor


async def fetch(sem, url, session):
    # limit concurrency with the semaphore, reuse the shared session
    async with sem:
        async with session.get(url) as response:
            return await response.read()


async def run(r):
    url = "http://www.example.com/"
    tasks = []
    sem = asyncio.Semaphore(1000)
    async with ClientSession() as session:
        for i in range(r):
            task = asyncio.ensure_future(fetch(sem, url.format(i), session))  # schedule a task
            tasks.append(task)
        responses = asyncio.gather(*tasks)
        await responses


if __name__ == "__main__":
    number = 10000
    loop = asyncio.get_event_loop()
    start = time.time()
    loop.run_until_complete(run(number))
    end = time.time()  # the rest of the snippet was truncated in the original post
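One commonly suggested pattern for the ProcessPoolExecutor side of the question is to give every worker process its own event loop and its own ClientSession, and to split the total request count evenly between the workers. Below is a minimal sketch of that idea; run_batch, the worker count of 4, and the even split are illustrative assumptions, not anything from the original post.

import asyncio
import time
from concurrent.futures import ProcessPoolExecutor

from aiohttp import ClientSession


async def fetch(sem, url, session):
    async with sem:
        async with session.get(url) as response:
            return await response.read()


async def run(r):
    # same coroutines as in the question, repeated here so the sketch is self-contained
    url = "http://www.example.com/"
    sem = asyncio.Semaphore(1000)
    async with ClientSession() as session:
        tasks = [asyncio.ensure_future(fetch(sem, url, session)) for _ in range(r)]
        return await asyncio.gather(*tasks)


def run_batch(r):
    # each worker process runs its own event loop and its own ClientSession
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    try:
        responses = loop.run_until_complete(run(r))
        return len(responses)  # return only a count so large payloads are not pickled back
    finally:
        loop.close()


if __name__ == "__main__":
    total = 10000
    workers = 4  # hypothetical process count, tune for your machine
    start = time.time()
    with ProcessPoolExecutor(max_workers=workers) as executor:
        counts = list(executor.map(run_batch, [total // workers] * workers))
    print("fetched", sum(counts), "responses in", time.time() - start, "seconds")

Whether this actually beats a single event loop depends on where the bottleneck is: if the work is dominated by network latency, one asyncio loop is usually enough, and extra processes mainly help when response parsing or other CPU-bound work becomes significant.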

python multithreading python-asyncio aiohttp
