How to use Python coroutines to implement concurrent requests?

For example, there are 10,000 images, each of which contains a lot of text. You need to call an OCR API to extract the text from each image, and the API's response time is about 2 s.
How can Python coroutines be used to send these requests concurrently?

Jan. 25, 2022

import aiohttp
import asyncio
import aiofiles


async def foo(session, filename):
    # Read the image file without blocking the event loop
    async with aiofiles.open(filename, 'rb') as f:
        content = await f.read()

    # Post the image bytes to the API and read the response body
    async with session.post(url='http://httpbin.org/post', data=content) as resp:
        return await resp.text()


async def main(filenames):
    # Share one session for all requests and run the coroutines concurrently
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(
            *(foo(session, filename) for filename in filenames)
        )


if __name__ == "__main__":
    filenames = list()  # fill in the paths of the images to process
    asyncio.run(main(filenames))
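
With 10,000 images it is usually unwise to fire all requests at once; the OCR service may throttle or reject you. A common pattern is to cap the number of in-flight requests with asyncio.Semaphore. Here is a minimal sketch, assuming the foo coroutine defined above; the CONCURRENCY value of 100 is an arbitrary example, tune it for the API.

import asyncio
import aiohttp

CONCURRENCY = 100  # assumed cap on simultaneous requests; adjust for the API


async def bounded_foo(semaphore, session, filename):
    # The semaphore lets at most CONCURRENCY coroutines past this point at a time
    async with semaphore:
        return await foo(session, filename)


async def main_bounded(filenames):
    semaphore = asyncio.Semaphore(CONCURRENCY)
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(
            *(bounded_foo(semaphore, session, filename) for filename in filenames)
        )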

Of course, aiohttp is not the only option; gevent is also a good choice, and it works on Python 2 as well (see the sketch below).
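
For reference, a rough gevent equivalent might look like the following. It monkey-patches the standard library so that blocking network calls made with the requests library yield to other greenlets while waiting; the URL and helper names are illustrative, not part of the original answer.

import gevent
from gevent import monkey
monkey.patch_all()  # make blocking socket calls cooperative

import requests


def ocr_one(filename):
    with open(filename, 'rb') as f:
        content = f.read()
    # Blocking call, but other greenlets run while this one waits on the network
    resp = requests.post('http://httpbin.org/post', data=content)
    return resp.text


filenames = []  # fill in the paths of the images to process
jobs = [gevent.spawn(ocr_one, filename) for filename in filenames]
gevent.joinall(jobs)
results = [job.value for job in jobs]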
