When requesting pages asynchronously with aiohttp, how can I save to the database after every x pages?

The following is my code. It requests a web page, extracts a name from it, and stores the name in the database.

There are about 100,000 pages I want to request.

I want to save to the database automatically after every 50 pages (that is, call session.commit()),

but I have no idea how to do it.

Back when I was using requests, I knew I could add a variable to control the number of iterations,

but with aiohttp I can't figure it out. Any advice would be greatly appreciated.


import asyncio
import datetime
import json

import aiohttp

async def get(x):
    async with aiohttp.ClientSession() as session:
        url = f"https://httpbin.org/anything?name={x}"
        async with session.get(url) as resp:
            text1 = await resp.text()
            text1_json = json.loads(text1)
            return text1_json["args"]

async def main1(x):
    new_name = await get(x)  # the name extracted from the page

    # User_table and session are my SQLAlchemy model and session
    user11 = User_table(name=new_name, created=datetime.datetime.now())
    session.add(user11)


pages = range(1, 100000)

tasks = [asyncio.ensure_future(main1(x)) for x in pages]

loop = asyncio.get_event_loop()
loop.run_until_complete(asyncio.wait(tasks))

session.commit()  # currently I only commit once, at the very end

print("work done !!!!")

Submit 50 tasks at a time, and submit the next 50 after they complete.
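A rough sketch of this batching approach, assuming the asker's User_table model and SQLAlchemy session are already set up elsewhere; fetch_name here stands in for the asker's get() coroutine, and one shared aiohttp session (named http to avoid clashing with the SQLAlchemy session) is reused for all requests:

import asyncio
import datetime

import aiohttp

BATCH_SIZE = 50

async def fetch_name(http, x):
    # request one page and pull the name out of httpbin's "args" echo
    url = f"https://httpbin.org/anything?name={x}"
    async with http.get(url) as resp:
        data = await resp.json()
        return data["args"]["name"]

async def process_page(http, x):
    name = await fetch_name(http, x)
    # User_table / session are assumed to come from the asker's SQLAlchemy setup
    session.add(User_table(name=name, created=datetime.datetime.now()))

async def main():
    pages = list(range(1, 100000))
    async with aiohttp.ClientSession() as http:
        for i in range(0, len(pages), BATCH_SIZE):
            batch = pages[i:i + BATCH_SIZE]
            # run one batch of 50 requests concurrently, then commit once
            await asyncio.gather(*(process_page(http, x) for x in batch))
            session.commit()

asyncio.run(main())

The trade-off is that each batch waits for its slowest request before the next batch starts, but you get exactly one commit per 50 pages.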


Same idea as with requests: define a counter variable N, add 1 to it each time a page finishes, then check the value of N and commit in place whenever it reaches a multiple of 50.
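A minimal sketch of this counter approach, again assuming User_table and session come from the asker's existing SQLAlchemy setup; since asyncio runs these callbacks on a single thread, the increment-and-check is safe without a lock:

import asyncio
import datetime

import aiohttp

COMMIT_EVERY = 50
done_count = 0

async def process_page(http, x):
    global done_count
    url = f"https://httpbin.org/anything?name={x}"
    async with http.get(url) as resp:
        data = await resp.json()
    session.add(User_table(name=data["args"]["name"], created=datetime.datetime.now()))
    done_count += 1
    # commit in place once another 50 pages have been stored
    if done_count % COMMIT_EVERY == 0:
        session.commit()

async def main():
    async with aiohttp.ClientSession() as http:
        # launches all pages at once, like the original code; in a real run you
        # would probably also cap concurrency (e.g. with asyncio.Semaphore)
        await asyncio.gather(*(process_page(http, x) for x in range(1, 100000)))
    # commit whatever is left over from the last partial batch
    session.commit()

asyncio.run(main())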