Celery is distributed. If I have 100 workers running, will all 100 workers connect to the database?

I want to use Celery to build a distributed crawler.

The Celery workers grab the data and store it in my MySQL database,

but the MySQL database I bought only allows 50 connections (a limit set by the database provider),

so I can"t start 100 worker to grab data at the same time?

Is there a way for the workers to grab the data and return it to a master, so that only the master stores it in the database?

Do you have a solution?

Thank you

Mar. 19, 2021

Have the producers (the workers) push the crawled data into a Redis queue (a List), then start a single process that slowly consumes the queue and writes to MySQL; see the sketch below.
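A minimal sketch of that idea, assuming a Redis server on localhost, a PyMySQL connection, and a hypothetical table `crawl_results` with `url` and `body` columns. Workers only talk to Redis, so the 50-connection MySQL limit only matters for the one consumer process.

```python
import json

import redis    # pip install redis
import pymysql  # pip install pymysql

QUEUE_KEY = "crawl:results"  # hypothetical Redis list name
r = redis.Redis(host="localhost", port=6379, db=0)


# --- worker side (call this from inside your Celery task) ---------------
def save_result(item: dict) -> None:
    """Push a crawled item onto the Redis list instead of writing to MySQL."""
    r.rpush(QUEUE_KEY, json.dumps(item))


# --- consumer side (one slow process, one MySQL connection) -------------
def consume_forever() -> None:
    conn = pymysql.connect(host="localhost", user="user",
                           password="password", database="crawler")
    while True:
        # BLPOP blocks until an item is available on the list
        _, raw = r.blpop(QUEUE_KEY)
        item = json.loads(raw)
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO crawl_results (url, body) VALUES (%s, %s)",
                (item["url"], item["body"]),
            )
        conn.commit()


if __name__ == "__main__":
    consume_forever()
```

If one consumer is too slow, you can run a few of them (say 10), which still keeps the total number of MySQL connections well under the provider's limit.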
