[Python 2.7] tornado communication between multiple sub-processes

Python version: 2.7

I need to use tornado to implement something like the following:

1. Create multiple processes using tornado.
2. The method that reads the configuration information from the database is defined inside each process, and the configuration it reads is stored in a variable A in that process.
3. Each process also contains the logic that takes the corresponding action according to the configuration information, which it fetches from variable A.
4. Because connecting to the database is time-consuming, the configuration is cached in variable A so it can be obtained faster the next time. The configuration in the database does get modified, but at an uncertain frequency (maybe once every half day, maybe once a day), so I don't want to synchronize it with a scheduled task. Instead, when another service modifies the configuration, it calls an API that re-reads the configuration from the database (see the sketch after this list).
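
For reference, a minimal sketch of the setup above, assuming Tornado on Python 2.7; load_config_from_db, the URLs and the handler names are placeholders for my real code:

```python
import tornado.web
import tornado.httpserver
import tornado.ioloop

A = {}  # per-process cache of the configuration information


def load_config_from_db():
    # Placeholder: connect to the database and return the configuration (slow).
    return {"example_key": "example_value"}


class ReloadConfigHandler(tornado.web.RequestHandler):
    # API called by other services after they modify the configuration.
    def get(self):
        global A
        A = load_config_from_db()  # refreshes only this process's copy
        self.write("reloaded")


class WorkHandler(tornado.web.RequestHandler):
    # Takes the corresponding action according to the cached configuration.
    def get(self):
        self.write(str(A.get("example_key")))


if __name__ == "__main__":
    app = tornado.web.Application([
        (r"/reload_config", ReloadConfigHandler),
        (r"/work", WorkHandler),
    ])
    server = tornado.httpserver.HTTPServer(app)
    server.bind(8888)
    server.start(0)  # fork one sub-process per CPU core
    tornado.ioloop.IOLoop.instance().start()
```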

Current problem:

Suppose 10 processes are running. After another service modifies the configuration, the reload request is sent only once, so only one of the 10 processes ends up calling the method that re-reads the configuration from the database; the remaining 9 never call it. Because the work is handled by multiple processes, each process has its own variable A, so one process's A may hold configuration from half an hour ago while another process's A holds the configuration that was just updated.
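
To illustrate, a tiny fork example (not Tornado-specific, just showing why each process's copy of A diverges):

```python
import os

A = {"version": 1}

pid = os.fork()
if pid == 0:
    # Child process: "reloads" the configuration into its own copy of A.
    A["version"] = 2
    print "child sees:", A   # {'version': 2}
    os._exit(0)
else:
    os.waitpid(pid, 0)
    # The child's update is invisible here; the parent still has the old copy.
    print "parent sees:", A  # {'version': 1}
```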
Is there any way to keep the variable A of every process synchronized?

Proposed solutions:

1. Pipe communication: when one process receives the request to re-read the configuration from the database, it sends a message through the pipe to the other processes so that they all call the reload method once. But pipe communication seems to block, and a message no longer exists once it has been consumed, i.e. only one process can receive the notification. So this doesn't work.
2. An event-driven approach, inspired by ioloop.add_handler(). But after googling for a while it seems that only the IOLoop has this method, and I don't know whether it is possible to attach such a listener to a process (see the sketch after this list).
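
For idea 2, this is roughly what I imagine, assuming each child keeps the read end of its own os.pipe() created before the fork, and the process that receives the reload request writes one byte to every other child's write end; reload_config_from_db is a placeholder:

```python
import os
import tornado.ioloop


def register_reload_pipe(read_fd, reload_config_from_db):
    # Make this process reload its copy of A whenever a byte arrives on read_fd.
    io_loop = tornado.ioloop.IOLoop.instance()

    def on_notify(fd, events):
        os.read(fd, 4096)        # drain the notification byte(s)
        reload_config_from_db()  # refresh this process's variable A

    # Fires inside this process's IOLoop whenever another process writes
    # to the corresponding write end of the pipe.
    io_loop.add_handler(read_fd, on_notify, io_loop.READ)
```

With one pipe per child, the "only one process consumes the message" problem from idea 1 would not occur, because every child has its own read end. But I'm not sure this is the right approach, which is why I'm asking.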

I have been trying for 2 days without finding a good solution, so I'm asking here for advice.
Thank you!
