Python redis used for caching: hset stops taking new data while the cached data is written into MySQL

problem description

When I use Python 3's redis for caching, the structure is an hset. When the data reaches 10,000 entries, I need to store those 10,000 entries into the MySQL database. But once I delete the hset in redis, new requests are no longer added to redis; they are only added again after all 10,000 entries have been stored.

the environmental background of the problem and what methods you have tried

I added a temporary variable in the middle as a transition (it still didn't work, or perhaps I did it the wrong way).

related code

rds.hset(chn_idx, uid, data)  # cache the request; flush to mysql when full
ualen = int(rds.hlen(chn_idx))
if ualen > 10000:
    keyData = rds.hgetall(chn_idx)
    rds.delete(chn_idx)
    for uid, infos in keyData.items():
        ...  # write to mysql
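One pattern that shrinks the window where writes can be lost (a minimal sketch, not the original code; it assumes a redis-py client named rds and keeps the mysql write elided as in the snippet above): atomically RENAME the full hash to a temporary key, so new hset calls immediately start filling a fresh chn_idx while the renamed snapshot is drained into MySQL.

import time

def flush_if_full(rds, chn_idx, limit=10000):
    if int(rds.hlen(chn_idx)) <= limit:
        return
    # RENAME is atomic on the server: new hset calls re-create a fresh
    # chn_idx while this snapshot is drained into mysql
    tmp_key = "%s:flushing:%d" % (chn_idx, int(time.time()))  # hypothetical naming scheme
    rds.rename(chn_idx, tmp_key)
    for uid, infos in rds.hgetall(tmp_key).items():
        ...  # write to mysql, as in the original loop
    rds.delete(tmp_key)  # drop the snapshot only after the mysql writes succeed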

what result do you expect? What is the error message actually seen?

I expect new requests to still be added to redis after rds.delete(chn_idx). This hset only acts as an intermediate cache and ensures that data is not lost. What is the reason for the current behavior? Is it that the data in the hset is so large that the delete takes time? Or is it something else?

May 31, 2021

Although I don't quite understand what you are trying to say, redis's pipeline may be able to solve your problem:
https://github.com/andymccurd.
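For reference, a minimal sketch of that suggestion with redis-py (names taken from the question): a pipeline with transaction=True (the default) wraps the queued commands in a single MULTI/EXEC, so the HGETALL and DELETE execute atomically and no concurrent hset lands between them.

pipe = rds.pipeline(transaction=True)
pipe.hgetall(chn_idx)
pipe.delete(chn_idx)
keyData, _ = pipe.execute()  # results: [hash contents, number of keys deleted]
for uid, infos in keyData.items():
    ...  # write to mysql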


Found the problem. There was something wrong with my code: I didn't fully understand Tornado's asynchrony. Once the time-consuming operation was made asynchronous, the problem was solved.
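For anyone who lands here: a synchronous MySQL flush inside a Tornado handler blocks the IOLoop, so no new request (and therefore no new hset) is processed until the flush finishes, which matches the symptom described. A minimal sketch of offloading the blocking flush to a thread pool (the handler and helper names are illustrative, not from the original code):

from concurrent.futures import ThreadPoolExecutor

import redis
import tornado.ioloop
import tornado.web

rds = redis.Redis()  # assumed connection, as in the question
executor = ThreadPoolExecutor(max_workers=4)

def flush_to_mysql(chn_idx):
    # hypothetical stand-in for the hgetall/delete/insert loop above
    keyData = rds.hgetall(chn_idx)
    rds.delete(chn_idx)
    for uid, infos in keyData.items():
        ...  # write to mysql

class IngestHandler(tornado.web.RequestHandler):
    async def post(self):
        chn_idx = self.get_argument("chn_idx")
        uid = self.get_argument("uid")
        rds.hset(chn_idx, uid, self.request.body)  # fast, fine on the IOLoop
        if int(rds.hlen(chn_idx)) > 10000:
            # run the slow flush on a worker thread; the IOLoop keeps
            # serving new requests (and new hset calls) in the meantime
            await tornado.ioloop.IOLoop.current().run_in_executor(
                executor, flush_to_mysql, chn_idx)
        self.write("ok")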
