How to speed up batch deletion with MyBatis foreach on a table that has a composite unique index?

The project uses the Spring Boot framework, with MyBatis as the persistence layer and MySQL as the database. The business table has a composite unique index across four fields. Batch inserts and deletes on this table are physical (no logical/soft delete is required). When around 100,000 rows are deleted in one batch, deletion via foreach becomes very slow because of the index. How should this be optimized? The mapper implementation is as follows:

<delete id="delete">
    delete from table
    where (a, b, c, d) in
    <foreach collection="list" item="item" index="index" open="(" separator="," close=")">
        (#{item.a}, #{item.b}, #{item.c}, #{item.d})
    </foreach>
</delete>

Even when the deletion is split into smaller batches, it still takes more than a minute to delete 100,000 rows (the table carries the composite unique index on the four columns a, b, c, d).
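For reference, a minimal sketch of the "small batches" approach described above, assuming a hypothetical TableMapper interface bound to the <delete id="delete"> statement shown earlier and a hypothetical Row key class; the batch size of 1000 is an illustrative assumption, not a recommendation from the question:

import java.util.List;

public class ChunkedDeleteService {

    // Hypothetical MyBatis mapper bound to the <delete id="delete"> statement above.
    public interface TableMapper {
        int delete(List<Row> list);
    }

    // Hypothetical holder for the four indexed columns.
    public record Row(String a, String b, String c, String d) {}

    private static final int BATCH_SIZE = 1000; // assumption: tune to the workload

    private final TableMapper tableMapper;

    public ChunkedDeleteService(TableMapper tableMapper) {
        this.tableMapper = tableMapper;
    }

    // Delete in fixed-size chunks so each DELETE ... WHERE (a,b,c,d) IN (...) stays short.
    public void deleteInChunks(List<Row> rows) {
        for (int from = 0; from < rows.size(); from += BATCH_SIZE) {
            int to = Math.min(from + BATCH_SIZE, rows.size());
            tableMapper.delete(rows.subList(from, to));
        }
    }
}

As the question notes, chunking alone only shortens each statement; the total time for 100,000 rows can still exceed a minute because every deleted row must also update the unique index.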

Dec. 02, 2021

If possible, add a flag column that marks rows for deletion, and then have a scheduled task physically delete the flagged rows in batches.
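A minimal sketch of that idea, assuming a hypothetical deleted flag column, a hypothetical PurgeMapper with markDeleted/purgeDeleted statements, and Spring's @Scheduled support (none of these names come from the original post):

import java.util.List;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class PurgeTask {

    // Hypothetical mapper methods; the SQL in the comments is an assumption.
    public interface PurgeMapper {
        int markDeleted(List<Key> keys); // e.g. UPDATE t SET deleted = 1 WHERE (a,b,c,d) IN (...)
        int purgeDeleted(int limit);     // e.g. DELETE FROM t WHERE deleted = 1 LIMIT #{limit}
    }

    public record Key(String a, String b, String c, String d) {}

    private final PurgeMapper purgeMapper;

    public PurgeTask(PurgeMapper purgeMapper) {
        this.purgeMapper = purgeMapper;
    }

    // Online path: only flip the flag, which is a comparatively cheap UPDATE.
    public void markForDeletion(List<Key> keys) {
        purgeMapper.markDeleted(keys);
    }

    // Off-peak path: physically remove flagged rows in small chunks until none remain.
    @Scheduled(cron = "0 0 3 * * ?") // assumption: run daily at 03:00
    public void purge() {
        int affected;
        do {
            affected = purgeMapper.purgeDeleted(1000);
        } while (affected > 0);
    }
}

Scheduling requires @EnableScheduling on a configuration class; the point of the design is that the expensive index maintenance is pushed to off-peak hours instead of the user-facing request.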


I feel there is little room left to optimize the SQL itself. If the statement is slow enough to hurt the user experience, consider turning this operation into an asynchronous one.
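A minimal sketch of moving the deletion off the request thread with Spring's @Async, again with a hypothetical TableMapper/Row pair standing in for the real mapper (illustrative names only):

import java.util.List;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;

@Service
public class AsyncDeleteService {

    // Hypothetical mapper bound to the <delete id="delete"> statement from the question.
    public interface TableMapper {
        int delete(List<Row> list);
    }

    public record Row(String a, String b, String c, String d) {}

    private final TableMapper tableMapper;

    public AsyncDeleteService(TableMapper tableMapper) {
        this.tableMapper = tableMapper;
    }

    // Returns to the caller immediately; the slow DELETE runs on Spring's async executor.
    @Async
    public void deleteAsync(List<Row> rows) {
        tableMapper.delete(rows);
    }
}

This needs @EnableAsync on a configuration class, and @Async only takes effect when the method is invoked through the Spring proxy (i.e. from another bean). It does not make the DELETE faster; it only stops the user from waiting on it.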
