PostgreSQL read-write separation across two servers: COPY command cannot find the CSV file, asking for advice

The environmental background of the problem and what methods you have tried

The project is a Node service running on two nodes, with PostgreSQL set up for read-write separation: the master node handles all write operations. Because of the large data volume, we save the data as a CSV file and then use the COPY command to insert it in batches. The problem is that front-end requests can land on either of the two servers, and the CSV file is generated on whichever server handled the request, but PostgreSQL only performs writes on the master node. So if the CSV file was generated on the slave node, the COPY command running against the master cannot find the file. I have searched for information but could not find a solution. How can this be solved? Please advise.

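For illustration, the failing flow boils down to server-side COPY: the path in `COPY ... FROM '<path>'` is resolved on the database server's filesystem, not on the machine that issues the query (a minimal sketch; the table, column, and path names are made up):

```javascript
// Sketch of the current (failing) approach: server-side COPY.
// The file path in COPY ... FROM '<path>' is opened by the PostgreSQL
// *server* process, so the CSV must exist on the master's own disk --
// a file written on the slave app node is invisible to it.
// Table, columns, and path here are hypothetical.
function buildServerSideCopy(table, columns, csvPath) {
  return `COPY ${table} (${columns.join(", ")}) FROM '${csvPath}' WITH (FORMAT csv, HEADER true)`;
}

// This statement only succeeds if /tmp/events.csv exists on the master:
console.log(buildServerSideCopy("events", ["id", "payload"], "/tmp/events.csv"));
```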

Related code

Node connects to PostgreSQL through Sequelize:

let sequelize = new Sequelize(`${postgresql.database}`, null, null, {
    "port": postgresql.port,
    "dialect": "postgres",
    "protocol": "postgres",
    "quoteIdentifiers": true,
    "logging": false,
    // Read-write separation: reads are load-balanced across master and
    // slave; writes always go to the master.
    "replication": {
        "read": [
            { host: postgresql.masterHost, username: postgresql.username, password: postgresql.password },
            { host: postgresql.subHost, username: postgresql.username, password: postgresql.password }
        ],
        "write": { host: postgresql.masterHost, username: postgresql.username, password: postgresql.password }
    },
    // Note: these pool keys are old Sequelize (v3) style; v4+ renamed
    // them to "max", "min", and "idle".
    "pool": {
        "maxConnections": process.env.NODE_ENV !== "PRODUCTION" ? 80 : 120,
        "minConnections": 0,
        "maxIdleTime": 30000
    }
});



Tags: psql, copy, \copy, csv, node, python, psycopg2, copy_from