JS event loop problem

my requirements:

 
 Split the file into N chunks of size chunkSize, where N = file.size / chunkSize.
 Send each of the N chunks to the backend with an http.request POST.

The code is as follows (note: the upload function calculates the start and end offsets of each chunk from the number of chunks n, and calls the senddataPromise function on each chunk):

const fs = require("fs");
const http = require("http");
const queryString = require("querystring");

function upload(username, filepath, file_id, filelength, n, alreadychunks, chunkSize) {
    return new Promise(function (resolve,reject) {
            var start = 0,end = 0;
            var promiseall = [];
            for (let curindex = 0; curindex < n; curindex++) {
                if(filelength - start <= chunkSize) {
                    end  =  filelength - 1;
                }else {
                    end = start + chunkSize - 1; // start and end are inclusive byte offsets of this chunk
                }
                if(alreadychunks.indexOf(curindex) == -1) {
                    let options = {
                        flags: "r",
                        highWaterMark: chunkSize,
                        start: start,
                        end: end
                    };
                    promiseall.push(senddataPromise(filepath,options,username,file_id,curindex,end-start+1));
                }
                start = end + 1;
            }
            let timer = setInterval(() => {
                if(promiseall.length == n) {
                    clearInterval(timer);
                    Promise.all(promiseall).then(values=>{
                        console.log(values);
                        console.log("all done");
                        resolve(true)
                    }).catch(err => {
                        console.log(err);
                        reject(err);
                    })
                }
            },500)
    })
}
The senddataPromise function creates a read stream to read the contents of chunk i and calls the doapost function to send it to the backend.

function senddataPromise(path,options,username,summary,curindex,length) {
    return new Promise(function (resolve,reject) {
        let readStream = fs.createReadStream(path, options); // reads only the bytes between options.start and options.end
        readStream.on("data", (chunk) => {
            console.log(""+curindex+" JSON")
            let chunkjson = JSON.stringify(chunk);
            console.log(""+curindex+" JSON")
            let tempcell = {
                data: chunkjson,
                n: curindex,
                file_id: summary,
                username: username,
                length: length
            };
            chunk = null;
            chunkjson = null;
            doapost(tempcell).then(values=>{
                resolve(values)
            }).catch(err=>{
                reject(err);
            });
        })
    })
}

The doapost function initiates a POST request to send one part of the multipart data to the backend.

function doapost(data) {
    return new Promise(function (resolve,reject) {
        let i = data.n;
        console.log(""+i+"")
        let contents = queryString.stringify(data);
        data = null;
        let options = {
            host: "localhost",
            path: "/nodepost/",
            port: 8000,
            method: "POST",
            headers: {
                "Content-Type": "application/x-www-form-urlencoded",
                "Content-Length": contents.length
            }
        };
        let req = http.request(options, function (res) {
            console.log(""+i+"")
            res.on("data", function (chunk) {
                console.log(chunk.toString());
            });
            res.on("end", function (d) {
                resolve("end");
            });
            res.on("error", function (e) {
                reject(e);
            })
        });
        req.write(contents);
        req.end();
        contents = null;
        console.log(""+i+"")
    })
}

my question:

  
  The chunks are not processed in strict order: chunk p can be handled before chunk q even though **q < p**. The parameters passed to upload and the resulting console output are shown below (n = 9 in this run):
{ 
 kind: "upload",
username: "moran999",
filepath: "F:/my_upload_test/NowTest.pdf",
file_id: "-196987878-472217752177633040957425519",
alreadychunks: [],
chunkSize: 1048576,
n: 9 }
0 JSON
0 JSON
0
0
1 JSON
1 JSON
1
1
2 JSON
2 JSON
2
2
3 JSON
3 JSON
3
3
5 JSON
5 JSON
5
5
4 JSON
4 JSON
4
4
6 JSON
6 JSON
6
6
8 JSON
8 JSON
8
8
7 JSON
7 JSON
7
7
8
moran999
4
moran999
6
moran999
1
moran999
2
moran999
0
moran999
3
moran999
7
moran999
5
moran999
[ "end", "end", "end", "end", "end", "end", "end", "end", "end" ]
all done
  1. Every chunk's POST request is sent out as soon as that chunk is read, yet none of the POST responses comes back until all of the chunks have been read and sent. Why is the i-th POST not answered while later chunks are still being read?
  2. With a larger file, JSON.stringify throws an out-of-memory error by the time it reaches block 12. Do the POST requests queue up behind one another and keep every chunk's data in memory until it has been sent?

Mar.07,2021

No, this is normal. From your log, the first POST has already gone out before the second chunk's JSON.stringify starts, so sending the POST (http.request) and reading the file do run side by side, and each chunk's data is sent as soon as that chunk has been read. But the previous request has not received its response yet by the time the next chunk is being read, so there is no n-th response to print at that point. The network delay is much larger than the time it takes to read the file, so nothing comes back until the whole file has been read.
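A small sketch of that ordering (a hypothetical server on localhost:8000 is assumed): every "sent" line prints before any response callback runs, because the response callbacks only fire on later turns of the event loop, after the network round trips complete.

const http = require("http");

for (let i = 0; i < 3; i++) {
    let req = http.request({
        host: "localhost",
        port: 8000,
        path: "/nodepost/",
        method: "POST"
    }, function (res) {
        res.resume(); // drain the response body
        res.on("end", () => console.log("response for request " + i));
    });
    req.on("error", (e) => console.log("request " + i + " failed: " + e.message));
    req.end("chunk " + i);
    console.log("request " + i + " sent"); // all of these print before any response line
}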

It doesn't feel like this code should overflow memory at only block 12, though.

It's better to use a pipe when uploading files in Node.js.


As @zonxin says, the code behaves roughly the way you think it does.

To add on why it already overflows at block 12:

  1. let chunkjson = JSON.stringify(chunk); converts the original 1 MB Buffer into an array-style string such as {"type":"Buffer","data":[104, ...]}, so memory usage grows several times over (see the sketch after this list), not to mention the additional queryString.stringify on top of it.
  2. Node.js heap memory is limited by V8 (about 1.4 GB on a 64-bit system and about 0.7 GB on a 32-bit system, Buffers excluded), and the OP converts every Buffer into a string, which does count against that limit.
  3. The upload does not use pipe (the data is converted to a string instead), so the bytes of each chunk stay in memory until its complete string has been sent, and every pending chunk now occupies several times its original 1 MB.
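A rough sketch of point 1, using a hypothetical 1 MB buffer, showing how JSON.stringify (and then queryString.stringify) inflates a Buffer:

const queryString = require("querystring");

const buf = Buffer.alloc(1024 * 1024, 104);     // 1 MB filled with the byte 104
const chunkjson = JSON.stringify(buf);          // '{"type":"Buffer","data":[104,104,...]}'
console.log(buf.length);                        // 1048576 bytes
console.log(chunkjson.length);                  // several million characters: every byte becomes "104," etc.
const contents = queryString.stringify({ data: chunkjson });
console.log(contents.length);                   // larger still after percent-encoding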

  • Network latency is bound to be larger than your local I/O time, so it is normal for the requests not to return before all of the file's parts have been read.
  • You do use a readable stream here, but you create a brand-new stream for every chunk and read through the whole large file that way. On top of the first point, I think this is the main cause of the overflow.
  • If you do a multipart upload in Node.js, it is better to compose it with stream pipes. The general idea is to create a readable stream and put the chunking and upload logic into the pipe (see the sketch after this list). This is definitely better than using stream.on('data'), because you can also combine it with other stream objects, such as an HTTP response, and so on.
  • A readable stream in Node.js already reads in chunks by itself, so you only need to add your extra transformation logic on top of it.
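A minimal sketch of that pipe-based idea, assuming the same localhost:8000 backend as the question; the chunk index goes into a made-up X-Chunk-Index header instead of a form field:

const fs = require("fs");
const http = require("http");

// Stream one byte range of the file straight into the POST body instead of
// buffering it as a JSON/query string, so only small internal chunks sit in memory.
function uploadChunkByPipe(path, start, end, curindex) {
    return new Promise(function (resolve, reject) {
        let req = http.request({
            host: "localhost",
            port: 8000,
            path: "/nodepost/",
            method: "POST",
            headers: {
                "Content-Type": "application/octet-stream",
                "X-Chunk-Index": String(curindex) // hypothetical header carrying the chunk number
            }
        }, function (res) {
            res.resume();                         // drain the response body
            res.on("end", () => resolve("end"));
            res.on("error", reject);
        });
        req.on("error", reject);
        // pipe() forwards the data as it is read and ends the request when the range is done.
        fs.createReadStream(path, { start: start, end: end }).pipe(req);
    });
}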