Use Node.js to crawl pictures on a website and save them locally

1. Problem description:
I made a small crawler to grab some pictures from a website. The crawl itself works, and I can print out the image paths.
However, I need to download these pictures into a local images folder. Writing that code is not a problem; I have tried it, and when the crawled image path is a normal URL, i.e. of the form http://...img, saving works fine.
The problem is that on many websites the image addresses are base64 data URIs. The code I wrote always reports errors, and I don't know how to solve it.
2. Code screenshot:
clipboard.png
3.
clipboard.png

Mar 18, 2021

For base64 images, decode the data and write the binary buffer to a file directly. Do not use a stream.


const request = require('request');
const cheerio = require('cheerio');
const async = require('async');
const fs = require('fs');

const links = [];

// url is the page being crawled, defined elsewhere in the original script
request(url, function (err, res, body) {
    if (!err && res.statusCode == 200) {
        const $ = cheerio.load(body);
        // collect the src of every image inside the first .codelist block
        $('.codelist').eq(0).children('a').each(function () {
            const listImgUrl = $(this).find('img').attr('src');
            links.push(listImgUrl);
        });

        async.mapSeries(links, function (item, callback) {
            // strip the "data:image/png;base64," prefix, keeping only the base64 payload
            const base64 = item.replace(/^data:image\/\w+;base64,/, '');
            // decode the payload into a binary buffer (Buffer.from replaces the deprecated new Buffer)
            const dataBuffer = Buffer.from(base64, 'base64');
            // write the buffer into the local images folder under a random file name
            fs.writeFile('./images/' + Math.floor(Math.random() * 100000) + '.jpg', dataBuffer, function (err) {
                if (err) {
                    console.log(err);
                } else {
                    console.log('saved');
                }
            });
            callback(null, item);
        }, function (err, results) {});
    }
});
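
If the page mixes normal http(s) image URLs with base64 data URIs, you can branch on the src before saving. Below is a minimal sketch under my own assumptions: the saveImage helper name is made up, and it uses Node's built-in http/https modules instead of the request package used above.

const fs = require('fs');
const path = require('path');
const https = require('https');
const http = require('http');

// Hypothetical helper: save one image src (either a data URI or a normal URL) into ./images.
// Assumes the ./images folder already exists.
function saveImage(src, fileName) {
    const target = path.join('./images', fileName);

    if (src.startsWith('data:image/')) {
        // base64 data URI: strip the prefix, decode, and write the buffer directly
        const base64 = src.replace(/^data:image\/\w+;base64,/, '');
        fs.writeFile(target, Buffer.from(base64, 'base64'), function (err) {
            if (err) console.log(err);
        });
    } else {
        // normal http(s) URL: stream the response body into a file
        const client = src.startsWith('https') ? https : http;
        client.get(src, function (res) {
            res.pipe(fs.createWriteStream(target));
        }).on('error', function (err) {
            console.log(err);
        });
    }
}

// usage: saveImage(listImgUrl, Date.now() + '.jpg');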

Solved it myself, using fs's writeFile method. Personally tested, it works.
