About bulk importing data into the database

I'd like to ask how people usually do bulk data imports into the database. Right now I only have a small amount of data, so importing it row by row takes an acceptable amount of time, but with a large amount of data that approach would not hold up. So I'd like to hear how everyone deals with this kind of problem.
PHP
Mar. 15, 2021

MySQL has the source command. If you add the SQL statements to the database one by one it takes a long time; using source improves efficiency a lot: importing 1.5 million rows takes less than a minute. The efficiency gap is huge.
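When the rows originate in PHP rather than an existing .sql file, a comparable speedup comes from batching many rows into a single multi-row INSERT inside one transaction, instead of issuing one INSERT per row. A minimal sketch with PDO; the items table, its columns, and the $rows data are hypothetical:

```php
<?php
// Minimal sketch: batch rows into multi-row INSERTs inside a transaction.
// Assumes a hypothetical table `items` (name, price) and $rows as an
// array of [name, price] pairs already loaded in memory.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$rows = [['apple', 1.2], ['pear', 0.8]]; // imagine 1.5M of these
$batchSize = 1000;

$pdo->beginTransaction();
foreach (array_chunk($rows, $batchSize) as $chunk) {
    // Build one INSERT with a (?, ?) placeholder group per row.
    $placeholders = implode(',', array_fill(0, count($chunk), '(?, ?)'));
    $stmt = $pdo->prepare("INSERT INTO items (name, price) VALUES $placeholders");
    // Flatten the chunk into a single flat parameter list.
    $stmt->execute(array_merge(...$chunk));
}
$pdo->commit();
```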


PostgreSQL has COPY for direct batch import. For database backups, you can export a binary dump with pg_dump and then bulk-import it into other databases with pg_restore. If pg_dump exports a plain SQL file, the batch import takes about 30 minutes; exported as binary, the batch import takes less than 2 minutes. The efficiency gap is huge.
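From PHP, the pgsql extension exposes COPY through pg_copy_from(), which streams an array of rows into a table in one operation. A minimal sketch; the connection parameters, the items table, and the row data are hypothetical:

```php
<?php
// Minimal sketch: bulk-load rows via PostgreSQL COPY using ext-pgsql.
// The connection parameters and the `items` table are hypothetical.
$conn = pg_connect('host=127.0.0.1 dbname=test user=postgres password=secret');
if ($conn === false) {
    die('connection failed');
}

// pg_copy_from() takes one tab-separated string per row by default.
$rows = [
    "1\tapple\t1.2",
    "2\tpear\t0.8",
];

if (!pg_copy_from($conn, 'items', $rows)) {
    echo 'COPY failed: ' . pg_last_error($conn);
}
pg_close($conn);
```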


Sorry, my question may not have been specific enough. What I want to ask is: in PHP code, how do you efficiently import data into the database?


If you use Laravel, just put the import on a queue; that at least guarantees the request won't time out.
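A minimal sketch of that idea, assuming a Laravel setup with a configured queue driver; the job class, the CSV layout, and the items table are hypothetical:

```php
<?php
// app/Jobs/ImportCsvJob.php -- hypothetical queued job that imports a
// CSV file in chunks so the web request returns immediately.
namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportCsvJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private string $path) {}

    public function handle(): void
    {
        $handle = fopen($this->path, 'r');
        $batch = [];
        while (($row = fgetcsv($handle)) !== false) {
            $batch[] = ['name' => $row[0], 'price' => $row[1]];
            if (count($batch) >= 1000) {
                DB::table('items')->insert($batch); // one multi-row INSERT
                $batch = [];
            }
        }
        if ($batch) {
            DB::table('items')->insert($batch);
        }
        fclose($handle);
    }
}

// In the controller, after the upload is stored:
// ImportCsvJob::dispatch($storedPath);
```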


After the file is uploaded, create a background script and have it process the import asynchronously.
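Without a queue system, the same effect can be had by handing the work off to a detached shell process. A minimal sketch, assuming a hypothetical CLI worker script import.php that accepts the uploaded file's path:

```php
<?php
// After handling the upload, launch a detached worker process so the
// HTTP request can return immediately. import.php is a hypothetical
// CLI script that performs the actual batched import.
$path = '/tmp/upload/data.csv'; // wherever the upload was saved

$cmd = sprintf(
    'php import.php %s > /dev/null 2>&1 &',
    escapeshellarg($path)
);
exec($cmd);

echo 'Import started in the background.';
```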
