Write an API in PHP that imports the rows of a txt file into the database.

The txt file format is as follows:
1 2018-02-24 08:38:37
2 2018-02-24 08:38:39
3 2018-02-24 08:38:30

There are about 300,000 rows of data, so I won't paste them all here.


Right now I use the explode function to split each line on spaces and insert the rows into the database one by one.
The problem is that splitting on every space breaks the trailing timestamp into two pieces, so I would like to ask if there is a solution for that. Also, the data must be imported through an API; is there a better way? Inserting item by item is too inefficient.

PHP
Mar.02,2021

1. explode's third parameter (limit) specifies the maximum number of pieces; once that many pieces have been produced, the rest of the string is left uncut:

$s = 'aaa 2018-02-24 08:38:37';
print_r(explode(' ', $s, 2)); // [0] => aaa, [1] => 2018-02-24 08:38:37

2. INSERT INTO xx (id) VALUES (1), (2), (3) inserts 3 rows at once, which is much faster than executing 3 separate SQL statements. But be careful not to build an SQL string that is too long, or it will exceed max_allowed_packet.
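Both points can be combined into one helper: split each line with a limit of 2 so the timestamp stays whole, then build a single multi-row INSERT. This is only a sketch; build_batch_insert() and the column name created_at are made-up for illustration, and real code should use prepared statements or proper escaping rather than string interpolation (the values here come from a trusted file).

```php
<?php
// Hypothetical helper: parse lines of the form "id YYYY-MM-DD HH:MM:SS"
// and return one multi-row INSERT statement for the whole batch.
function build_batch_insert(array $lines): string
{
    $values = [];
    foreach ($lines as $line) {
        // Limit of 2: split only on the first space, so the
        // "date time" part (which itself contains a space) stays intact.
        [$id, $datetime] = explode(' ', trim($line), 2);
        $values[] = sprintf("(%d, '%s')", (int) $id, $datetime);
    }
    return 'INSERT INTO xx (id, created_at) VALUES ' . implode(', ', $values);
}

echo build_batch_insert([
    '1 2018-02-24 08:38:37',
    '2 2018-02-24 08:38:39',
]), "\n";
// INSERT INTO xx (id, created_at) VALUES (1, '2018-02-24 08:38:37'), (2, '2018-02-24 08:38:39')
```

To stay under max_allowed_packet with 300,000 rows, feed the lines to this helper in chunks (for example with array_chunk) of a few thousand rows per statement.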


Importing data with INSERT directly from PHP is not fast. It is better to convert the TXT file to a CSV file and use MySQL's LOAD DATA INFILE command, which runs much faster than INSERT.
If $pdo->exec('load data ...') is still too slow, write a shell script and call it with exec() from PHP so that the process runs asynchronously; from the point of view of the program calling the API, the import is instantaneous.
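A minimal sketch of that approach, under several assumptions: txt_to_csv() is a made-up helper name, the mysql credentials, database `mydb`, file paths, and table `xx` (with columns id, created_at) are placeholders, and LOAD DATA LOCAL INFILE requires local_infile to be enabled on both client and server.

```php
<?php
// Step 1: convert the txt file to CSV, one row per line.
function txt_to_csv(string $src, string $dst): void
{
    $in  = fopen($src, 'r');
    $out = fopen($dst, 'w');
    while (($line = fgets($in)) !== false) {
        $line = trim($line);
        if ($line === '') {
            continue;
        }
        // Limit of 2 keeps the "date time" part (it contains a space) whole.
        [$id, $datetime] = explode(' ', $line, 2);
        fputcsv($out, [$id, $datetime]);
    }
    fclose($in);
    fclose($out);
}

// Step 2: fire-and-forget. Redirecting output and backgrounding the mysql
// client with "&" lets exec() return immediately, so the API responds at
// once while MySQL performs the bulk load in the background.
$cmd = "mysql -uuser -p'pass' mydb -e \"LOAD DATA LOCAL INFILE '/tmp/data.csv'"
     . " INTO TABLE xx FIELDS TERMINATED BY ',' (id, created_at)\" "
     . "> /dev/null 2>&1 &";
// txt_to_csv('data.txt', '/tmp/data.csv');
// exec($cmd);
```

Because the load runs detached, the API cannot report success or failure directly; log the mysql output to a file instead of /dev/null if you need to check the result afterwards.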
