How to collect back-end data for data analysis

Hello, everyone. The mall project at my company recently needed data analysis. The back-end order data (and not just orders) is stored in MySQL, so I want to collect the data into MongoDB or Elasticsearch for analysis. At present I see two possible approaches:
1. Burying points (event tracking). This puts no pressure on the database, but it requires leaving tracking code in many places across the project, which is troublesome to develop and maintain, and some data may still slip through. Besides, the data we want to collect is not user-behavior data, so event tracking seems unnecessary here.
2. Read the database directly. Since the read and write permissions of the company's MySQL database are in our own hands, we can simply query it and sync the data into MongoDB. But this approach requires periodically pulling all the data out of MySQL, which adds access pressure on the production database.
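The load problem in option 2 comes from re-reading everything on each pass. A common mitigation is incremental polling: remember the largest auto-increment id (or latest `updated_at`) already synced, and fetch only rows beyond it. Below is a minimal, self-contained sketch of that checkpoint logic; the table and column names (`orders`, `id`, `amount`) are made up, and an in-memory list stands in for MySQL so the code can run on its own.

```python
# Incremental-sync sketch: instead of re-reading the whole orders table,
# keep the largest id seen so far and fetch only rows beyond it. In a real
# setup fetch_new_rows would run a SQL query such as:
#   SELECT * FROM orders WHERE id > %s ORDER BY id LIMIT %s

def fetch_new_rows(all_rows, last_id, batch_size=1000):
    """Return rows with id > last_id, ordered by id, up to batch_size."""
    new = [r for r in all_rows if r["id"] > last_id]
    new.sort(key=lambda r: r["id"])
    return new[:batch_size]

def sync_once(source_rows, sink, state):
    """One polling pass: pull new rows and 'upsert' them into the sink."""
    rows = fetch_new_rows(source_rows, state["last_id"])
    for row in rows:
        sink[row["id"]] = row          # stand-in for a MongoDB upsert
        state["last_id"] = row["id"]   # persist this checkpoint in practice
    return len(rows)

if __name__ == "__main__":
    source = [{"id": i, "amount": i * 10} for i in range(1, 6)]
    sink, state = {}, {"last_id": 0}
    print(sync_once(source, sink, state))  # first pass copies all 5 rows
    source.append({"id": 6, "amount": 60})
    print(sync_once(source, sink, state))  # second pass copies only 1
```

Each pass then touches only the new rows, so the periodic job no longer scans the full table; the checkpoint (`last_id`) would need to be stored durably between runs.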
Which solution is better, or is there a better one?
PS: I have since learned that you can also obtain the data through MySQL's binlog.
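The binlog route mentioned in the PS is change data capture: with row-based logging, MySQL emits one event per changed row, and a consumer replays those events into the analytics store without ever querying the production tables. Real projects use tools such as Canal, Maxwell, Debezium, or the python-mysql-replication library; the simplified event dicts below are stand-ins so the replay logic can run on its own.

```python
# Sketch of replaying binlog row events into a replica store. The event
# format here is invented for illustration; a real consumer would decode
# actual binlog events from a CDC tool instead.

def apply_binlog_event(replica, event):
    """Replay one row event into an in-memory replica keyed by primary key."""
    pk = event["row"]["id"]
    if event["type"] in ("insert", "update"):
        replica[pk] = event["row"]      # upsert, like a MongoDB replace_one
    elif event["type"] == "delete":
        replica.pop(pk, None)
    return replica

if __name__ == "__main__":
    events = [
        {"type": "insert", "row": {"id": 1, "status": "paid"}},
        {"type": "update", "row": {"id": 1, "status": "shipped"}},
        {"type": "insert", "row": {"id": 2, "status": "paid"}},
        {"type": "delete", "row": {"id": 2}},
    ]
    replica = {}
    for ev in events:
        apply_binlog_event(replica, ev)
    print(replica)  # {1: {'id': 1, 'status': 'shipped'}}
```

Because the events come from the replication log rather than from queries, this approach adds essentially no read load to the production database, which is why it is a popular answer to the question above.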

Mar.22,2021

Let me describe our practice. Our daily trading data volume is more than 2 million records. The core system does batch settlement in the second half of the night, when trading volume is very low. This runs only once a day; the results are ready before work starts in the morning and are then manually reviewed by the settlement staff. The database is a licensed copy of Oracle 10g. Even if you use buried points, from an operational point of view you still need to write the events into a log database and then do a secondary analysis on them.
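The "write events into a log database, then do a secondary analysis" workflow the answer describes can be sketched very simply. Here SQLite stands in for the log database, and the `event_log` schema and sample values are hypothetical:

```python
import sqlite3

# Stage 1: buried points (or any collector) append raw events to a log table.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE event_log (
        event_time TEXT,
        event_type TEXT,
        amount     REAL
    )
""")
conn.executemany(
    "INSERT INTO event_log VALUES (?, ?, ?)",
    [
        ("2021-03-22 01:10", "trade", 100.0),
        ("2021-03-22 01:55", "trade", 250.0),
        ("2021-03-22 09:30", "refund", 40.0),
    ],
)

# Stage 2: secondary analysis runs as a batch over the log, off the
# production database, e.g. an aggregate per event type.
rows = conn.execute(
    "SELECT event_type, COUNT(*), SUM(amount) "
    "FROM event_log GROUP BY event_type ORDER BY event_type"
).fetchall()
print(rows)  # [('refund', 1, 40.0), ('trade', 2, 350.0)]
```

The point is the separation: the raw log absorbs writes cheaply during the day, and the heavy aggregation runs as an off-peak batch job, as in the nightly settlement described above.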
