The problem of loading large amounts of data in a web visualization

A web application uses ECharts to render line or bar charts on the front end. When the data set is large, say 300,000 points, loading it all from the database is very slow, and the network transfer takes a long time as well.
How can the loading speed be improved? If you have had similar requirements, what was your solution?

Jul. 15, 2021

Do you need to fetch all of the data at once? Loading on demand works well.

For example, drive the data granularity by the chart's zoom level and by the time range of the query: aggregate to daily, weekly, or monthly points, and request one week, N days, or N months at a time.

After the client gets the data, cache it locally so the same range does not have to be queried from the server every time (see the sketch below).
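Below is a minimal TypeScript sketch of that idea with ECharts. The endpoint `/api/series?from=&to=&granularity=`, the field names `t`/`v`, and the constants `DATA_START`/`DATA_END` are all assumptions for illustration; the actual aggregation would live in your own backend.

```typescript
// Minimal sketch: on-demand, granularity-aware loading for an ECharts line chart.
// Assumed (hypothetical) backend endpoint:
//   GET /api/series?from=<ms>&to=<ms>&granularity=<day|week|month>
// which returns points already aggregated on the server.
import * as echarts from 'echarts';

type Granularity = 'day' | 'week' | 'month';
interface Point { t: number; v: number }            // timestamp (ms) and value

// Assumed full extent of the stored series (hypothetical constants).
const DATA_START = Date.UTC(2020, 0, 1);
const DATA_END = Date.now();

const cache = new Map<string, Point[]>();           // local cache: range key -> points

// Pick the granularity from the visible span so the response stays small.
function pickGranularity(from: number, to: number): Granularity {
  const days = (to - from) / 86_400_000;
  if (days <= 31) return 'day';
  if (days <= 180) return 'week';
  return 'month';
}

async function loadRange(from: number, to: number): Promise<Point[]> {
  const g = pickGranularity(from, to);
  const key = `${g}:${from}:${to}`;
  const hit = cache.get(key);
  if (hit) return hit;                               // cache hit: no server round trip
  const res = await fetch(`/api/series?from=${from}&to=${to}&granularity=${g}`);
  const points: Point[] = await res.json();
  cache.set(key, points);
  return points;
}

const chart = echarts.init(document.getElementById('chart') as HTMLDivElement);
chart.setOption({
  xAxis: { type: 'time' },
  yAxis: { type: 'value' },
  dataZoom: [{ type: 'inside' }, { type: 'slider' }],
  series: [{ type: 'line', showSymbol: false, data: [] as [number, number][] }],
});

// Re-query at a finer granularity whenever the user zooms.
chart.on('datazoom', (evt: any) => {
  // The datazoom event reports the visible window as percentages (0-100) of
  // the full axis; 'inside' zooming wraps the values in a batch array.
  const p = evt.batch ? evt.batch[0] : evt;
  const from = DATA_START + (p.start / 100) * (DATA_END - DATA_START);
  const to = DATA_START + (p.end / 100) * (DATA_END - DATA_START);
  loadRange(from, to).then(points =>
    chart.setOption({ series: [{ data: points.map(pt => [pt.t, pt.v]) }] })
  );
});

// Initial draw: start with the coarsest view of the whole range.
loadRange(DATA_START, DATA_END).then(points =>
  chart.setOption({ series: [{ data: points.map(pt => [pt.t, pt.v]) }] })
);
```

Because the cache key includes the granularity, zooming back out reuses the coarse data that was already downloaded instead of hitting the database again.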

Uber has developed deck.gl, a WebGL-powered framework designed for visualizing very large data sets.
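If you really do need every raw point on screen at once, a WebGL renderer like deck.gl copes with that better than a DOM or canvas chart. A minimal sketch, assuming a hypothetical `/api/points` endpoint that returns `{ lng, lat }` records:

```typescript
// Minimal deck.gl sketch: render a large point set with a ScatterplotLayer.
// '/api/points' and the lng/lat field names are assumptions for illustration.
import { Deck } from '@deck.gl/core';
import { ScatterplotLayer } from '@deck.gl/layers';

interface Sample { lng: number; lat: number }

new Deck({
  initialViewState: { longitude: 116.4, latitude: 39.9, zoom: 9 },
  controller: true,                              // enable pan/zoom interaction
  layers: [
    new ScatterplotLayer({
      id: 'points',
      data: '/api/points',                       // deck.gl fetches the URL itself
      getPosition: (d: Sample) => [d.lng, d.lat],
      getFillColor: [0, 128, 255],
      radiusMinPixels: 1,                        // keep points visible at any zoom
    }),
  ],
});
```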
