A query optimization problem

I have a table named exchange:

CREATE TABLE `exchange` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `rank` int(11) NOT NULL,
  `name` varchar(255) NOT NULL,
  `volume` varchar(255) NOT NULL,
  `timestamp` datetime NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=75032 DEFAULT CHARSET=utf8;

Every 5 minutes a batch of data is recorded, about 209 rows at a time: the current trading volume of 209 digital currency exchanges.
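
For illustration, one 5-minute snapshot would be written roughly like this (only binance is taken from the question; the other name, the volumes, and the timestamp are made-up placeholders):

INSERT INTO `exchange` (`rank`, `name`, `volume`, `timestamp`)
VALUES
  (1, 'binance', '13590885120', '2021-03-21 12:00:00'),
  (2, 'huobi',   '6531004410',  '2021-03-21 12:00:00');
-- ... and so on, about 209 rows per 5-minute snapshot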

I want the query result to look like this for each exchange:

rank name [volume (latest 144 values)] timestamp

The naive way I use now is to look them up one by one: first get the list of exchanges, then run one query per exchange name:

select volume from exchange where exchange.name="binance" order by timestamp DESC limit 144

The performance overhead is very high. Is there a better way to do this?

Update (Mar. 21, 2021):

I have an idea, though I don't know whether it is feasible: keep the structure of exchange unchanged, but create a new table:

CREATE TABLE `exchange_statistics` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `name` varchar(255) NOT NULL,
  `volume_statistics` text NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8;

The volume_statistics field would store the latest 144 values, either as a serialized array or as a delimiter-separated string, sorted from smallest to largest or kept in time order. Every 5 minutes the first (oldest) element is removed and the newest element is appended at the end.
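
For example, if volume_statistics is stored as a comma-separated string in time order, the 5-minute rolling update could be a single statement per exchange (the new value here is a made-up placeholder; this is only a sketch of the idea):

UPDATE exchange_statistics
SET volume_statistics = CONCAT(
    -- drop everything up to and including the first comma (the oldest value)
    SUBSTRING(volume_statistics, LOCATE(',', volume_statistics) + 1),
    -- append the newest value at the end
    ',', '13590885120')
WHERE name = 'binance';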

That way the latest 144 values are precomputed for each exchange: just get the exchange list and query exchange_statistics directly.
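
The read side then becomes a single query over at most a couple of hundred rows:

SELECT name, volume_statistics FROM exchange_statistics;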

The structure of exchange itself stays unchanged, so the raw data is still there if future business changes require different statistics.

I'm not sure how good this scheme is; what do you think?


If the amount of data does not exceed 2 million rows, your query takes about 300 ms, and with caching enabled it will be even faster. If you have more than 2 million rows, an index on timestamp still gives very respectable speed. I don't know how much data you actually have.
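
As a concrete sketch of the indexing mentioned above (my addition, not part of the original answer): the per-exchange query filters on name and sorts on timestamp, so either a plain timestamp index or a composite (name, timestamp) index would let MySQL avoid scanning and sorting the whole table:

ALTER TABLE `exchange` ADD INDEX `idx_timestamp` (`timestamp`);
-- or, matching the WHERE name = ... ORDER BY timestamp DESC query more closely:
ALTER TABLE `exchange` ADD INDEX `idx_name_timestamp` (`name`, `timestamp`);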