Python requests crawler monitoring tool?

Hello folks, is there a tool that can monitor the status of crawler processes: running/stopped state, number of processes, data quality, log collection, and so on?
A web interface would be preferable.
My crawlers are written with requests plus multiprocessing, and I want to find a monitoring tool for them.
With Scrapy I could use scrapyd to monitor status, but how do I monitor requests-based crawlers?
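To be concrete about the setup, here is a minimal sketch of what I mean (names like `crawl`, `status`, and `worker-N` are just illustrative): requests-style workers run under `multiprocessing`, and the monitor would need at least a per-process liveness snapshot like `status()` returns.

```python
# Sketch of a requests + multiprocessing crawler with a liveness check.
# The actual HTTP fetch is stubbed out; only process status is shown.
import multiprocessing as mp
import time

def crawl(url_queue):
    # Placeholder for the requests-based crawl loop.
    while not url_queue.empty():
        url = url_queue.get()
        # resp = requests.get(url)  # the real fetch would go here
        time.sleep(0.01)

def status(procs):
    # The kind of data a monitoring tool would need: per-worker liveness.
    return {p.name: ("running" if p.is_alive() else "stopped") for p in procs}

if __name__ == "__main__":
    q = mp.Queue()
    for i in range(4):
        q.put("http://example.com/%d" % i)  # dummy URLs
    procs = [mp.Process(target=crawl, args=(q,), name="worker-%d" % i)
             for i in range(2)]
    for p in procs:
        p.start()
    print(status(procs))  # snapshot of per-worker liveness
    for p in procs:
        p.join()
    print(status(procs))  # every worker reports 'stopped' after join
```

A real tool would also have to collect logs and data-quality metrics and expose all of this in a web UI, which is exactly the part I am asking about.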
An example of the kind of monitoring interface I have in mind is shown in the figure below.
