How to write spiders for seven different websites in Scrapy (Python) when the rules are not the same

How do you write crawlers for 7 different websites in Scrapy (Python) when each site needs different rules?
Should the differences be set in settings?

Mar. 20, 2021

https://blog.csdn.net/Q_AN131.
I wonder if this is what you mean:


1. When writing crawl rules, avoid using parse as the callback function. CrawlSpider uses the parse method to implement its own logic, so if you override parse the spider will fail to run.
2. Use a callback such as parse_item to extract the page content with XPath, and define which pages to follow with Rule (and a LinkExtractor), for example pages ending in 2.shtml (see the CrawlSpider sketch after this list).
3. Settings: when you use Scrapy, you must tell it which settings to use. You can do this with the environment variable SCRAPY_SETTINGS_MODULE. Its value should be in Python path syntax, such as myproject.settings, and the settings module should be on the Python import search path (see the settings sketch below).
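
A minimal CrawlSpider sketch for points 1 and 2. The domain, start URL, link pattern, and XPath expressions below are placeholders I assume for illustration, not values from the original answer; note that the callback is parse_item, never parse.

```python
from scrapy.linkextractors import LinkExtractor
from scrapy.spiders import CrawlSpider, Rule


class ExampleSpider(CrawlSpider):
    name = "example"
    allowed_domains = ["example.com"]        # placeholder domain
    start_urls = ["https://example.com/"]

    # Rule decides which links to follow; the callback must NOT be
    # named "parse", because CrawlSpider uses parse() internally.
    rules = (
        Rule(
            LinkExtractor(allow=r"\d+\.shtml"),  # e.g. pages ending in 2.shtml
            callback="parse_item",
            follow=True,
        ),
    )

    def parse_item(self, response):
        # Extract page content with XPath (placeholder expressions).
        yield {
            "title": response.xpath("//h1/text()").get(),
            "url": response.url,
        }
```

You would repeat this pattern with its own rules in a separate spider class for each of the 7 sites.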
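For point 3 and the per-site differences, here is one possible sketch, assuming the seven sites live as seven spiders in a project named myproject (the spider name, URL, and settings values are hypothetical). SCRAPY_SETTINGS_MODULE points Scrapy at the project-wide settings module, while each spider's custom_settings can override it with site-specific rules.

```python
import os

import scrapy
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Tell Scrapy which settings module to load (Python path syntax);
# it must be importable, i.e. on the Python import search path.
os.environ.setdefault("SCRAPY_SETTINGS_MODULE", "myproject.settings")


class SiteOneSpider(scrapy.Spider):
    name = "site_one"                          # hypothetical spider
    start_urls = ["https://site-one.example/"]

    # Per-spider overrides; each of the seven spiders can carry its own.
    custom_settings = {
        "DOWNLOAD_DELAY": 2,
    }

    def parse(self, response):
        yield {"url": response.url}


if __name__ == "__main__":
    process = CrawlerProcess(get_project_settings())
    process.crawl(SiteOneSpider)   # repeat process.crawl(...) for each site
    process.start()
```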
