高洛峰 2017-04-18 10:16:55
You can use the Scrapy framework to implement the crawler. If a crawl fails, retry it, e.g. up to 3 times; if it still fails after that, write the URL to a log so you can deal with it later yourself.
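A minimal sketch of that approach, assuming Scrapy's built-in RetryMiddleware (the RETRY_TIMES setting and Request errback are standard Scrapy features; the spider name and URL are placeholders):

import scrapy

class RetrySpider(scrapy.Spider):
    name = "retry_example"  # placeholder spider name
    start_urls = ["http://www.baidu.com"]

    # Cap the built-in RetryMiddleware at 3 retries per request,
    # as suggested in the answer above.
    custom_settings = {
        "RETRY_ENABLED": True,
        "RETRY_TIMES": 3,
    }

    def start_requests(self):
        for url in self.start_urls:
            # The errback fires only after all retries are exhausted.
            yield scrapy.Request(url, callback=self.parse, errback=self.on_error)

    def parse(self, response):
        self.logger.info("fetched %s", response.url)

    def on_error(self, failure):
        # Write the permanently failed URL to the log for later handling.
        self.logger.error("gave up on %s", failure.request.url)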
PHPz 2017-04-18 10:16:55
requests.get has a timeout parameter. Note that it is measured in seconds, so timeout=500 would wait over eight minutes; a small value is usually what is intended:

import requests

# Raises requests.exceptions.Timeout if no response arrives within 5 seconds.
a = requests.get("http://www.baidu.com", timeout=5)
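Combining the two answers, here is a short retry wrapper around requests.get (the fetch helper, the retry count, and the 1-second pause are illustrative assumptions, not from either answer):

import time
import requests

def fetch(url, retries=3, timeout=5):
    # Hypothetical helper: try the request up to `retries` times before giving up.
    for attempt in range(1, retries + 1):
        try:
            return requests.get(url, timeout=timeout)
        except requests.exceptions.RequestException as exc:
            print("attempt %d failed for %s: %s" % (attempt, url, exc))
            time.sleep(1)  # brief pause before the next attempt
    return None  # caller can log the URL and handle it later

a = fetch("http://www.baidu.com")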