
What are the crawler frameworks for Python?

爱喝马黛茶的安东尼 · Original · 2019-06-12


What are the crawler frameworks for Python? Here is an introduction to ten commonly used Python crawler frameworks and libraries:

1. Scrapy

Scrapy is a relatively mature Python crawler framework: a fast, high-level web crawling framework written in Python that can efficiently crawl web pages and extract structured data.

Scrapy has a wide range of applications, including crawler development, data mining, data monitoring, automated testing, etc.
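
To give a feel for how a Scrapy spider is structured, here is a minimal sketch; the target site quotes.toscrape.com and the CSS selectors are only illustrative and are not part of the original article:

```python
# Minimal Scrapy spider sketch; the target site and selectors are illustrative.
import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        # Extract structured data with CSS selectors.
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
        # Follow the pagination link, if there is one.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page is not None:
            yield response.follow(next_page, callback=self.parse)
```

Running it with `scrapy runspider quotes_spider.py -o quotes.json` writes the extracted items to a JSON file.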

2. PySpider

PySpider is a powerful web crawler framework written in Python by a Chinese developer. Its main features are:

1. Powerful WebUI, including a script editor, task monitor, project manager, and result viewer;
2. Multi-database support, including MySQL, MongoDB, Redis, SQLite, Elasticsearch, and PostgreSQL (via SQLAlchemy);
3. Supports RabbitMQ, Beanstalk, Redis, and Kombu as message queues;
4. Supports task priorities, scheduled tasks, retry on failure, and more;
5. Supports distributed crawling.
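
As a rough sketch of what a PySpider script looks like (modeled on the default template the WebUI generates; example.com is a placeholder):

```python
# Minimal PySpider handler sketch; example.com is a placeholder URL.
from pyspider.libs.base_handler import *


class Handler(BaseHandler):
    crawl_config = {}

    @every(minutes=24 * 60)          # scheduled task: run once a day
    def on_start(self):
        self.crawl('http://example.com/', callback=self.index_page)

    @config(age=10 * 24 * 60 * 60)   # treat results as fresh for 10 days
    def index_page(self, response):
        # Queue every outgoing http link for detail crawling.
        for each in response.doc('a[href^="http"]').items():
            self.crawl(each.attr.href, callback=self.detail_page)

    def detail_page(self, response):
        return {
            "url": response.url,
            "title": response.doc('title').text(),
        }
```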

3. Crawley

Crawley crawls website content at high speed, supports relational and non-relational databases, and can export data to JSON, XML, and other formats.


4. Portia

A tool for visually crawling and extracting web content.

5. newspaper

Extracts news and articles and performs content analysis.
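
A short sketch of the typical newspaper (newspaper3k) workflow; the article URL below is a placeholder:

```python
# Sketch of extracting an article with newspaper3k; the URL is a placeholder.
from newspaper import Article

url = "https://example.com/some-news-story"
article = Article(url)
article.download()   # fetch the raw HTML
article.parse()      # extract title, authors, body text, images

print(article.title)
print(article.authors)
print(article.text[:200])

article.nlp()        # optional: keyword/summary extraction (needs NLTK data)
print(article.keywords)
```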

6. python-goose

A Python port of Goose, an article extraction tool originally written in Java.
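
A rough usage sketch, assuming the classic python-goose package (the maintained fork exposes the same API under the name goose3); the URL is a placeholder:

```python
# Sketch of article extraction with python-goose; the URL is a placeholder.
from goose import Goose   # in the maintained fork: from goose3 import Goose

g = Goose()
article = g.extract(url="https://example.com/some-article")

print(article.title)
print(article.meta_description)
print(article.cleaned_text[:200])  # main body text with boilerplate stripped
```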


7. Beautiful Soup

Beautiful Soup is well known and covers many common crawling needs. Disadvantage: it cannot execute JavaScript.

Beautiful Soup is a Python library that extracts data from HTML or XML files. Working with your parser of choice, it provides idiomatic ways of navigating, searching, and modifying the parse tree, and it can save you hours or even days of work. I use it very frequently; all of my HTML element extraction is done with bs4.
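
For example, a minimal bs4 sketch that pairs it with requests to fetch a page; the URL and selectors are placeholders:

```python
# Sketch: fetch a page with requests and pull elements out with Beautiful Soup.
# The URL and the tag/class names are placeholders.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com/").text
soup = BeautifulSoup(html, "html.parser")   # or "lxml" if installed

print(soup.title.string)                    # text of the <title> tag

# Find all links on the page.
for a in soup.find_all("a", href=True):
    print(a["href"], a.get_text(strip=True))

# CSS-selector style lookup.
for item in soup.select("div.article h2"):
    print(item.get_text(strip=True))
```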

8. mechanize

Advantage: it simulates a stateful browser session (forms, cookies, following links). Disadvantage: documentation is severely lacking; still, with the official examples and some manual trial and error, it is barely usable.
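
A small sketch modeled on mechanize's official examples; the URL and the form/field names are assumptions for illustration:

```python
# Sketch of a stateful browsing session with mechanize.
# The URL and the form field names are placeholders.
import mechanize

br = mechanize.Browser()
br.set_handle_robots(False)          # ignore robots.txt for this example
br.addheaders = [("User-agent", "Mozilla/5.0")]

br.open("https://example.com/login")
br.select_form(nr=0)                 # pick the first form on the page
br["username"] = "alice"             # field names are assumed for illustration
br["password"] = "secret"
response = br.submit()

print(response.geturl())
print(response.read()[:200])
```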


9. Selenium

Selenium is a driver for automating a real browser. Through this library you can drive the browser directly to complete certain operations, such as entering a verification code.
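
A minimal Selenium sketch that drives Chrome; the URL and element locators are placeholders, and a matching ChromeDriver is assumed to be available:

```python
# Sketch: drive a real browser with Selenium; URL and locators are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()          # assumes a compatible ChromeDriver
try:
    driver.get("https://example.com/login")

    # Fill a form field and click a button, e.g. after typing a captcha by hand.
    driver.find_element(By.NAME, "username").send_keys("alice")
    driver.find_element(By.ID, "submit").click()

    print(driver.title)
    print(driver.page_source[:200])
finally:
    driver.quit()
```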


10. cola

A distributed crawler framework. The project's overall design is somewhat poor, with high coupling between modules.


