
Web crawling is a convenient way to collect information online and extract usable data from it, and the technique is becoming more and more useful. With a simple programming language like Python, you can crawl even complex websites with only a modest amount of programming skill.

Which book should you read to learn Python web crawling?

"Writing Web Crawler in Python" is an excellent guide to using Python to crawl network data. It explains how to crawl data from static pages and use cache to manage servers. load method. Additionally, this book explains how to scrape data using AJAX URLs and Firebug extensions, as well as more facts about scraping techniques such as using browser rendering, managing cookies, and submitting forms from complex websites protected by CAPTCHAs. Extract data etc. This book uses Scrapy to create an advanced web crawler and crawl some real websites.



"Writing a Web Crawler in Python" introduces the following content:

Crawl a website by following links (the first two items here are illustrated by the sketch after this list);

Use lxml to extract data from the page;

Build a threaded crawler to download pages in parallel;

Cache downloaded content to reduce bandwidth consumption;

Parse websites that rely on JavaScript;

Interact with forms and sessions;

Solve the CAPTCHAs used to protect some pages;

Reverse engineer AJAX calls;

Use Scrapy to create advanced crawlers.
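The first two items, following links and extracting data with lxml, can be combined into a short sketch. This is not code from the book; the start URL and link pattern are placeholders.

import re
from urllib.parse import urljoin
from urllib.request import urlopen

import lxml.html  # third-party: pip install lxml

def link_crawler(start_url, link_regex, max_pages=10):
    """Follow links whose absolute URL matches link_regex and print each page's title."""
    seen = set()
    queue = [start_url]
    while queue and len(seen) < max_pages:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        tree = lxml.html.fromstring(urlopen(url).read())
        print(url, "->", tree.findtext(".//title"))  # sample data extracted with lxml
        # Queue further links, resolved to absolute URLs, that match the pattern.
        for href in tree.xpath("//a/@href"):
            link = urljoin(url, href)
            if re.search(link_regex, link) and link not in seen:
                queue.append(link)

if __name__ == "__main__":
    link_crawler("https://example.com/", r"example\.com")  # hypothetical usage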

Who this book is for

This book is written for developers who want to build reliable web scraping solutions and assumes that readers have some Python programming experience. Of course, readers with experience in other programming languages can also read it and understand the concepts and principles involved.

About the author

Richard Lawson is from Australia and graduated from the University of Melbourne with a degree in computer science. After graduating he founded a company specializing in web scraping, serving businesses in more than 50 countries while working remotely. He is fluent in Esperanto, can converse in Chinese and Korean, and is active in open source software. He is currently pursuing a postgraduate degree at the University of Oxford and spends his spare time developing autonomous drones.

Table of Contents

Chapter 1 Introduction to Web Crawlers
1.1 When are web crawlers useful?
1.2 Are web crawlers legal?
1.3 Background research
1.3.1 Checking robots.txt
1.3.2 Checking the sitemap
1.3.3 Estimating the size of a website
1.3.4 Identifying the technology used by a website
1.3.5 Finding the website owner
1.4 Writing your first web crawler
1.4.1 Downloading web pages
1.4.2 Sitemap crawler
1.4.3 ID traversal crawler
1.4.4 Link crawler
1.5 Summary of this chapter

Chapter 2 Data Scraping
2.1 Analyzing a web page
2.2 Three approaches to scraping a web page
2.2.1 Regular expressions
2.2.2 Beautiful Soup
2.2.3 Lxml
2.2.4 Performance comparison
2.2.5 Conclusion
2.2.6 Adding a scrape callback to the link crawler
2.3 Summary of this chapter

Chapter 3 Caching Downloads
3.1 Adding cache support to the link crawler
3.2 Disk cache
3.2.1 Implementation
3.2.2 Testing the cache
3.2.3 Saving disk space
3.2.4 Expiring stale data
3.2.5 Drawbacks
3.3 Database cache
3.3.1 What is NoSQL?
3.3.2 Installing MongoDB
3.3.3 Overview of MongoDB
3.3.4 MongoDB cache implementation
3.3.5 Compression
3.3.6 Testing the cache
3.4 Summary of this chapter

Chapter 4 Concurrent Downloading
4.1 One million web pages
4.2 Sequential crawler
4.3 Threaded crawler
4.3.1 How threads and processes work
4.3.2 Implementation
4.3.3 Multiprocessing crawler
4.4 Performance
4.5 Summary of this chapter

Chapter 5 Dynamic Content
5.1 An example dynamic web page
5.2 Reverse engineering a dynamic web page
5.3 Rendering dynamic web pages
5.3.1 PyQt or PySide
5.3.2 Executing JavaScript
5.3.3 Interacting with the website using WebKit
5.3.4 Selenium
5.4 Summary of this chapter

Chapter 6 Form Interaction
6.1 The login form
6.2 Extending the login script to support content updates
6.3 Automating form handling with the Mechanize module
6.4 Summary of this chapter

Chapter 7 CAPTCHA Processing
7.1 Registering an account
7.2 Optical character recognition
7.3 Handling complex CAPTCHAs
7.3.1 Using a CAPTCHA-solving service
7.3.2 Getting started with 9kw
7.3.3 Integrating with the registration function
7.4 Summary of this chapter

Chapter 8 Scrapy (see the spider sketch after this table of contents)
8.1 Installation
8.2 Starting a project
8.2.1 Defining a model
8.2.2 Creating a crawler
8.2.3 Scraping with the shell command
8.2.4 Checking results
8.2.5 Interrupting and resuming a crawl
8.3 Visual scraping with Portia
8.3.1 Installation
8.3.2 Annotation
8.3.3 Optimizing the crawler
8.3.4 Checking results
8.4 Automated scraping with Scrapely
8.5 Summary of this chapter

Chapter 9 Summary
9.1 Google search engine
9.2 Facebook
9.2.1 The website
9.2.2 The API
9.3 Gap
9.4 BMW
9.5 Summary of this chapter
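As a taste of the Scrapy material in Chapter 8, here is a minimal spider sketch. Again, this is not code from the book; the spider name, start URL, and selectors are placeholders.

import scrapy  # third-party: pip install scrapy

class ExampleSpider(scrapy.Spider):
    """Collect each page's URL and title, following every link it finds."""
    name = "example"
    start_urls = ["https://example.com/"]  # placeholder start page

    def parse(self, response):
        # Yield one item per page: its URL and <title> text.
        yield {
            "url": response.url,
            "title": response.css("title::text").get(),
        }
        # Follow every link on the page and parse it with this same callback.
        for href in response.css("a::attr(href)").getall():
            yield response.follow(href, callback=self.parse)

Saved as example_spider.py, it can be run without creating a full Scrapy project using the command: scrapy runspider example_spider.py -o items.json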

The above is the detailed content of "Which book should you read to learn Python web crawling?". For more information, please follow other related articles on the PHP Chinese website!
