
This article introduces the basics of writing a web crawler in Python. A web crawler, or Web Spider, is a very vivid name: if the Internet is compared to a spider's web, then the Spider is a spider crawling around on that web. Readers interested in web crawlers may find this article a useful reference.

1. The definition of a web crawler

A web spider finds web pages by following their link addresses. Starting from one page of a website (usually the home page), it reads the page's content, finds the other link addresses it contains, fetches the next pages through those addresses, and keeps looping until every page of the website has been crawled. If the entire Internet is regarded as one website, a web spider can use the same principle to crawl every page on the Internet. In short, a web crawler is a program that crawls web pages, and fetching pages is its basic operation.
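
This loop can be sketched in a few lines of Python. The sketch below only illustrates the idea and is not production code: example.com and the regular expression used to pull out links are hypothetical placeholders, and it only follows links within the same site.

import re
import urllib2

start_url = 'http://example.com/'          # hypothetical starting page (usually the home page)
to_visit = [start_url]                     # queue of link addresses still to be crawled
visited = set()                            # pages that have already been read

while to_visit:
    url = to_visit.pop(0)
    if url in visited:
        continue
    visited.add(url)
    try:
        html = urllib2.urlopen(url).read() # read the content of the web page
    except Exception:
        continue                           # skip pages that cannot be fetched
    # find other link addresses in the page and queue the ones not seen yet
    for link in re.findall(r'href="(http://example\.com/[^"]+)"', html):
        if link not in visited:
            to_visit.append(link)

print 'crawled %d pages' % len(visited)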

2. The process of browsing a web page

The process of crawling a web page is essentially the same as what happens when a reader browses a page in a browser such as IE. For example, you enter the address www.baidu.com in the browser's address bar.

Opening a web page actually means that the browser, acting as a browsing "client", sends a request to the server, "grabs" the server-side files to the local machine, and then interprets and displays them.

HTML is a markup language that uses tags to mark up content so that it can be parsed and distinguished. The browser's job is to parse the HTML code it receives and render that source code into the page we actually see.

3. Web crawler functionality based on Python

1). Getting an HTML page with Python

In fact, the most basic page grab takes just two lines (the examples in this article use Python 2, where the urllib2 module is available):


import urllib2
# fetch the raw HTML of the page; the URL here is just a placeholder
content = urllib2.urlopen('http://XXXX').read()

In this way you get the entire HTML document. The key issue is that we usually want to extract the useful information from this document rather than keep the whole thing, and that requires parsing HTML filled with all kinds of tags.
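
One practical note before moving on to parsing: some servers refuse requests that do not carry a browser-like User-Agent header. A hedged variation of the two-line snippet above (the URL is still a placeholder) passes such a header through urllib2.Request:

import urllib2

# build the request with a browser-like User-Agent; the URL is a placeholder
req = urllib2.Request('http://XXXX', headers={'User-Agent': 'Mozilla/5.0'})
content = urllib2.urlopen(req).read()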

2). How to parse the HTML after the crawler fetches a page

Python crawler HTML parsing library: SGMLParser

Python ships with parsers such as HTMLParser and SGMLParser by default. The former is rather awkward to use, so here is a sample program written with SGMLParser:


import urllib2
from sgmllib import SGMLParser

class ListName(SGMLParser):
    def __init__(self):
        SGMLParser.__init__(self)
        self.is_h4 = ""        # flag: are we currently inside an h4 tag?
        self.name = []         # collected h4 contents

    def start_h4(self, attrs):
        self.is_h4 = 1

    def end_h4(self):
        self.is_h4 = ""

    def handle_data(self, text):
        # only keep text that appears inside an h4 tag
        if self.is_h4 == 1:
            self.name.append(text)

content = urllib2.urlopen('http://169it.com/xxx.htm').read()
listname = ListName()
listname.feed(content)
for item in listname.name:
    print item.decode('gbk').encode('utf8')

It is very simple. A class called ListName is defined here, inheriting from SGMLParser. A variable is_h4 is used as a flag to track h4 tags in the HTML file: when an h4 tag is encountered, the text inside it is appended to the list variable name. A word on the start_h4() and end_h4() functions; their prototypes are



start_tagname(self, attrs)
end_tagname(self)

where tagname is the tag name; for example, when <pre> is encountered, start_pre will be called, and end_pre will be called when </pre> is encountered. attrs holds the tag's attributes, returned as a list of the form [(attribute, value), (attribute, value), ...].
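
As an illustration of the attrs parameter, here is a minimal sketch that collects the href attribute of every a tag on a page; it reuses the same URL as the example above:

import urllib2
from sgmllib import SGMLParser

class LinkCollector(SGMLParser):
    def reset(self):
        # reset() is called by SGMLParser's constructor, so the list is set up here
        SGMLParser.reset(self)
        self.links = []

    def start_a(self, attrs):
        # attrs arrives as [(attribute, value), ...], e.g. [('href', 'http://...')]
        for attr, value in attrs:
            if attr == 'href':
                self.links.append(value)

content = urllib2.urlopen('http://169it.com/xxx.htm').read()
parser = LinkCollector()
parser.feed(content)
for link in parser.links:
    print link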

Python crawler HTML parsing library: pyQuery

pyQuery is a Python implementation of jQuery; using jQuery syntax, it is very convenient for operating on and parsing HTML documents. You need to install it before use, for example with easy_install pyquery, or under Ubuntu:


sudo apt-get install python-pyquery

The following is an example:


from pyquery import PyQuery as pyq

doc = pyq(url=r'http://169it.com/xxx.html')
cts = doc('.market-cat')

for i in cts:
    # print the category title held in the h4 tag
    print '====', pyq(i).find('h4').text(), '===='
    # then print each entry under the .sub class
    for j in pyq(i).find('.sub'):
        print pyq(j).text(),
    print '\n'
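
pyQuery does not have to fetch the URL itself; it can also parse an HTML string that has already been downloaded, for example with urllib2. Here is a small sketch, reusing the same hypothetical URL and class names as above:

import urllib2
from pyquery import PyQuery as pyq

content = urllib2.urlopen('http://169it.com/xxx.html').read()
doc = pyq(content)                 # build the document from the HTML string
for h4 in doc('.market-cat h4'):   # jQuery-style CSS selector
    print pyq(h4).text()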

Python crawler HTML parsing library: BeautifulSoup

One headache is that most web pages are not written in full compliance with the standards, and all kinds of inexplicable errors make you want to track down the person who wrote the page and beat him up. To solve this problem, we can choose the well-known BeautifulSoup to parse HTML documents; it has good fault tolerance.
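
For completeness, here is a minimal sketch, assuming the beautifulsoup4 package is installed (pip install beautifulsoup4); it mirrors the SGMLParser example above by collecting the text of every h4 tag:

import urllib2
from bs4 import BeautifulSoup

content = urllib2.urlopen('http://169it.com/xxx.htm').read()
# the built-in 'html.parser' copes reasonably well with malformed markup
soup = BeautifulSoup(content, 'html.parser')
for h4 in soup.find_all('h4'):
    print h4.get_text()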

The above is the entire content of this article. It provides a detailed analysis and introduction to implementing web crawler functionality in Python, and I hope it will be helpful to everyone's learning.
