


With the rapid development of the Internet, ever more data floods our era. Obtaining and processing data has become an essential part of our lives, and crawlers have emerged to meet that need.
Many languages can be used to write crawlers, but crawlers written in Python are more concise and convenient, and crawling has become an indispensable part of the Python ecosystem. So what kinds of data can we obtain through crawlers, and what parsing methods are available?
In the previous article, I introduced the basic flow of Request and Response. This article covers what kinds of data a crawler can obtain and the specific methods for parsing it.
## What kind of data can be captured?
Web page text: HTML documents, JSON-format text loaded by Ajax, and so on;
Pictures, videos, and other media: binary responses saved in the appropriate image or video format;
Anything else that can be requested can also be obtained.
Demo
```python
import requests

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/55.0.2883.87 Safari/537.36'}
resp = requests.get('http://www.baidu.com/img/baidu_jgylogo3.gif', headers=headers)
print(resp.content)  # use content for binary data

# save the image
with open('logo.gif', 'wb') as f:
    f.write(resp.content)
print('Ok')
```
After the script runs successfully, you can see the image's binary data printed, followed by "Ok", and the downloaded image appears in the working directory. These few lines of code demonstrate the basic process of a crawler saving a file.
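The demo above hard-codes binary handling. As a rough sketch of how a crawler can choose between text and binary saving, the helper below inspects the response's Content-Type header; `save_response` and `FakeResponse` are illustrative names (the fake response stands in for a real `requests.Response` so the sketch runs offline), not part of the requests library.

```python
def save_response(resp, path):
    """Save a response as text or binary depending on its Content-Type."""
    ctype = resp.headers.get('Content-Type', '')
    if ctype.startswith('text/') or 'json' in ctype:
        # textual content: decode and write as text
        with open(path, 'w', encoding='utf-8') as f:
            f.write(resp.text)
        return 'text'
    # anything else (images, video, ...): write raw bytes
    with open(path, 'wb') as f:
        f.write(resp.content)
    return 'binary'

# Tiny stand-in for a requests.Response, so the sketch runs without a network.
class FakeResponse:
    def __init__(self, headers, content):
        self.headers = headers
        self.content = content
        self.text = content.decode('utf-8', errors='replace')

html_resp = FakeResponse({'Content-Type': 'text/html; charset=utf-8'}, b'<html></html>')
gif_resp = FakeResponse({'Content-Type': 'image/gif'}, b'GIF89a...')

print(save_response(html_resp, 'page.html'))  # text
print(save_response(gif_resp, 'logo2.gif'))   # binary
```

With a real `requests.Response` the same helper works unchanged, since it only touches `headers`, `text`, and `content`.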
## What are the parsing methods?
Direct processing: for simple page documents, just strip whitespace and similar noise;
JSON parsing: for handling Ajax-loaded data;
Regular expressions;
The BeautifulSoup library;
PyQuery;
XPath.
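To make the list above concrete, here is a small sketch of three of these methods on sample data. The Ajax payload and HTML fragment are made up for illustration; the structured-parser example uses the standard library's `HTMLParser` to stay self-contained, where a real project would more likely reach for BeautifulSoup, PyQuery, or XPath via lxml.

```python
import json
import re
from html.parser import HTMLParser

# Sample payload such as an Ajax endpoint might return (hypothetical data).
ajax_body = '{"query": "python", "results": [{"title": "Crawler basics"}, {"title": "Request and Response"}]}'
data = json.loads(ajax_body)            # JSON text -> Python dict
titles = [item["title"] for item in data["results"]]
print(titles)                           # ['Crawler basics', 'Request and Response']

# Sample HTML fragment (hypothetical).
html_doc = '<div class="post"><a href="/a/1">First post</a><a href="/a/2">Second post</a></div>'

# Regular expression: quick extraction when the markup is simple and regular.
links = re.findall(r'href="([^"]+)"', html_doc)
print(links)                            # ['/a/1', '/a/2']

# A structured parser is more robust than a regex on real-world pages:
# this one collects the text inside every <a> tag.
class LinkTextParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_a = False
        self.texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_a = True

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_a = False

    def handle_data(self, data):
        if self.in_a:
            self.texts.append(data)

parser = LinkTextParser()
parser.feed(html_doc)
print(parser.texts)                     # ['First post', 'Second post']
```

Which method to pick depends on the page: JSON parsing when the data comes from an Ajax API, a regex for one-off extraction from regular markup, and a structured parser for anything nested or messy.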
## Summary
Having read this far, do you now have a clear understanding of how crawlers work? Of course, Rome was not built in a day: accumulate enough experience and you will certainly become a crawler master. I believe everyone will succeed after reading through the material I have shared.
