


Implementing a weather crawler in Python in two simple steps
Crawlers may seem mysterious, but they are not as magical as we imagine. (Of course, the crawlers behind Google and Baidu are complex and powerful, but their power lies less in the crawler itself than in the back-end data processing and data-mining algorithms.) Today we will unveil that mystery: you can implement a web weather crawler in just two simple steps.
Simply put, a crawler consists of two parts: 1. fetching the text of the web page; 2. parsing that text to extract the data we want.
1. Obtain web page text information.
Fetching HTML in Python is very convenient: with the help of the urllib library, the functionality we need takes only a few lines of code.
# Import the urllib library (urllib.request in Python 3)
import urllib.request

def getHtml(url):
    page = urllib.request.urlopen(url)
    html = page.read().decode('utf-8')
    page.close()
    return html
What is returned here is the source code of the web page, i.e., the HTML.
So how do we extract the information we want from it? For that we need the most commonly used tool in web scraping: regular expressions.
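To see this fetching step in action without hitting a real weather site, here is a self-contained sketch that serves a tiny stand-in HTML file over a file:// URL. The temp file and its contents are invented for illustration; in practice you would pass the weather page's http URL instead.

```python
import os
import tempfile
import urllib.request
from pathlib import Path

def getHtml(url):
    # Open the URL and return the raw response body as text
    page = urllib.request.urlopen(url)
    html = page.read().decode('utf-8')
    page.close()
    return html

# Write a tiny stand-in HTML file so the demo needs no network access
with tempfile.NamedTemporaryFile('w', suffix='.html', delete=False) as f:
    f.write('<b>3</b>')
    path = f.name

html = getHtml(Path(path).as_uri())
print(html)  # prints the raw HTML source: <b>3</b>
os.unlink(path)
```

The same getHtml function works unchanged for http URLs; only the demo scaffolding around it is specific to the local file.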
2. Obtain the required content based on regular expressions, etc.
When writing a regular expression, you need to study the structure of the page's HTML carefully so that the expression matches it correctly.
Using regular expressions in Python is also very simple:
# Import the regular expression library
import re

def getWeather(html):
    reg = '<a title=.*?>(.*?)</a>.*?<span>(.*?)</span>.*?<b>(.*?)</b>'
    # re.S lets "." match newlines, since the fields may span several lines
    weatherList = re.compile(reg, re.S).findall(html)
    return weatherList
Explanation:
Here reg is the regular expression and html is the text obtained in the first step. findall finds every substring of html that matches the pattern and stores the results in weatherList; we can then enumerate weatherList to output the data.
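As a sketch of how findall behaves here, below the same regular expression is applied to a made-up HTML fragment. The fragment only imitates the structure the regex expects (the real weather page will look different), and the re.S flag lets "." match newlines when the fields are spread across lines.

```python
import re

# Made-up fragment imitating the structure the regex expects
sample_html = '''
<a title="Beijing weather">Beijing</a> ... <span>-5</span> ... <b>3</b>
<a title="Shanghai weather">Shanghai</a> ... <span>2</span> ... <b>8</b>
'''

reg = '<a title=.*?>(.*?)</a>.*?<span>(.*?)</span>.*?<b>(.*?)</b>'
# re.S lets "." match newlines as well
weatherList = re.compile(reg, re.S).findall(sample_html)
for city, low, high in weatherList:
    print(city, low, high)
```

Each element of weatherList is a tuple of the three captured groups: (city, lowest temperature, highest temperature).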
There are two things to note about the regular expression reg here.
One is "(.*?)". The content inside the parentheses () is what gets captured; when there are multiple groups, each result returned by findall is a tuple containing all of them. The three groups above correspond to the city, the lowest temperature, and the highest temperature.
The other is ".*?". Python's regular matching is greedy by default, meaning it matches as long a string as possible. Adding a question mark switches it to non-greedy mode, matching as short a string as possible. Since there are multiple cities to match here, non-greedy mode is required; otherwise a single over-long match would swallow them all, and the result would be incorrect.
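The difference is easy to see on a toy string (the string is invented for illustration):

```python
import re

s = '<b>7</b><b>12</b>'
greedy = re.findall('<b>(.*)</b>', s)
lazy = re.findall('<b>(.*?)</b>', s)
print(greedy)  # ['7</b><b>12']  greedy: one match swallowing both tags
print(lazy)    # ['7', '12']     non-greedy: one result per tag
```

With the greedy ".*", the single match runs from the first <b> to the last </b>; the non-greedy ".*?" stops at the first closing tag, which is what the weather regex needs to get one result per city.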


