1. The goal of this article
Fetch the Ajax request and parse the required fields from the JSON response
Save the data to Excel
Save the data to MySQL for easy analysis
2. Analysis results
[Figure: Average salary levels of Python positions in five cities]
1. Introduction of the library
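The code in this excerpt relies on a handful of libraries: requests for the Ajax calls, openpyxl for writing the Excel file, and random/time for spacing out requests. A MySQL driver such as pymysql is an assumption for the database part, since that helper code is not shown here. A minimal set of imports might look like this:

import random                   # randomize the delay between requests
import time                     # pause between requests

import requests                 # send the Ajax POST requests
from openpyxl import Workbook   # write results to an .xlsx file
# import pymysql                # assumed MySQL driver for get_conn()/insert(); not shown in this excerpt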
2. Page structure
Take Python as the search keyword, leave the other conditions at their defaults, and click Query to list all Python positions. Then open the browser's developer console and switch to the Network tab; you will see the following request:
Judging from the response, this request is exactly what we need, so we can simply hit this address directly later. As the screenshot shows, the result contains the information for each position.
Now we know where to send the request and where the results come back. But the result list only holds 15 records for the first page, so how do we get the data on the other pages?
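For reference, the address the page requests and the place the position list lives in its JSON response (both taken from the code later in this article) are roughly as follows:

# Ajax address requested by the page; the city goes into the query string
url = 'https://www.lagou.com/jobs/positionAjax.json?city={}&needAddtionalResult=false'
# In the JSON response, the list of positions is found at:
#   response_json['content']['positionResult']['result']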
3. Request parameters
Click the parameters tab; it shows the following:
Three form fields are submitted. Clearly, kd is the keyword we searched for and pn is the current page number; first can simply be left at its default value. All that remains is to construct requests for all 30 pages of data.
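To make that concrete, the 30 payloads we will send differ only in pn (the keyword 'python' here is just the example used throughout the article):

# Only 'pn' changes between requests; 'kd' is the search keyword
# and 'first' keeps its default value
for page in range(1, 31):
    data = {'first': 'true', 'pn': page, 'kd': 'python'}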
4. Constructing requests and parsing data
Constructing the request is straightforward; we again use the requests library. First, build the form data:
data = {'first': 'true', 'pn': page, 'kd': lang_name}
then POST it to the URL with requests and parse the JSON response, and we are done. Since Lagou imposes strict anti-crawler restrictions, we need to include every header field the browser sends and add a delay between requests; I set it to 10-20 seconds, after which the data can be fetched normally.
import requests


def get_json(url, page, lang_name):
    """Request one page of results and return a list of position records."""
    # All header fields are copied from the browser; Lagou rejects requests without them
    headers = {
        'Host': 'www.lagou.com',
        'Connection': 'keep-alive',
        'Content-Length': '23',
        'Origin': 'https://www.lagou.com',
        'X-Anit-Forge-Code': '0',
        'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0',
        'Content-Type': 'application/x-www-form-urlencoded; charset=UTF-8',
        'Accept': 'application/json, text/javascript, */*; q=0.01',
        'X-Requested-With': 'XMLHttpRequest',
        'X-Anit-Forge-Token': 'None',
        'Referer': 'https://www.lagou.com/jobs/list_python?city=%E5%85%A8%E5%9B%BD&cl=false&fromSearch=true&labelWords=&suginput=',
        'Accept-Encoding': 'gzip, deflate, br',
        'Accept-Language': 'en-US,en;q=0.9,zh-CN;q=0.8,zh;q=0.7'
    }
    # Form data: pn is the page number, kd is the search keyword
    data = {'first': 'false', 'pn': page, 'kd': lang_name}
    json = requests.post(url, data, headers=headers).json()
    # The position list sits under content -> positionResult -> result
    list_con = json['content']['positionResult']['result']
    info_list = []
    for i in list_con:
        info = []
        info.append(i.get('companyShortName', '无'))
        info.append(i.get('companyFullName', '无'))
        info.append(i.get('industryField', '无'))
        info.append(i.get('companySize', '无'))
        info.append(i.get('salary', '无'))
        info.append(i.get('city', '无'))
        info.append(i.get('education', '无'))
        info_list.append(info)
    return info_list
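A quick way to sanity-check get_json on a single page might look like the following; the city value and the printout are only illustrative, and the URL follows the pattern used in main() below:

# Fetch one page of Beijing results and print each row
url = 'https://www.lagou.com/jobs/positionAjax.json?city=北京&needAddtionalResult=false'
rows = get_json(url, 1, 'python')
for row in rows:
    # [companyShortName, companyFullName, industryField, companySize, salary, city, education]
    print(row)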
5. Get all data
Now that we understand how to parse the data, the only thing left is to request all pages continuously. We construct a function to request all 30 pages of data.
import random
import time

from openpyxl import Workbook   # not shown in the original excerpt, but needed for Workbook()


def main():
    lang_name = 'python'
    wb = Workbook()      # Excel workbook for the results
    conn = get_conn()    # MySQL connection helper (not shown in this excerpt)
    for i in ['北京', '上海', '广州', '深圳', '杭州']:
        page = 1
        ws1 = wb.active
        ws1.title = lang_name
        url = 'https://www.lagou.com/jobs/positionAjax.json?city={}&needAddtionalResult=false'.format(i)
        while page < 31:
            info = get_json(url, page, lang_name)
            page += 1
            # Random 10-20 second pause to avoid triggering Lagou's anti-crawler checks
            time.sleep(random.randint(10, 20))
            for row in info:
                insert(conn, tuple(row))  # write to MySQL (insert() is not shown in this excerpt)
                ws1.append(row)           # write the same row to the Excel sheet
    conn.close()
    wb.save('{}职位信息.xlsx'.format(lang_name))


if __name__ == '__main__':
    main()
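main() depends on two MySQL helpers, get_conn() and insert(), which this excerpt does not show. A minimal sketch of what they might look like is below, assuming pymysql and a hypothetical table named positions whose columns match the seven fields collected by get_json; the connection parameters are placeholders:

import pymysql


def get_conn():
    # Placeholder connection parameters; adjust to your own MySQL setup
    return pymysql.connect(host='localhost', user='root', password='password',
                           db='lagou', charset='utf8mb4')


def insert(conn, row):
    # row is (companyShortName, companyFullName, industryField,
    #         companySize, salary, city, education)
    sql = ('INSERT INTO positions '
           '(company_short_name, company_full_name, industry_field, '
           'company_size, salary, city, education) '
           'VALUES (%s, %s, %s, %s, %s, %s, %s)')
    with conn.cursor() as cur:
        cur.execute(sql, row)
    conn.commit()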
The above is the detailed content of How to use Python to implement job analysis reports. For more information, please follow other related articles on the PHP Chinese website!
