


This article introduces a simple implementation of a Python image crawler. Friends in need can refer to it.
A simple implementation of a Python image crawler
I often browse Zhihu, and sometimes I want to save the pictures from certain questions all at once; hence this program. It is a very simple image crawler and can only grab the images that have already been loaded on the page. Since I am not very familiar with this area, I will just say a few words and record the code without explaining too much. If you are interested, feel free to use it directly; I have tested it myself on sites such as Zhihu.
The previous article showed how to open an image from its URL. The point is to first see what the crawled images look like, and then filter and save them.
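That step could look roughly like the following minimal sketch (assuming requests and Pillow are installed; the image URL here is only a placeholder, not one from the article):

# Fetch an image by URL and display it, to inspect it before filtering/saving
import requests
from io import BytesIO
from PIL import Image

img_url = "https://example.com/some-image.jpg"  # placeholder URL
resp = requests.get(img_url)
image = Image.open(BytesIO(resp.content))
print(image.size)   # (width, height)
image.show()        # opens the image in the default viewer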
The requests library is used here to fetch the page. Note that a User-Agent header is needed to disguise the program as a browser, otherwise the request may be rejected by the server. BeautifulSoup then filters out the excess markup to get the image addresses. Once an image has been fetched, small pictures such as avatars and emoticons are filtered out based on the image's size. Finally, there are several options for opening or saving the images, including OpenCV, skimage, and PIL; the key steps are sketched below, followed by the full program.
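As an isolated illustration of the two points above (the browser disguise and the size-based filter), a simplified sketch might look like this, assuming requests, BeautifulSoup, lxml and Pillow are installed; the User-Agent string is just a stand-in:

import requests
from bs4 import BeautifulSoup
from io import BytesIO
from PIL import Image

headers = {'User-Agent': 'Mozilla/5.0'}   # pretend to be a browser
page = requests.get("https://www.zhihu.com/question/37787176", headers=headers)
soup = BeautifulSoup(page.content, 'lxml')

for img in soup.find_all('img'):
    src = img.get('src')
    if not src or not src.startswith("http"):
        continue
    pic = Image.open(BytesIO(requests.get(src, headers=headers).content))
    w, h = pic.size
    if w >= 500 and h >= 500:             # skip small avatars and emoticons
        print("keep", src, w, h)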
The complete program is as follows:
# -*- coding: utf-8 -*-
import os
from io import BytesIO

import requests as req
from bs4 import BeautifulSoup
from PIL import Image

# Imports for the OpenCV variant
import numpy as np
import urllib.request
import cv2

# Import for the skimage variant
# from skimage import io

url = "https://www.zhihu.com/question/37787176"
# Disguise the request as coming from a browser, otherwise the server may refuse it
headers = {'User-Agent': 'Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.96 Mobile Safari/537.36'}

response = req.get(url, headers=headers)
soup = BeautifulSoup(response.content, 'lxml')
images = soup.find_all('img')
print("Found %d images in total" % len(images))

if not os.path.exists("images"):
    os.mkdir("images")

for i, img in enumerate(images):
    print("Processing image %d..." % (i + 1))
    img_src = img.get('src')
    if not img_src or not img_src.startswith("http"):
        continue

    ## Variant 1: use PIL
    '''
    response = req.get(img_src, headers=headers)
    image = Image.open(BytesIO(response.content))
    w, h = image.size
    print(w, h)
    img_path = "images/" + str(i + 1) + ".jpg"
    if w >= 500 and h >= 500:
        # image.show()
        image.save(img_path)
    '''

    ## Variant 2: use OpenCV
    resp = urllib.request.urlopen(img_src)
    image = np.asarray(bytearray(resp.read()), dtype="uint8")
    image = cv2.imdecode(image, cv2.IMREAD_COLOR)
    h, w = image.shape[:2]          # shape is (height, width, channels)
    print(w, h)
    img_path = "images/" + str(i + 1) + ".jpg"
    if w >= 400 and h >= 400:
        cv2.imshow("Image", image)
        cv2.waitKey(3000)
        # cv2.imwrite(img_path, image)

    ## Variant 3: use skimage
    # image = io.imread(img_src)
    # h, w = image.shape[:2]
    # print(w, h)
    # io.imshow(image)
    # io.show()
    # img_path = "images/" + str(i + 1) + ".jpg"
    # if w >= 500 and h >= 500:
    #     io.imsave(img_path, image)

print("Done!")
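To run the script as written, the requests, beautifulsoup4, lxml, Pillow, numpy and opencv-python packages need to be installed (plus scikit-image if the skimage variant is uncommented); all of them are available via pip.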


