Many websites deploy anti-crawler measures. For example, a site may count how many requests come from a given IP within a period of time; if the access frequency is too high to look like a normal visitor, it may ban that IP. So we need to set up proxy servers and switch the proxy every so often: even if one IP gets banned, we can change to another IP and keep crawling.
In Python, you can use the ProxyHandler in urllib2 to set up a proxy server. The following code shows how to use a proxy:
import urllib2

# Build two proxy Handlers: one with a proxy IP, one without
httpproxy_handler = urllib2.ProxyHandler({"http": "124.88.67.81:80"})
nullproxy_handler = urllib2.ProxyHandler({})

# Define a proxy switch
proxySwitch = True

# Use urllib2.build_opener() with these proxy Handler objects to create a custom opener,
# choosing the proxy mode according to whether the switch is on
if proxySwitch:
    opener = urllib2.build_opener(httpproxy_handler)
else:
    opener = urllib2.build_opener(nullproxy_handler)

request = urllib2.Request("http://www.baidu.com/")

# Only requests sent with opener.open() use the custom proxy; urlopen() does not.
response = opener.open(request)

# Install the opener globally so that every later request, whether sent with
# opener.open() or urlopen(), uses the custom proxy.
# urllib2.install_opener(opener)
# response = urlopen(request)

print response.read()
The proxy used above is a free open proxy. We can collect such free proxies from proxy websites, test them, and keep the ones that still work for use in the crawler (see the sketch after the list below).
Free proxy websites:
Xici (西刺) free proxy
Kuaidaili (快代理) free proxy
National proxy IP
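A minimal sketch of testing whether a collected free proxy still works before adding it to the pool; check_proxy is a hypothetical helper and the proxy address is just a placeholder:

import urllib2

def check_proxy(proxy_ip):
    # Build an opener that routes HTTP traffic through the candidate proxy
    handler = urllib2.ProxyHandler({"http": proxy_ip})
    opener = urllib2.build_opener(handler)
    try:
        # Fetch a known page through the proxy with a short timeout
        response = opener.open("http://www.baidu.com/", timeout=5)
        return response.getcode() == 200
    except Exception:
        return False

print check_proxy("124.88.67.81:80")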
If you have enough proxies, you can put them in a list and randomly select one to access the website, as follows:
import urllib2
import random

proxy_list = [
    {"http": "124.88.67.81:80"},
    {"http": "124.88.67.81:80"},
    {"http": "124.88.67.81:80"},
    {"http": "124.88.67.81:80"},
    {"http": "124.88.67.81:80"}
]

# Randomly pick one proxy
proxy = random.choice(proxy_list)

# Build a proxy handler object with the selected proxy
httpproxy_handler = urllib2.ProxyHandler(proxy)
opener = urllib2.build_opener(httpproxy_handler)

request = urllib2.Request("http://www.baidu.com/")
response = opener.open(request)
print response.read()
The proxies above are all free proxies, which are not very stable and often stop working. In that case, you can consider using a private proxy: you purchase a proxy from a proxy supplier, who provides a working proxy along with its own username and password. It is used in the same way as a free proxy, with the addition of account authentication, as follows:
# Build a Handler with one private proxy IP, where user is the account and passwd is the password
httpproxy_handler = urllib2.ProxyHandler({"http": "user:passwd@124.88.67.81:80"})
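A minimal sketch of using that private proxy handler end to end; user, passwd and the IP are placeholders for the credentials your supplier provides:

import urllib2

# Private proxy in user:passwd@host:port form (placeholder values)
httpproxy_handler = urllib2.ProxyHandler({"http": "user:passwd@124.88.67.81:80"})
opener = urllib2.build_opener(httpproxy_handler)

request = urllib2.Request("http://www.baidu.com/")
response = opener.open(request)
print response.read()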
Setting up a proxy with urllib2 as shown above is a bit cumbersome. Let's take a look at how to use a proxy with requests.
Using a free proxy:
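A minimal sketch of passing a free proxy to requests via the proxies parameter; the address is only an illustrative placeholder:

import requests

# Choose the proxy entry by protocol type (placeholder free proxy address)
proxies = {
    "http": "http://124.88.67.81:80",
    "https": "http://124.88.67.81:80",
}

response = requests.get("http://www.baidu.com", proxies=proxies)
print response.text

A private proxy that requires HTTP Basic Auth can be passed in the format shown in the next snippet.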
import requests

# If the proxy requires HTTP Basic Auth, you can use the following format:
proxy = { "http": "mr_mao_hacker:sffqry9r@61.158.163.130:16816" }

response = requests.get("http://www.baidu.com", proxies=proxy)
print response.text
Note: you can store the username and password in environment variables to avoid leaking them, as sketched below.
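A minimal sketch of reading the proxy credentials from environment variables instead of hard-coding them; PROXY_USER and PROXY_PASSWD are hypothetical variable names:

import os
import requests

# Read the proxy credentials from environment variables (hypothetical names)
proxy_user = os.environ.get("PROXY_USER")
proxy_passwd = os.environ.get("PROXY_PASSWD")

proxy = {
    "http": "http://%s:%s@61.158.163.130:16816" % (proxy_user, proxy_passwd)
}

response = requests.get("http://www.baidu.com", proxies=proxy)
print response.text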
The above is the detailed content of How to set up a proxy for Python crawler. For more information, please follow other related articles on the PHP Chinese website!
