As before, let’s start by talking through the ideas behind building the crawler and the background knowledge needed to prepare. Experienced readers can skip this part.
First of all, let’s think about what we want to do and list some simple requirements.
The requirements are as follows:
1. Simulate visiting Zhihu official website (http://www.zhihu.com/)
2. Download the content of specified pages, including: today’s hottest, this month’s hottest, and editor’s recommendations
3. Download all questions and answers in the specified categories, such as: investment, programming, failing courses
4. Download all answers from the specified respondent
5. Ideally, a one-click like feature (so I can like all of Lei Lun’s answers at once; aren’t I clever!)
The technical problems that need to be solved are briefly listed as follows:
1. Simulate the browser to access the webpage
2. Grab the key data and save it locally
3. Solve the dynamic loading problem in web browsing
4. Use a tree structure to massively crawl all content on Zhihu
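To make problems 1 and 2 concrete, here is a minimal sketch (the class and method names are my own illustration, not from the original tutorial): it requests a page through the standard `HttpURLConnection` with a browser-like User-Agent header, and pulls hyperlinks out of the returned HTML with a simple regex.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SpiderSketch {
    // Problem 1: simulate a browser by sending a browser-like User-Agent header.
    static String fetchPage(String urlStr) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(urlStr).openConnection();
        conn.setRequestProperty("User-Agent",
                "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"); // pretend to be a browser
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                sb.append(line).append('\n');
            }
        }
        return sb.toString();
    }

    // Problem 2: grab key data -- here, every href target -- out of the HTML.
    static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = Pattern.compile("href=\"(.*?)\"").matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    public static void main(String[] args) {
        // Offline demonstration on a hand-written HTML snippet.
        String sample = "<a href=\"http://www.zhihu.com/question/1\">Q1</a>"
                      + "<a href=\"http://www.zhihu.com/question/2\">Q2</a>";
        System.out.println(extractLinks(sample));
        // [http://www.zhihu.com/question/1, http://www.zhihu.com/question/2]
    }
}
```

Saving the grabbed text locally is then a one-liner with `java.nio.file.Files.write`; a real Zhihu crawler would also need cookies and error handling that this sketch leaves out.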
Okay, that’s all I have thought about now.
The next step is preparation.
1. Choosing the crawler language: since my earlier series of crawler tutorials (click here), covering Baidu Tieba, Embarrassing Encyclopedia, Shandong University’s grade-point query, and so on, were all written in Python, I decided to write this one in Java (which, I assure you, has absolutely nothing to do with me abandoning Python, okay).
2. A bit of background on crawlers: “web crawler”, or “web spider”, is a very vivid name. If the Internet is compared to a spider web, then a crawler is a spider moving around on that web. Web spiders find new pages by following link addresses. For a detailed introduction, please click here.
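The spider-following-links idea above is essentially a breadth-first traversal. The sketch below shows that traversal over a made-up in-memory “web” (the page names and the `crawl` helper are illustrative only); in a real crawler, each map lookup would instead be an HTTP fetch followed by link extraction.

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashSet;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class CrawlOrder {
    // Breadth-first crawl: follow links level by level, never visiting a page twice.
    static List<String> crawl(String start, Map<String, List<String>> linksOf) {
        List<String> visitedOrder = new ArrayList<>();
        Set<String> seen = new HashSet<>();
        Deque<String> queue = new ArrayDeque<>();
        queue.add(start);
        seen.add(start);
        while (!queue.isEmpty()) {
            String page = queue.poll();
            visitedOrder.add(page);
            for (String next : linksOf.getOrDefault(page, List.of())) {
                if (seen.add(next)) { // add() returns false if already seen
                    queue.add(next);
                }
            }
        }
        return visitedOrder;
    }

    public static void main(String[] args) {
        // A tiny fake "web": each page maps to the pages it links to.
        Map<String, List<String>> web = Map.of(
                "home", List.of("hot-today", "hot-month"),
                "hot-today", List.of("question/1", "question/2"),
                "hot-month", List.of("question/2"));
        System.out.println(crawl("home", web));
        // [home, hot-today, hot-month, question/1, question/2]
    }
}
```

The `seen` set is what keeps the spider from crawling in circles when pages link back to each other.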
3. Preparing the crawler environment: I won’t go into details about installing and configuring the JDK and Eclipse. One long-winded note: a capable browser matters a great deal for crawler work, because you first have to browse the page yourself to find where the data you need lives before you can tell your crawler where to go and what to grab. I personally recommend Firefox or Google Chrome; their right-click “Inspect Element” and “View Source” features are very powerful.
Now the official crawler journey begins! As for what exactly to cover first, well, that’s a good question; let me think it over, don’t worry ^_^
The above is the preparation work for writing a Java Zhihu crawler from scratch. For more related content, please follow the PHP Chinese website (www.php.cn)!
