This article shows how to write a crawler with Node.js and covers some pitfalls to watch out for along the way, illustrated with a practical case. Let's take a look.
Background
I recently planned to review the Node.js material I had studied before, and wrote a few crawlers for fun. I ran into some problems during the crawling process and am recording them here for future reference.

Dependencies
The widely used cheerio library handles the crawled content, superagent handles the HTTP requests, and log4js records the logs.

Log configuration
Without further ado, here is the code:

```javascript
const log4js = require('log4js');

log4js.configure({
  appenders: {
    cheese: {
      type: 'dateFile',
      filename: 'cheese.log',
      pattern: '-yyyy-MM-dd.log',
      // include the date pattern in the file name
      alwaysIncludePattern: true,
      maxLogSize: 1024,
      backups: 3
    }
  },
  categories: { default: { appenders: ['cheese'], level: 'info' } }
});

const logger = log4js.getLogger('cheese');
logger.level = 'INFO';

module.exports = logger;
```

This module exports a logger object; business files just require it and call logger.info() and friends to write log entries, and a new log file is generated each day. There is plenty of related material online.
Crawling content and processing
```javascript
superagent.get(cityItemUrl).end((err, res) => {
  if (err) {
    return console.error(err);
  }
  const $ = cheerio.load(res.text);
  // parse the current page and collect its city link addresses
  const cityInfoEle = $('.newslist1 li a');
  cityInfoEle.each((idx, element) => {
    const $element = $(element);
    const sceneURL = $element.attr('href');   // page address
    const sceneName = $element.attr('title'); // city name
    if (!sceneName) {
      return;
    }
    logger.info(`Parsed destination: ${sceneName}, address: ${sceneURL}`);
    getDesInfos(sceneURL, sceneName); // fetch the city details
  });
  // register once: fires after 'getDirInfoComplete' has been emitted for every link
  ep.after('getDirInfoComplete', cityInfoEle.length, (dirInfos) => {
    const content = JSON.parse(fs.readFileSync(path.join(__dirname, './imgs.json')));
    dirInfos.forEach((element) => {
      logger.info(`Current record: ${JSON.stringify(element)}`);
      Object.assign(content, element);
    });
    fs.writeFileSync(path.join(__dirname, './imgs.json'), JSON.stringify(content));
  });
});
```

Use superagent to request the page. Once the request succeeds, load the page content with cheerio and locate the target resources with jQuery-like selectors. Because multiple resources are loaded, use eventproxy to proxy the events: publish one event for each resource processed, and handle all the data after every event has fired. The above is the most basic crawler. Next come some areas that can cause problems or deserve special attention.

Read and write local files
Create folder
```javascript
function mkdirSync(dirname) {
  if (fs.existsSync(dirname)) {
    return true;
  }
  // recursively create the parent directory first, then this one
  if (mkdirSync(path.dirname(dirname))) {
    fs.mkdirSync(dirname);
    return true;
  }
  return false;
}
```
Read and write files
```javascript
const content = JSON.parse(fs.readFileSync(path.join(__dirname, './dir.json')));
dirInfos.forEach((element) => {
  logger.info(`Current record: ${JSON.stringify(element)}`);
  Object.assign(content, element);
});
fs.writeFileSync(path.join(__dirname, './dir.json'), JSON.stringify(content));
```
Batch download resources
Downloaded resources may include pictures, audio, and so on. Use Bagpipe to limit asynchronous concurrency:

```javascript
const Bagpipe = require('bagpipe');

// at most 10 downloads in flight at once
const bagpipe = new Bagpipe(10);

bagpipe.push(downloadImage, url, dstpath, (err, data) => {
  if (err) {
    console.log(err);
    return;
  }
  console.log(`[${dstpath}]: ${data}`);
});
```

Download the resources and use a stream to complete the file write:
```javascript
function downloadImage(src, dest, callback) {
  // only attempt the download for http(s) URLs
  if (!src || src.indexOf('http') !== 0) {
    return callback(new Error(`invalid url: ${src}`));
  }
  request.head(src, (err, res, body) => {
    if (err) {
      return callback(err);
    }
    request(src)
      .pipe(fs.createWriteStream(dest))
      .on('close', () => {
        callback(null, dest);
      });
  });
}
```
Encoding
Sometimes the page content loaded directly with cheerio.load turns out to be HTML-entity-encoded text once written to a file. You can disable entity decoding with:

```javascript
const $ = cheerio.load(buf, { decodeEntities: false });
```

PS: I failed to convert UTF-8 encoded characters into Chinese with the encoding library and iconv-lite; it is probably unfamiliarity with their APIs, something to revisit later.

Finally, here is a regular expression that matches all DOM tags:

```javascript
const reg = /<[^>]+>/g;
```
The above is the detailed content of How to use nodeJs crawler. For more information, please follow other related articles on the PHP Chinese website!
