
Detailed explanation of how to use Golang to crawl Bing wallpapers

Needless to say, Python is the usual choice for crawlers: a single requests library can cover most of the world. But I have heard that the http package built into Go is very powerful, and I wanted to learn something new while reviewing the request and response side of the HTTP protocol, so this article does it in Go instead. Without further ado, let's get started.

We'll crawl the Bing Wallpaper site and try it out first. (Obligatory dog-head emoji to save my life: this is for learning purposes only.)

Overview of the crawler process

graph TD
    A[Request data] --> B[Parse data] --> C[Store data]

As the flow chart above shows, a crawler is really not that complicated; the whole process takes only three steps. Let's look at what each step involves:

  • Request data: use the http package built into Go (net/http) to send a request to the target address.

  • Parse data: pull out only the specific fields we care about instead of keeping the whole response. This step is also called data cleaning.

  • Data storage: store the parsed data in a database (see the sketch right after this list; the code later in this article writes the images to disk instead).
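
To make step 3 concrete, here is a minimal sketch of saving one parsed record with Go's database/sql package and the MySQL driver. The DSN, the wallpapers table, and the saveWallpaper helper are all hypothetical placeholders; the crawler later in this article skips the database entirely and writes the images straight to disk.

package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql" // MySQL driver registered for database/sql
)

// saveWallpaper inserts one parsed record into a hypothetical wallpapers(name, url) table.
func saveWallpaper(db *sql.DB, name, url string) error {
    _, err := db.Exec("INSERT INTO wallpapers(name, url) VALUES(?, ?)", name, url)
    return err
}

func main() {
    // Placeholder DSN; adjust user, password, host and database name to your setup.
    db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/bing")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    if err := saveWallpaper(db, "example wallpaper", "https://example.com/a.jpg"); err != nil {
        log.Println("insert failed:", err)
    }
}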

Practical Analysis

First, open the official Bing Wallpaper site and have a look around; writing a crawler requires being particularly observant about where the data lives. The homepage is very clean and concise.

Next, open the browser's developer tools (you should be familiar with them; if not, the rest of this article will be hard to follow). Normally you press F12 or right-click and choose Inspect, but on the Bing Wallpaper site right-clicking is blocked, so the console has to be opened manually: in Chrome, open the menu, choose More tools, then Developer tools (the same steps apply if your Chrome is in Chinese).

You should now see a page like this:

Don't worry about the errors in the console; they are just anti-crawling messages from the Bing Wallpaper site (they were not there when I crawled it a while back) and they do not affect anything we are doing.

Next, use the element picker in the developer tools to quickly locate the element we want; with it we can find the image information we need.


Code in Practice

The following code crawls a single page. It relies on the third-party goquery package, which can be installed with go get github.com/PuerkitoBio/goquery.

package main

import (
    "fmt"
    "io"
    "log"
    "net/http"
    "os"
    "time"

    "github.com/PuerkitoBio/goquery"
)

func Run(method, url string, body io.Reader, client *http.Client) {
    req, err := http.NewRequest(method, url, body)
    if err != nil {
        log.Println("获取请求对象失败")
        return
    }
    req.Header.Set("user-agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36")
    resp, err := client.Do(req)
    if err != nil {
        log.Println("failed to send the request")
        return
    }
    defer resp.Body.Close() // close the response body when the function returns
    if resp.StatusCode != http.StatusOK {
        log.Printf("request failed, status code: %d", resp.StatusCode)
        return
    }
    query, err := goquery.NewDocumentFromReader(resp.Body)
    if err != nil {
        log.Println("生成goQuery对象失败")
        return
    }
    query.Find(".container .item").Each(func(i int, s *goquery.Selection) {
        imgUrl, _ := s.Find("a.ctrl.download").Attr("href")
        imgName := s.Find(".description>h3").Text()
        fmt.Println(imgUrl)
        fmt.Println(imgName)
        DownloadImage(imgUrl, i, client)
        time.Sleep(time.Second) // pause briefly between downloads to go easy on the site
        fmt.Println("-------------------------")
    })
}

func DownloadImage(url string, index int, client *http.Client) {
    req, err := http.NewRequest("POST", url, nil)
    if err != nil {
        log.Println("获取请求对象失败")
        return
    }
    req.Header.Set("user-agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36")
    resp, err := client.Do(req)
    if err != nil {
        log.Println("发起请求失败")
        return
    }
    data, err := io.ReadAll(resp.Body) // read the whole image into memory
    if err != nil {
        log.Println("failed to read the response body")
        return
    }
    if err := os.MkdirAll("./image", 0755); err != nil { // make sure the output directory exists
        log.Println("failed to create the image directory", err.Error())
        return
    }
    baseDir := "./image/image-%d.jpg"
    f, err := os.OpenFile(fmt.Sprintf(baseDir, index), os.O_CREATE|os.O_TRUNC|os.O_WRONLY, 0666)
    if err != nil {
        log.Println("failed to open the file", err.Error())
        return
    }
    defer f.Close()
    _, err = f.Write(data)
    if err != nil {
        log.Println("写入数据失败")
        return
    }
    fmt.Println("下载图片成功")
}

func main() {
    client := &http.Client{}
    url := "https://bing.ioliu.cn/?p=%d"
    method := "GET"
    Run(method, fmt.Sprintf(url, 1), nil, client) // fill in the page number; here we crawl only the first page
}

Next comes crawling multiple pages. The code barely changes; we just need to observe the site's URL pattern first.

Notice the pattern: the first page is p=1, the second page is p=2, and the tenth page is p=10.

So all we need is a for loop that reuses the single-page code from before:

// main function for crawling multiple pages
func main() {
    client := &http.Client{}
    url := "https://bing.ioliu.cn/?p=%d"
    method := "GET"
    for i := 1; i < 5; i++ { // paginate: crawl pages 1 through 4
        Run(method, fmt.Sprintf(url, i), nil, client)
    }
}

Summary

In this example we used a third-party package to parse the web page data, because doing it with regular expressions is simply too painful. The usual options are:

  • CSS selectors: goquery (used in this article)
  • XPath selectors: htmlquery (a minimal sketch follows this list)
  • Regular expressions: the built-in regexp package; not recommended, since the patterns are hard to write and maintain
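
For comparison, here is a minimal sketch of the same extraction done with htmlquery and XPath. It is only an illustration: it assumes the github.com/antchfx/htmlquery package is installed, and the XPath expressions are guesses that mirror the CSS selectors used above, so they may need adjusting against the real page.

package main

import (
    "fmt"
    "log"

    "github.com/antchfx/htmlquery"
)

func main() {
    // Load and parse the first page; htmlquery performs the GET request itself.
    doc, err := htmlquery.LoadURL("https://bing.ioliu.cn/?p=1")
    if err != nil {
        log.Fatal(err)
    }
    // XPath equivalents of the ".container .item" CSS selector used earlier (assumed page structure).
    for _, item := range htmlquery.Find(doc, "//div[contains(@class,'container')]//div[contains(@class,'item')]") {
        link := htmlquery.FindOne(item, ".//a[contains(@class,'download')]")
        title := htmlquery.FindOne(item, ".//div[contains(@class,'description')]//h3")
        if link == nil || title == nil {
            continue
        }
        fmt.Println(htmlquery.SelectAttr(link, "href"), htmlquery.InnerText(title))
    }
}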
