The problem I'm facing is that even just 200 requests cause the program to occupy about 6 GB of the container's memory and get OOM-killed. My idea is to extract all text nodes present in the HTML and then process them to extract the tag name, the rendered HTML, and the text of that tag. To generate the HTML for a specific tag, I use the Render function from golang.org/x/net/html, providing a strings.Builder as the io.Writer to receive the generated HTML. But for some reason the builder takes far too much memory.
package main

import (
	"encoding/csv"
	"io"
	"log"
	"net/http"
	"strings"

	"golang.org/x/net/html"
)

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/data", GetData)
	if err := http.ListenAndServe(":8001", mux); err != nil {
		log.Println(err)
	}
}

type TagInfo struct {
	Tag  string
	Name string
	Text string
}

// http.Handler
func GetData(w http.ResponseWriter, r *http.Request) {
	u := r.URL.Query().Get("url")
	doc, err := GetDoc(u)
	if err != nil {
		log.Println(err)
		w.WriteHeader(500)
		return
	}
	var buf strings.Builder
	data := Extract(doc, &buf)
	csvw := csv.NewWriter(io.Discard)
	for _, d := range data {
		csvw.Write([]string{d.Name, d.Tag, d.Text})
	}
}

// fires the request and gets the text/html
func GetDoc(u string) (*html.Node, error) {
	res, err := http.Get(u)
	if err != nil {
		return nil, err
	}
	defer res.Body.Close()
	return html.Parse(res.Body)
}

func Extract(doc *html.Node, buf *strings.Builder) []TagInfo {
	var (
		tags = make([]TagInfo, 0, 100)
		f    func(*html.Node)
	)
	f = func(n *html.Node) {
		if n.Type == html.TextNode {
			text := strings.TrimSpace(n.Data)
			if text != "" {
				parent := n.Parent
				tag := Render(parent, buf)
				tagInfo := TagInfo{
					Tag:  tag,
					Name: parent.Data,
					Text: n.Data,
				}
				tags = append(tags, tagInfo)
			}
		}
		for child := n.FirstChild; child != nil; child = child.NextSibling {
			f(child)
		}
	}
	f(doc)
	return tags
}

// Render the HTML around the tag;
// if the node is a text node, pass its
// parent node as the parameter.
func Render(n *html.Node, buf *strings.Builder) string {
	defer buf.Reset()
	if err := html.Render(buf, n); err != nil {
		log.Println(err)
		return ""
	}
	return buf.String()
}
If you want a specific list of URLs, here it is. I was making about 60 requests at a time.
I tried using bytes.Buffer and sync.Pool, but both have the same problem. Using pprof, I noticed that the WriteString method of strings.Builder was responsible for most of the memory usage.
Correct answer
The basic issue here is that any Content-Type is accepted, which is unacceptable for crawling: most of the time you only want to process text/html.

The problem is that even if the URL returns data that is not HTML at all, golang.org/x/net/html still accepts it without returning an error.

Take the case where application/pdf is returned: html.Parse will happily consume the binary PDF bytes and return a parsed tree with no error. That is weird behavior for a scraping/crawling library, accepting binary data like that.

The solution: check the response headers, and only continue if the body is actually HTML. Otherwise you get garbage in the output and much higher memory usage (occasionally it may be lower, but you cannot predict what will happen).
The above is the detailed content of html rendering function memory leak. For more information, please follow other related articles on the PHP Chinese website!
