Building a Web Search Engine in Go with Elasticsearch

Web search engines are essential for indexing vast amounts of online information and making it accessible in milliseconds. In this project, I built a search engine in Go (Golang) named RelaxSearch. It combines web scraping, periodic data indexing, and search functionality by integrating with Elasticsearch, a powerful search and analytics engine. In this blog, I'll walk you through RelaxSearch's main components, its architecture, and how it efficiently scrapes and indexes data for fast, keyword-based search.

Overview of RelaxSearch

RelaxSearch is built around two primary modules:

  1. RelaxEngine: A web scraper powered by cron jobs, which periodically crawls specified websites, extracts content, and indexes it in Elasticsearch.
  2. RelaxWeb: A RESTful API server that allows users to search the indexed data, providing pagination, filtering, and content highlighting for user-friendly responses.

Project Motivation

Creating a search engine project from scratch is a great way to understand web scraping, data indexing, and efficient search techniques. I wanted to create a simple but functional search engine with fast data retrieval and easy extensibility, utilizing Go’s efficiency and Elasticsearch’s powerful indexing.

Key Features

  • Automated Crawling: Using cron jobs, RelaxEngine can run at regular intervals, scraping data and storing it in Elasticsearch.
  • Full-Text Search: RelaxWeb provides full-text search over the indexed content, making keyword-based retrieval fast.
  • REST API: Accessible through a RESTful API with parameters for pagination, date filtering, and content highlighting.
  • Data Storage: The indexed content is stored in Elasticsearch, allowing for scalable and highly responsive queries.

Architecture of RelaxSearch

1. RelaxEngine (Web Scraper and Indexer)

RelaxEngine is a web scraper written in Go that navigates web pages, extracting and storing content. It runs as a cron job, so it can operate at regular intervals (e.g., every 30 minutes) to keep the index updated with the latest web data. Here’s how it works:

  • Seed URL: RelaxEngine starts scraping from a specified seed URL and then follows links within the site up to a configurable depth.
  • Content Parsing: For each page, it extracts titles, descriptions, and keywords, constructing an informative dataset.
  • Indexing in Elasticsearch: The scraped content is indexed in Elasticsearch, ready for full-text search. Each page's data is stored with a unique identifier, title, description, and other metadata.
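
To make the indexing step concrete, here is a minimal sketch of what the PageData struct and the IndexPageData helper could look like. It assumes the olivere/elastic v7 client and an index named "pages"; the repository's actual field names and index name may differ.

import (
    "context"

    "github.com/olivere/elastic/v7"
)

// PageData holds the fields extracted from a crawled page.
type PageData struct {
    URL         string `json:"url"`
    Title       string `json:"title"`
    Content     string `json:"content"`
    Description string `json:"description"`
}

// IndexPageData stores one page in Elasticsearch. Using the URL as the
// document ID means re-crawling a page updates the existing document
// instead of creating a duplicate.
func IndexPageData(esClient *elastic.Client, page PageData) error {
    _, err := esClient.Index().
        Index("pages").
        Id(page.URL).
        BodyJson(page).
        Do(context.Background())
    return err
}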

2. RelaxWeb (Search API)

RelaxWeb provides a RESTful API endpoint, making it easy to query and retrieve data stored in Elasticsearch. The API accepts several parameters, such as keywords, pagination, and date filtering, returning relevant content in JSON format.

  • API Endpoint: /search
  • Query Parameters:
    • keyword: Main search term.
    • from and size: Pagination control.
    • dateRangeStart and dateRangeEnd: Filter results based on the timestamp of data.
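
For example, a request combining these parameters might look like the following (the values are illustrative, and the exact date format depends on the index mapping):

GET /search?keyword=golang&from=0&size=10&dateRangeStart=2024-01-01&dateRangeEnd=2024-12-31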


Key Components and Code Snippets

Below are some important components and code excerpts from RelaxSearch to illustrate how it works.

Main Go Code for RelaxEngine

The core functionality is in the main.go file, where RelaxEngine initializes a scheduler using gocron to manage cron jobs, sets up the Elasticsearch client, and begins crawling from the seed URL.

func main() {
    // Load configuration (Elasticsearch URL, crawl depth limit, etc.).
    cfg := config.LoadConfig()
    esClient := crawler.NewElasticsearchClient(cfg.ElasticsearchURL)
    c := crawler.NewCrawler(cfg.DepthLimit, 5)
    seedURL := "https://example.com/" // Replace with starting URL

    // Schedule the crawl to repeat every 30 minutes.
    s := gocron.NewScheduler(time.UTC)
    s.Every(30).Minutes().Do(func() {
        go c.StartCrawling(seedURL, 0, esClient)
    })
    s.StartBlocking() // Blocks forever, running the job on schedule.
}
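
The crawler.NewElasticsearchClient call above wraps client construction. A minimal sketch of such a constructor, assuming the olivere/elastic v7 client, might look like this (the repository's actual implementation may differ):

import (
    "log"

    "github.com/olivere/elastic/v7"
)

// NewElasticsearchClient connects to Elasticsearch at the given URL.
func NewElasticsearchClient(url string) *elastic.Client {
    // SetSniff(false) is often needed when Elasticsearch runs in Docker,
    // because sniffed node addresses may not be reachable from outside.
    client, err := elastic.NewClient(
        elastic.SetURL(url),
        elastic.SetSniff(false),
    )
    if err != nil {
        log.Fatalf("failed to create Elasticsearch client: %v", err)
    }
    return client
}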

Crawler and Indexing Logic

The crawler.go file handles web page requests, extracts content, and indexes it. Using the elastic package, it stores each scraped page in Elasticsearch.

func (c *Crawler) StartCrawling(pageURL string, depth int, esClient *elastic.Client) {
    // Stop when the depth limit is reached or the page was already crawled.
    if depth > c.DepthLimit || c.isVisited(pageURL) {
        return
    }
    c.markVisited(pageURL)

    links, title, content, description, err := c.fetchAndParsePage(pageURL)
    if err != nil {
        return // Skip pages that fail to fetch or parse.
    }

    // Index the extracted fields, then recurse into discovered links.
    pageData := PageData{URL: pageURL, Title: title, Content: content, Description: description}
    IndexPageData(esClient, pageData)

    for _, link := range links {
        c.StartCrawling(link, depth+1, esClient)
    }
}
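
The fetchAndParsePage method is where the HTML parsing happens. Its implementation isn't shown above, but a minimal sketch using the goquery library (an assumption; RelaxSearch may parse HTML differently) could look like this:

import (
    "net/http"
    "net/url"

    "github.com/PuerkitoBio/goquery"
)

// fetchAndParsePage downloads a page and extracts its links, title,
// body text, and meta description. A sketch, not the actual code.
func (c *Crawler) fetchAndParsePage(pageURL string) (links []string, title, content, description string, err error) {
    resp, err := http.Get(pageURL)
    if err != nil {
        return nil, "", "", "", err
    }
    defer resp.Body.Close()

    doc, err := goquery.NewDocumentFromReader(resp.Body)
    if err != nil {
        return nil, "", "", "", err
    }

    title = doc.Find("title").Text()
    description, _ = doc.Find(`meta[name="description"]`).Attr("content")
    content = doc.Find("body").Text()

    // Resolve each anchor's href against the current page URL so that
    // relative links become absolute before they are crawled.
    base, _ := url.Parse(pageURL)
    doc.Find("a[href]").Each(func(_ int, s *goquery.Selection) {
        href, _ := s.Attr("href")
        if abs, e := base.Parse(href); e == nil {
            links = append(links, abs.String())
        }
    })
    return links, title, content, description, nil
}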

Search API Code in RelaxWeb

In the RelaxWeb service, an API endpoint provides full-text search capabilities. The /search endpoint receives requests, queries Elasticsearch, and returns relevant content based on keywords.

func searchHandler(w http.ResponseWriter, r *http.Request) {
    // Read the search term from the query string, e.g. /search?keyword=go
    keyword := r.URL.Query().Get("keyword")
    results := queryElasticsearch(keyword)

    w.Header().Set("Content-Type", "application/json")
    json.NewEncoder(w).Encode(results)
}
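
queryElasticsearch is where the keyword, pagination, and date parameters become an Elasticsearch query. The handler above passes only the keyword; the sketch below expands the signature to show how all the documented parameters could map onto a query, again assuming olivere/elastic v7 and hypothetical "content" and "timestamp" fields in a "pages" index:

import (
    "context"

    "github.com/olivere/elastic/v7"
)

// queryElasticsearch runs a full-text search with optional date filtering,
// pagination, and highlighting. A sketch, not the actual helper.
func queryElasticsearch(esClient *elastic.Client, keyword string, from, size int, start, end string) (*elastic.SearchResult, error) {
    query := elastic.NewBoolQuery().
        Must(elastic.NewMatchQuery("content", keyword))
    if start != "" && end != "" {
        // Restrict results to documents whose timestamp falls in the range.
        query = query.Filter(elastic.NewRangeQuery("timestamp").Gte(start).Lte(end))
    }

    return esClient.Search().
        Index("pages").
        Query(query).
        From(from). // Pagination offset.
        Size(size). // Page size.
        Highlight(elastic.NewHighlight().Field("content")). // Highlight keyword matches.
        Do(context.Background())
}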

Setting Up RelaxSearch

  1. Clone the Repository

     git clone https://github.com/Ravikisha/RelaxSearch.git
     cd RelaxSearch

  2. Configuration

     Update the .env files for both RelaxEngine and RelaxWeb with your Elasticsearch credentials (a hypothetical example follows below).

  3. Run with Docker

     RelaxSearch uses Docker for easy setup. Simply run:

     docker-compose up --build
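
For illustration, a .env file might contain entries like the ones below. These variable names are hypothetical, inferred from the config fields used in main.go; check the repository's example configuration for the real ones.

     ELASTICSEARCH_URL=http://elasticsearch:9200
     DEPTH_LIMIT=3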


Challenges and Improvements

  • Scalability: Elasticsearch scales well, but crawling a large number of links would need further optimizations, such as rate limiting and a distributed crawl queue, for larger-scale deployments.
  • Robust Error Handling: Enhancing error handling and retry mechanisms would increase resilience.

Conclusion

RelaxSearch is an educational and practical demonstration of a basic search engine. While it is still a prototype, this project has been instrumental in understanding the fundamentals of web scraping, full-text search, and efficient data indexing with Go and Elasticsearch. It opens avenues for improvements and real-world application in scalable environments.

Explore the GitHub repository to try out RelaxSearch for yourself!
