
How can you limit the number of concurrent goroutines when processing a list of URLs?

Mary-Kate Olsen | Original | 2024-10-31 11:31:02


Limiting the Number of Concurrent Goroutines

Issue:
You want to process a list of URLs concurrently using goroutines, but with a predefined cap on how many run at once. In this instance, you have thirty URLs and want at most ten goroutines working on them at any given time.

Solution:

The key to resolving this issue lies in restructuring your code. Instead of spawning a separate goroutine for each URL, start a fixed pool of worker goroutines that consume URLs from a shared channel. It is the size of this pool, not the channel's buffer, that bounds the concurrency.

Code Modification:

Here's an updated version of your code that incorporates this approach:

<code class="go">package main

import (
    "flag"
    "fmt"
    "net/http"
    "sync"
    "time"
)

// worker fetches a single URL and sends a one-line summary to results.
// (The original snippet referenced an undefined rest.Client; a standard
// *http.Client from net/http stands in here.)
func worker(url string, client *http.Client, results chan<- string) {
    resp, err := client.Get(url)
    if err != nil {
        results <- fmt.Sprintf("%s: error: %v", url, err)
        return
    }
    resp.Body.Close()
    results <- fmt.Sprintf("%s: %s", url, resp.Status)
}

func main() {
    parallel := flag.Int("parallel", 10, "max parallel requests allowed")
    flag.Parse()
    urls := flag.Args()

    // Channel through which the feeder passes URLs to the workers. The
    // buffer is a convenience; the worker count is what bounds concurrency.
    urlsChan := make(chan string, *parallel)

    // Feed URLs into the channel from a separate goroutine
    go func() {
        for _, u := range urls {
            urlsChan <- u
        }
        // Close the channel to indicate that there are no more URLs to process
        close(urlsChan)
    }()

    var wg sync.WaitGroup
    client := &http.Client{Timeout: 10 * time.Second}

    results := make(chan string)

    // Start exactly *parallel worker goroutines
    for i := 0; i < *parallel; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            // Continuously retrieve URLs from the channel until it is closed
            for url := range urlsChan {
                worker(url, client, results)
            }
        }()
    }

    // Launch a separate goroutine to close the results channel when all workers are finished
    go func() {
        // Wait for all workers to finish processing URLs
        wg.Wait()
        // Close the results channel to signal that there are no more results
        close(results)
    }()

    // Read results from the channel until it is closed
    for res := range results {
        fmt.Println(res)
    }
}</code>
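
Assuming the program is saved as main.go, it could be run with, for example, go run main.go -parallel 10 https://example.com https://example.org (both URLs are placeholders); each output line reports one URL's HTTP status or error.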

In this revised code:

  1. A buffered channel, urlsChan, is created to carry the URLs to the workers. Note that the concurrency limit comes from the number of worker goroutines, not from the buffer size; the buffer (set to *parallel here) merely lets the feeder goroutine run ahead without blocking immediately.
  2. A separate goroutine is dedicated to populating the urlsChan channel with URLs.
  3. The worker goroutines continuously consume URLs from the urlsChan channel until it is closed.
  4. A separate goroutine is employed to close the results channel once all workers have completed their tasks.

By adopting this worker-pool architecture, the number of goroutines executing concurrently never exceeds the specified parallelism limit, regardless of how many URLs are supplied.
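
As an aside, the same cap can be enforced without a worker pool by using a buffered channel as a counting semaphore: each goroutine acquires a slot before doing work and releases it afterward. The sketch below is a minimal illustration of that alternative, not part of the original solution; the URL list is a placeholder and the per-URL work is a plain http.Get:

<code class="go">package main

import (
    "fmt"
    "net/http"
)

func main() {
    urls := []string{ /* ... the thirty URLs ... */ }
    const parallel = 10

    // A buffered channel used as a counting semaphore: sending a token
    // acquires one of `parallel` slots, receiving releases it, so at most
    // `parallel` goroutines are doing work at any moment.
    sem := make(chan struct{}, parallel)
    done := make(chan string)

    for _, u := range urls {
        go func(u string) {
            sem <- struct{}{}        // acquire a slot
            defer func() { <-sem }() // release the slot
            resp, err := http.Get(u)
            if err != nil {
                done <- fmt.Sprintf("%s: error: %v", u, err)
                return
            }
            resp.Body.Close()
            done <- fmt.Sprintf("%s: %s", u, resp.Status)
        }(u)
    }

    // Collect exactly one result per URL.
    for range urls {
        fmt.Println(<-done)
    }
}</code>

The trade-off: this variant spawns one goroutine per URL (the excess ones block cheaply on the semaphore), whereas the worker pool keeps the total goroutine count at the limit. For newer codebases, the errgroup package from golang.org/x/sync also provides a SetLimit method that bundles the same bound with error handling.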

