Practical Tips for Concurrent Programming in Golang: Making the Most of Goroutines
In the Go language, Goroutines are a lightweight thread implementation that makes concurrent programming simple and efficient. By making the most of Goroutines, we can better utilize multi-core processors and improve a program's performance and throughput. This article shares some practical tips to help you use Goroutines for concurrent programming more effectively.
1. Solutions to concurrency problems
In concurrent programming, the most common problem is concurrent access to shared resources. To solve it, we can protect that access with a mutex or a channel.
A mutex ensures that only one Goroutine accesses the shared resource at a time; the others must wait for the lock to be released before they can proceed. The following is a simple example:
package main

import (
    "fmt"
    "sync"
)

var (
    counter int
    mutex   sync.Mutex
    wg      sync.WaitGroup
)

func main() {
    wg.Add(2)
    go increment(1)
    go increment(2)
    wg.Wait()
    fmt.Println("counter:", counter)
}

func increment(id int) {
    defer wg.Done()
    for i := 0; i < 100000; i++ {
        mutex.Lock()
        counter++
        mutex.Unlock()
    }
}
In the above code, we use sync.Mutex to create a mutex. In the increment function, before each modification of the shared variable counter, we call the Lock method to acquire the lock and then call Unlock to release it. This ensures that only one Goroutine is modifying counter at any given time.
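As a side note, a common way to keep this pattern tidy in larger programs is to store the mutex next to the data it protects inside a struct. The following is only a rough sketch of that idea; the SafeCounter type and its method names are illustrative and not part of the example above.

package main

import (
    "fmt"
    "sync"
)

// SafeCounter keeps the mutex next to the value it protects, so callers
// can only reach the counter through methods that take the lock.
type SafeCounter struct {
    mu    sync.Mutex
    value int
}

func (c *SafeCounter) Inc() {
    c.mu.Lock()
    defer c.mu.Unlock()
    c.value++
}

func (c *SafeCounter) Value() int {
    c.mu.Lock()
    defer c.mu.Unlock()
    return c.value
}

func main() {
    var c SafeCounter
    var wg sync.WaitGroup
    for i := 0; i < 2; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for j := 0; j < 100000; j++ {
                c.Inc()
            }
        }()
    }
    wg.Wait()
    fmt.Println("counter:", c.Value())
}

Because callers only interact with Inc and Value, it is impossible to touch the counter without holding the lock.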
A channel is a data structure for communication between Goroutines; it can both synchronize them and pass data between them. Through channels, we can share access to resources safely and avoid race conditions.
Here is an example that uses a channel:
package main

import (
    "fmt"
    "sync"
)

var (
    counter int
    wg      sync.WaitGroup
)

func main() {
    // The buffer must be large enough to hold every value that is sent
    // (2 Goroutines x 100000 sends each); otherwise the senders would
    // block and main would deadlock in wg.Wait().
    ch := make(chan int, 200000)
    wg.Add(2)
    go increment(1, ch)
    go increment(2, ch)
    wg.Wait()
    close(ch)
    for count := range ch {
        counter += count
    }
    fmt.Println("counter:", counter)
}

func increment(id int, ch chan int) {
    defer wg.Done()
    for i := 0; i < 100000; i++ {
        ch <- 1
    }
}
In the above code, we create a buffered channel ch and pass the integer value 1 through it. In the increment function, we send a 1 to channel ch on each iteration. In the main function, once both Goroutines have finished we close the channel, then use range to receive the values and accumulate them into counter.
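The large buffer above is only there to keep the example close to the mutex version. A more typical pattern, sketched below as one possible variation, is to drain the channel in a dedicated receiver Goroutine that runs concurrently with the senders, so an unbuffered channel suffices; the done channel name is purely illustrative.

package main

import (
    "fmt"
    "sync"
)

func main() {
    var (
        counter int
        wg      sync.WaitGroup
    )
    ch := make(chan int)          // unbuffered: the receiver runs concurrently
    done := make(chan struct{})

    // Receiver: accumulate values until the channel is closed.
    go func() {
        for count := range ch {
            counter += count
        }
        close(done)
    }()

    // Two senders, as in the example above.
    for i := 0; i < 2; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            for j := 0; j < 100000; j++ {
                ch <- 1
            }
        }()
    }

    wg.Wait()
    close(ch) // no more sends; lets the receiver's range loop finish
    <-done    // wait for the receiver to finish accumulating
    fmt.Println("counter:", counter)
}

Closing ch only after wg.Wait() returns guarantees that no Goroutine sends on a closed channel.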
2. Avoiding Goroutine leaks
In concurrent programming, Goroutine leaks are a common problem. If a Goroutine is never stopped after it is created, it wastes resources and degrades performance.
To avoid Goroutine leaks, we can use the context package to control and cancel Goroutines. Here is an example:
package main

import (
    "context"
    "fmt"
    "sync"
    "time"
)

var wg sync.WaitGroup

func main() {
    ctx := context.Background()
    ctx, cancel := context.WithCancel(ctx)

    wg.Add(1)
    go worker(ctx)

    time.Sleep(3 * time.Second)
    cancel()
    wg.Wait()
    fmt.Println("main function exit")
}

func worker(ctx context.Context) {
    defer wg.Done()
    for {
        select {
        case <-ctx.Done():
            fmt.Println("worker cancelled")
            return
        default:
            fmt.Println("worker is running")
        }
        time.Sleep(1 * time.Second)
    }
}
In the above code, we create a cancellable context using context.Background and context.WithCancel. In the main function, we start a Goroutine that executes the worker function and pass it the context. In the worker function, we keep listening for the context's cancellation signal to decide whether to exit. Once the cancellation signal is received, the Goroutine returns and logs a message.
By using the context package, we can better control the Goroutine life cycle and resource release, avoiding Goroutine leaks.
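For completeness, the context package also provides deadline-based cancellation. The sketch below assumes the same kind of worker loop as above and simply swaps context.WithCancel for context.WithTimeout, so the Goroutine stops on its own after two seconds; calling the returned cancel function via defer is still recommended so the timer is released early if main returns sooner.

package main

import (
    "context"
    "fmt"
    "sync"
    "time"
)

func main() {
    var wg sync.WaitGroup

    // The context is cancelled automatically after 2 seconds.
    ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
    defer cancel()

    wg.Add(1)
    go func() {
        defer wg.Done()
        for {
            select {
            case <-ctx.Done():
                fmt.Println("worker stopped:", ctx.Err())
                return
            default:
                fmt.Println("worker is running")
            }
            time.Sleep(500 * time.Millisecond)
        }
    }()

    wg.Wait()
    fmt.Println("main function exit")
}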
3. Parallel execution of tasks
In real applications, we often need to execute multiple tasks in parallel and then wait for all of them to finish before moving on to the next step. We can achieve this with sync.WaitGroup and a channel.
The following is an example of executing tasks in parallel:
package main

import (
    "fmt"
    "sync"
)

var wg sync.WaitGroup

func main() {
    tasks := make(chan int, 10)

    wg.Add(3)
    go worker(1, tasks)
    go worker(2, tasks)
    go worker(3, tasks)

    for i := 0; i < 10; i++ {
        tasks <- i
    }
    close(tasks)

    wg.Wait()
    fmt.Println("all tasks done")
}

func worker(id int, tasks chan int) {
    defer wg.Done()
    for task := range tasks {
        fmt.Printf("worker %d: processing task %d\n", id, task)
    }
}
In the above code, we create a channel tasks with a buffer of 10 and start 3 Goroutines to execute the worker function. In the main function, we send 10 tasks into the channel in a loop and then close it. In the worker function, each Goroutine takes tasks from the channel and logs the one it is processing.
By executing tasks in parallel, we can make full use of multi-core processors and speed up program execution.
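If the next step actually needs each task's output, the same worker-pool shape can be extended with a results channel. The sketch below is one possible variation; the results channel and the doubling performed inside worker are purely illustrative stand-ins for real work.

package main

import (
    "fmt"
    "sync"
)

func worker(id int, tasks <-chan int, results chan<- int, wg *sync.WaitGroup) {
    defer wg.Done()
    for task := range tasks {
        results <- task * 2 // stand-in for real work
    }
}

func main() {
    var wg sync.WaitGroup
    tasks := make(chan int, 10)
    results := make(chan int, 10) // buffered so workers never block on sending results

    wg.Add(3)
    for i := 1; i <= 3; i++ {
        go worker(i, tasks, results, &wg)
    }

    for i := 0; i < 10; i++ {
        tasks <- i
    }
    close(tasks)

    wg.Wait()      // all tasks processed
    close(results) // safe: no worker will send again

    sum := 0
    for r := range results {
        sum += r
    }
    fmt.Println("sum of all results:", sum)
}

Closing results only after wg.Wait() returns is what allows the final range loop to terminate safely.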
Summary
By making full use of Goroutines, we can write better concurrent programs. To handle concurrent access to shared resources, we can protect that access with mutexes or channels. At the same time, we need to avoid Goroutine leaks by properly controlling the Goroutine life cycle and releasing resources. When we need to execute tasks in parallel, we can coordinate them with sync.WaitGroup and a channel.
By applying these techniques appropriately, we can improve a program's performance and throughput while keeping it correct and stable. I hope this article helps you when using Goroutines for concurrent programming.