DEV Community

Wilbur Suero

Concurrency in Go: A Rails Developer’s First Encounter with Goroutines

Coming from a Ruby on Rails background, one of the most striking differences when switching to Go is how it handles concurrency. Rails applications often rely on external tools like Sidekiq, Resque, or Thread-based parallelism to manage background jobs and concurrent tasks. Go, on the other hand, has concurrency built into the language itself, using lightweight goroutines and channels.

In this post, I’ll walk through the key differences between Go’s concurrency model and Ruby’s threading model, why Go’s approach is so powerful, and how you can start writing concurrent Go programs.


Understanding Concurrency in Go

Go was designed from the ground up to support concurrency as a first-class feature. The key components of its concurrency model are:

1. Goroutines: Lightweight, Managed Threads

A goroutine is a function that runs concurrently with other functions. Unlike system threads, goroutines are extremely lightweight because they are managed by the Go runtime, not the OS. They start with just a few kilobytes of stack space and grow as needed.

Starting a goroutine is as simple as using the go keyword:

package main

import (
    "fmt"
    "time"
)

func sayHello() {
    fmt.Println("Hello from a Goroutine!")
}

func main() {
    go sayHello() // This runs concurrently
    time.Sleep(time.Second) // Prevent main from exiting immediately
}

Here, sayHello() runs in a separate goroutine. The time.Sleep call gives that goroutine a chance to execute before main returns; when main exits, the program terminates immediately, even if other goroutines are still running.

2. Channels: Safe Communication Between Goroutines

Goroutines run in the same address space, so they can share memory, but idiomatic Go avoids doing so directly. Instead, goroutines communicate using channels, which provide a safe way to pass data between them: "Don't communicate by sharing memory; share memory by communicating."

package main

import "fmt"

func main() {
    ch := make(chan string)

    go func() {
        ch <- "Hello, World!" // Send data into the channel
    }()

    message := <-ch // Receive data from the channel
    fmt.Println(message)
}

This approach eliminates many of the pitfalls of traditional multi-threading, such as race conditions and complex locking mechanisms.


How Ruby Handles Concurrency

Ruby's concurrency model is different. By default, Ruby threads map to OS threads and, in MRI (Matz's Ruby Interpreter), are constrained by the Global Interpreter Lock (GIL). Only one thread can execute Ruby code at a time, which limits true parallelism for CPU-bound work, though threads can still overlap during blocking I/O.

1. Ruby Threads: OS-Managed and Heavyweight

Ruby supports threads, but they are heavier than Go’s goroutines. Here’s a simple example of spawning a thread in Ruby:

thread = Thread.new do
  puts "Hello from a Thread!"
end

thread.join # Wait for the thread to finish

While this works, Ruby threads consume more memory than goroutines and are subject to OS scheduling.

2. Background Jobs: Offloading Work

Since Ruby’s threading model isn’t great for high concurrency, Rails developers often turn to background job processing frameworks like Sidekiq or Resque, which rely on Redis and external worker processes to handle concurrency.

Example of Sidekiq in Rails:

class HardWorker
  include Sidekiq::Worker

  def perform(name, count)
    puts "Doing hard work for #{name} #{count} times!"
  end
end

HardWorker.perform_async("John", 5)

While these tools work well, they introduce extra dependencies and infrastructure complexity compared to Go’s built-in concurrency model.


Key Differences: Go vs. Ruby Concurrency

| Feature | Go (Goroutines & Channels) | Ruby (Threads & Background Jobs) |
| --- | --- | --- |
| Lightweight | Yes, goroutines are extremely lightweight | No, Ruby threads are OS-managed and heavier |
| Managed by | Go runtime | OS scheduler |
| Parallel execution | Yes, true parallelism possible | Limited due to GIL in MRI |
| Built-in concurrency | Yes, goroutines & channels are first-class citizens | No, relies on third-party solutions like Sidekiq |
| Memory usage | Low, goroutines share heap efficiently | High, OS threads are heavier |
| Scalability | Scales well due to efficient concurrency model | Requires external tools for large-scale concurrency |

When to Use Concurrency in Go

Now that you see how Go’s concurrency model differs, here are some real-world scenarios where goroutines and channels shine:

  • Handling thousands of simultaneous requests in a web server
  • Streaming data processing (e.g., real-time analytics)
  • Efficient background job execution (e.g., email processing, batch jobs)
  • High-performance APIs with non-blocking I/O

Here's an example of a simple HTTP server in Go. The net/http package automatically handles each incoming request in its own goroutine, so the server is concurrent without any extra code:

package main

import (
    "fmt"
    "log"
    "net/http"
)

func handler(w http.ResponseWriter, r *http.Request) {
    fmt.Fprintf(w, "Hello, %s!", r.URL.Path[1:])
}

func main() {
    http.HandleFunc("/", handler)
    // ListenAndServe blocks and serves; net/http spawns a goroutine per request.
    log.Fatal(http.ListenAndServe(":8080", nil))
}

Each request runs in its own goroutine, making the server highly scalable.


For a Rails developer transitioning to Go, the built-in support for concurrency is a game-changer. Go’s goroutines and channels eliminate much of the complexity involved in traditional multi-threading. Unlike Ruby, where you often rely on background job queues and OS-managed threads, Go lets you write highly concurrent applications with minimal dependencies and better performance.

If you're coming from Rails, expect some initial adjustment when learning Go’s concurrency patterns. However, once you get used to goroutines, channels, and Go’s structured simplicity, you’ll see why Go has become the go-to language for scalable and high-performance systems.
