DEV Community

Ravin Rau


8 Types of Load Balancing

If you're diving into the world of web infrastructure, you've probably heard about load balancing. It's like the traffic cop of the internet, making sure all those data requests get to the right place without causing a jam. In this article, we'll break down some popular load-balancing techniques and show you how to set them up using NGINX. Share your favorite load-balancing strategy in the comments and tell us how it helped solve your problem.

1. Round Robin


When to Use It: Perfect for spreading requests evenly when your servers are all pretty similar.

What's It About: Think of it like taking turns. Each server gets a request in order, one after the other. It's simple and works great when all your servers are equally capable.

Downside: Doesn't account for server load or capacity differences, which can lead to uneven performance if servers vary in power.

How to Set It Up in NGINX:

upstream backend {
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}

2. Least Connection


When to Use It: Great for when some servers are busier than others.

What's It About: This one sends traffic to the server with the fewest active connections. It's like choosing the shortest line at the grocery store.

Downside: Can lead to uneven distribution if some servers are slower or have less capacity, as they might still end up with more connections.

How to Set It Up in NGINX:

upstream backend {
    least_conn;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}

3. Weighted Round Robin


When to Use It: Handy when your servers have different strengths.

What's It About: Similar to Round Robin, but you can give some servers more "turns" based on their capacity.

Downside: Requires manual configuration and tuning of weights, which can be complex and needs regular adjustments as server loads change.

How to Set It Up in NGINX:

upstream backend {
    server server1.example.com weight=3;
    server server2.example.com weight=1;
    server server3.example.com weight=2;
}

4. Weighted Least Connection


When to Use It: Best for mixed environments with varying server loads and capabilities.

What's It About: Combines the best of both worlds—Least Connection and Weighted Round Robin.

Downside: Like Weighted Round Robin, it requires careful configuration and monitoring to ensure weights are set correctly.

How to Set It Up in NGINX:

upstream backend {
    least_conn;
    server server1.example.com weight=3;
    server server2.example.com weight=1;
    server server3.example.com weight=2;
}

5. IP Hash


When to Use It: Perfect for keeping users connected to the same server.

What's It About: Uses the client's IP address to decide which server to use, ensuring consistency.

Downside: Can lead to uneven distribution if a large number of users share the same IP range, and doesn't handle server failures gracefully.

How to Set It Up in NGINX:

upstream backend {
    ip_hash;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}

6. Least Response Time


When to Use It: Ideal when speed is everything.

What's It About: Sends requests to the server that responds the fastest. Open-source NGINX doesn't support this out of the box: NGINX Plus offers it via the least_time method, and third-party options like the Nginx Upstream Fair Module provide something similar for open-source builds.

Downside: Requires additional monitoring and third-party modules, which can add complexity and potential points of failure.


7. Random


When to Use It: Good for testing or when you just want to mix things up.

What's It About: Randomly picks a server for each request. Open-source NGINX has supported this natively via the random directive since version 1.15.1; on older versions you'd need a third-party module like the Nginx Random Load Balancer Module.

Downside: Can lead to uneven load distribution and isn't suitable for production environments where performance is critical.


8. Least Bandwidth


When to Use It: Useful when bandwidth usage is all over the place.

What's It About: Directs traffic to the server using the least bandwidth. For this one, you'll need some custom setup like custom scripts or monitoring tools.

Downside: Requires custom monitoring and setup, which can be complex and resource-intensive.


Other Cool Load Balancing Tricks

  1. Geolocation-Based: Directs traffic based on where users are located. Great for reducing latency.
  2. Consistent Hashing: Keeps requests going to the same server, even if the server pool changes. Perfect for caching systems.
  3. Custom Load Balancing: Tailor it to your needs with custom scripts or Lua scripting in NGINX.
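For the consistent-hashing trick above, open-source NGINX supports it through the hash directive with the consistent parameter (ketama-style hashing). A sketch keyed on the request URI, with placeholder server names:

```nginx
# Consistent hashing on the request URI: when a server is added or
# removed, only a fraction of keys get remapped, which keeps cache
# hit rates high in caching tiers.
upstream backend {
    hash $request_uri consistent;
    server server1.example.com;
    server server2.example.com;
    server server3.example.com;
}
```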

Conclusion

Choosing the right load-balancing strategy is all about understanding your app's needs. NGINX is super flexible and can easily handle many of these strategies. Whether you're using built-in methods or third-party modules, there's a solution out there for you. Just be mindful of the potential downsides and plan accordingly. Please share your favorite load-balancing strategy in the comments. Happy balancing!

Top comments (40)

Raghavendra Kedlaya

The article is well-covered and addresses an essential topic.

While the article primarily discusses load balancing, load balancing also serves as a failover mechanism.

I believe the concept of load balancing goes beyond distributing traffic across web servers. Many front-end servers deliver UI elements, JavaScript, and assets. Modern browsers leverage caching, and powerful front-end technologies like Angular and React provide data processing and deliver a seamless user experience.

From a portal’s perspective, the heavier workload typically lies within the application’s middle-tier or API servers. These servers interface with databases, process requests, and prepare data tailored to user needs. They need to manage concurrency, heavy data operations, and resource competition efficiently.

In my experience, effective load-balancing can also involve functional segregation of API servers. For instance, I’ve implemented setups where multiple API server groups with identical functionality are isolated by functional criteria, such as separating data by State of the Country or business unit. User requests are routed to the appropriate API server group based on the user group, ensuring both better performance and logical isolation.

Ravin Rau

Thank you very much @rkedlaya for your input. You are spot on—load balancing isn't just about spreading traffic across servers. It's also key for keeping things running smoothly and acting as a backup when needed.

I love your idea of splitting API servers based on function. It shows how load balancing can be customized to fit specific needs. By directing requests based on function, we boost performance and keep things organized, which is crucial for handling lots of data and resources efficiently. Maybe I can cover this in an article in the future.

Franklin Thaker

very well written, thanks for sharing. very very helpful.

Ravin Rau

Thank you very much @franklinthaker. I'm happy I could share what I learned, and I'm glad others are finding it helpful.

Franklin Thaker

Keep posting <3

Sohail SJ | TheZenLabs

Easy and Great Read!

Ravin Rau

Thank you very much @thesohailjafri

ammar629

As someone who is learning about backend development, this was a great way to understand the load-balancing concept. Thank you!

Ravin Rau

Thank you so much @ammar629 and all the best on your backend journey. I have learned a few things on the backend side that I will be sharing soon.

ammar629

I'm following you and looking forward to what you write next

Vijay Koushik, S. 👨🏽‍💻

Excellent job! Your concise and straightforward explanation helped me quickly grasp multiple load balancing approaches. Keep up the good work!

Ravin Rau

Thank you very much @svijaykoushik, I keep it concise to make it easy for me to grasp the approaches from time to time. I am glad that it is useful for others too.

Ben

Good, Thank you so much!

Ravin Rau

Thank you very much @sjhjane

raman000

Good content .

Ravin Rau

Thank you very much @raman000

Lakh Bawa

Thanks for sharing

Ravin Rau

Thank you very much @bawa_geek for reading it.

Indranil Kamulkar

Fantastic explanation, short and sweet and easy, helped me a lot

Ravin Rau

Thank you very much @indranil_kamulkar_62bc9a2. Glad that it helped you.

Uchechukwu Noble • Edited

I'm new to backend development using Django. How do I implement this in my projects?
@juniourrau

Ravin Rau

Hi @uchechukwu_noble_28129eb5, honestly I haven't tried Django properly yet, but based on my understanding, here is what I have in mind.

Normally, when you host or deploy a Django app, you will need Gunicorn as the WSGI server/process manager to manage the worker processes that communicate with your Django application. You can set NGINX in front of it to handle the HTTP requests.

Normal Setup

If you want to use NGINX as a load balancer, that is when you have multiple servers hosting your Django application and you want to distribute the load across them properly. You can set it up like in the article above, based on your strategy.

Django Setup with Load Balancer

My recommendation is to first go with the normal setup and understand how the pieces work together; later on, you can start experimenting with the load balancer once you understand how to deploy your application with Docker.
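As a rough sketch, the normal setup with NGINX fronting a single Gunicorn instance could look like this; the server name, the Gunicorn port (8000), and the header choices are illustrative assumptions, not a definitive config:

```nginx
# Sketch: NGINX reverse-proxying to a Gunicorn-served Django app.
# Hostname and the Gunicorn bind address (127.0.0.1:8000) are assumptions.
server {
    listen 80;
    server_name example.com;

    location / {
        proxy_pass http://127.0.0.1:8000;  # Gunicorn listening here
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

Once you move to multiple application servers, the proxy_pass target becomes an upstream block like the ones shown in the article.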
