
Manage variable traffic loads on your application
Access is reserved for users of the Managed Kubernetes® Service.
As your business grows and your application faces more varied traffic, it is vital to maintain the same level of service. This is why cloud applications are usually built on distributed architectures: spread across several nodes, they are more robust and can absorb peak loads. With our Load Balancer, you can securely and automatically balance your application’s load in real time, across several nodes.
99.99% availability
The OVHcloud Load Balancer is itself built on a distributed architecture, and is designed to deliver a high level of availability and resilience.
Automated node management
If a node stops working properly, it is automatically removed from the pool of nodes available for balancing. This also means you can carry out maintenance operations without causing downtime.
Directly integrated into Kubernetes
The Load Balancer exposes an interface that is directly compatible with Kubernetes, so you can control it with native tools.
Use cases
Web applications with high volumes of traffic
Is your application generating high volumes of traffic, with an increasing number of visits? With the OVHcloud Load Balancer solution, you can manage this growth by adding new nodes to your configuration in just a few clicks.
Variable or seasonal activity
Information websites and online stores can experience very quick variations in traffic volume. Whether it is increasing or decreasing, the Load Balancer will adapt how it distributes traffic.
Upgrades applied with no interruptions
With the rolling upgrade system, your application nodes can be upgraded without any interruption: the Load Balancer reacts quickly, automatically adding and removing nodes as the upgrade progresses.
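As an illustration, the sketch below shows a hypothetical Deployment configured for a rolling update; the names (my-app, my-app:1.1) are placeholders, and the strategy fields are standard Kubernetes settings rather than anything specific to the OVHcloud Load Balancer.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-app
spec:
  replicas: 3
  strategy:
    type: RollingUpdate
    rollingUpdate:
      maxUnavailable: 1     # at most one node is out of rotation at any time
      maxSurge: 1           # one extra node may be created during the upgrade
  selector:
    matchLabels:
      app: my-app
  template:
    metadata:
      labels:
        app: my-app
    spec:
      containers:
      - name: my-app
        image: my-app:1.1   # changing this image triggers the rolling upgrade

During the upgrade, nodes running the old version are removed from the Load Balancer and nodes running the new version are added as soon as they become ready.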
Specifications
Our Load Balancer solution is under constant development. The service currently operates with the following limits:
Metric | Limit
TCP | 10,000 connections
HTTP | 2,000 req/s
Bandwidth | 200 Mbps
We will soon offer more flexibility and resources to suit greater requirements.
Usage
For Kubernetes:
Create a Load Balancer
kubectl apply -f load_balancer.yaml
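For reference, a minimal load_balancer.yaml could look like the sketch below. The label my-app and the ports are placeholders to adapt to your own application; the service name matches the load-balancer used in the surrounding commands.

apiVersion: v1
kind: Service
metadata:
  name: load-balancer
spec:
  type: LoadBalancer        # asks Kubernetes to provision an external Load Balancer
  selector:
    app: my-app             # pods carrying this label receive the balanced traffic
  ports:
  - name: http
    port: 80                # port exposed by the Load Balancer
    targetPort: 8080        # port the application pods listen on

Once the service has been created, kubectl get service load-balancer shows the external IP address assigned to it.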
Delete a Load Balancer
kubectl delete service load-balancer
Features
Expedited interaction
Create a Load Balancer in minutes, and update it almost instantly. This means you can be well-prepared for managing traffic spikes.
Kubernetes interface
Create and manage your Load Balancer directly via Kubernetes.
Proxy protocol
To preserve the client’s original IP address, the Load Balancer supports the PROXY protocol. This means you can perform essential actions on the nodes, such as filtering IP addresses, generating statistics, and analyzing logs.
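With a Kubernetes Service, this is typically enabled through an annotation. The sketch below assumes the service.beta.kubernetes.io/ovh-loadbalancer-proxy-protocol annotation with the value v1; both the annotation name and the supported values should be checked against the current OVHcloud documentation.

apiVersion: v1
kind: Service
metadata:
  name: load-balancer
  annotations:
    # assumed OVHcloud annotation enabling the PROXY protocol (to be verified in the documentation)
    service.beta.kubernetes.io/ovh-loadbalancer-proxy-protocol: "v1"
spec:
  type: LoadBalancer
  selector:
    app: my-app
  ports:
  - port: 80
    targetPort: 8080

Your application, or the reverse proxy in front of it, must also be configured to parse the PROXY protocol header, otherwise incoming connections will be rejected.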
IP address filtering
You can set a default access policy and provide a restricted list of IP addresses that are allowed to connect to your solution.
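In Kubernetes, one way to express such a restriction is the standard loadBalancerSourceRanges field of the Service, assuming the provider’s controller honours it; the address ranges below are placeholders taken from documentation ranges.

apiVersion: v1
kind: Service
metadata:
  name: load-balancer
spec:
  type: LoadBalancer
  loadBalancerSourceRanges:
  - 203.0.113.0/24          # example range allowed to connect (placeholder)
  - 198.51.100.42/32        # a single allowed address (placeholder)
  selector:
    app: my-app
  ports:
  - port: 80
    targetPort: 8080

Connections from addresses outside these ranges should be rejected before they reach your application nodes.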
Private network connections
To keep your application nodes isolated on a private network, the Load Balancer can act as a gateway between public addressing and your private networks via the OVHcloud vRack.

What is load balancing in the cloud?
Load balancing is an operation that distributes a workload across several elements capable of performing the required task. In the cloud, the load being balanced most often consists of network connections, also known as service requests.
How does a load balancer work?
A load balancer follows rules set up by the operator. When only network connections are considered, a flat or weighted distribution is most often used. For application-level distribution, traffic can instead be routed according to rules based on the content served or on user identification.