Load Balancer for Managed Kubernetes Service
Ensure seamless application performance by intelligently distributing incoming traffic across multiple servers or nodes. Our real-time load balancing solution helps you efficiently manage traffic surges, maintain high availability, and deliver secure, reliable experiences for users, no matter the demand.
Reserved for Managed Kubernetes Service users
Why choose a Load Balancer for Managed Kubernetes Service?
99.99% availability
Ensure maximum uptime and performance with a load-balancing solution built on a distributed architecture, designed for resilience and reliability.
Automated node management
Stay operational even during failures. When a node experiences issues, it’s automatically removed from the load pool, making outage handling and maintenance seamless.
Directly integrated into Kubernetes
Use case examples
High-traffic web applications
Is your application generating high volumes of traffic, with an increasing number of visits? With the OVHcloud Load Balancer solution, you can effortlessly manage this growth by adding new nodes to your configuration in just a few clicks.
Variable or seasonal activity
News and e-commerce websites can see sudden, significant changes in traffic. Whether traffic rises or falls, the Load Balancer adapts how it distributes requests across your nodes.
Upgrades applied with no interruptions
With the rolling upgrade system, your application nodes are upgraded without downtime: the Load Balancer automatically removes each node from the pool while it is being upgraded, then adds it back once the upgrade is complete.
You may also like
Your questions answered
What is load balancing in the cloud?
Load balancing is an operation that distributes a workload among several elements capable of performing the required task. In the cloud, the load most often takes the form of network connections, also known as service requests. Cloud load balancers offer scalability, reliability, and automation by efficiently managing spikes in traffic and distributing workloads across multiple nodes.
How does a load balancer work?
Load balancing follows rules set by the operator. When only network connections are considered, a flat (round-robin) or weighted distribution is most often selected. For application-level distribution, you can instead choose routing rules based on the content served or on user identification.
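As an illustration, the two connection-level strategies mentioned above, flat and weighted distribution, can be sketched in a few lines of Python (the node addresses and weights are invented for the example):

```python
import random

# Hypothetical backend nodes and their weights: the first node
# receives roughly twice the share of each of the other two.
WEIGHTS = {"10.0.0.1": 2, "10.0.0.2": 1, "10.0.0.3": 1}

def pick_node(weights):
    """Weighted distribution: a node's chance of being chosen is
    proportional to its configured weight."""
    nodes = list(weights)
    return random.choices(nodes, weights=[weights[n] for n in nodes], k=1)[0]

def round_robin(nodes):
    """Flat distribution: cycle through the nodes in order,
    sending one request to each in turn."""
    i = 0
    while True:
        yield nodes[i % len(nodes)]
        i += 1
```

Real load balancers implement these policies in the data path, but the selection logic itself is this simple: flat distribution ignores node capacity, while weights let you send more traffic to larger nodes.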
What is a load balancer?
The purpose of a load balancer is to distribute the workload between different servers or applications. It can be set up on both physical and virtual infrastructure. Load-balancing software takes the form of an application delivery controller (ADC), which you can use to scale the workload automatically depending on traffic forecasts. In real time, the ADC identifies which server or application is best suited to meet a request, so that the cluster maintains a stable level of performance. In the event of an outage, it is also responsible for redirecting traffic to a resource capable of handling it. Several configuration types are available.
In the timeline, the load balancer intervenes between the user and the server. It analyses the request, determines which machine is available to respond to it, and then forwards it to the machine in question. It can also add servers as required.
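The request path described above can be sketched as follows. This is a hypothetical, minimal in-memory model, not OVHcloud's implementation: the balancer sits between the user and the servers, skips nodes marked as unavailable, and picks the next healthy backend for each request.

```python
class LoadBalancer:
    """Minimal sketch of the request path: receive a request,
    pick a healthy backend, and forward the request to it."""

    def __init__(self, backends):
        self.backends = list(backends)  # e.g. node addresses
        self.down = set()               # nodes removed after failed health checks
        self._next = 0

    def mark_down(self, backend):
        """Remove a failing node from the pool (outage handling)."""
        self.down.add(backend)

    def mark_up(self, backend):
        """Return a recovered node to the pool."""
        self.down.discard(backend)

    def route(self, request):
        pool = [b for b in self.backends if b not in self.down]
        if not pool:
            raise RuntimeError("no healthy backend available")
        backend = pool[self._next % len(pool)]
        self._next += 1
        # In a real balancer this is where the connection is
        # opened and the request proxied to the chosen backend.
        return backend, request
```

Adding a server "as required" is then just appending it to `backends`; removing a failed one is a `mark_down` call, after which traffic flows only to the remaining healthy nodes.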
Note that distributing load is just one of several uses of a load balancer. It can also offload SSL/TLS termination from your servers, or help you update groups of applications without interruption. You can even use it to route traffic for your domain names.
How does the load balancer work with Kubernetes?
When you start using Kubernetes for your applications, the issue of external traffic is an important factor to consider. This topic is briefly discussed on the official Kubernetes website, but we will provide some details.
Firstly, there are several ways of routing external traffic to your cluster:
- using a proxy with the ClusterIP;
- defining a service as a NodePort;
- declaring a service as a load balancer, and exposing it to external traffic (the most common method).
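As a sketch of the third and most common method, a Kubernetes Service of type LoadBalancer can be declared like this (the names and ports are illustrative):

```yaml
apiVersion: v1
kind: Service
metadata:
  name: my-app            # illustrative Service name
spec:
  type: LoadBalancer
  selector:
    app: my-app           # label selecting the Pods to expose
  ports:
    - port: 80            # port exposed by the load balancer
      targetPort: 8080    # port the application listens on inside the Pods
```

Once applied, the cloud provider provisions a load balancer and assigns it an external IP, which appears in the Service's `EXTERNAL-IP` column in `kubectl get services`.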
END OF LIFE (EoL) ALERT: The Load Balancer for Managed Kubernetes Service is at End of Sale (EoS) and End of Support (EoS). Its End of Life (EoL) is scheduled for July 2026. You must now use the new Public Cloud Load Balancer for all your projects. More information here.


