
KEDA | Kubernetes Event-driven Autoscaling
With KEDA, you can drive the scaling of any container in Kubernetes based on the number of events needing to be processed. KEDA is a single-purpose and lightweight component that …
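To make the event-driven model concrete, here is a minimal ScaledObject sketch, assuming a hypothetical `worker` Deployment consuming from a hypothetical RabbitMQ queue named `orders`; the trigger type and its metadata would change with your event source.

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: worker-scaledobject        # hypothetical name
  namespace: default
spec:
  scaleTargetRef:
    name: worker                   # hypothetical Deployment to scale
  minReplicaCount: 0               # allow scale-to-zero when the queue is empty
  maxReplicaCount: 10
  triggers:
    - type: rabbitmq               # any supported event source works here
      metadata:
        host: amqp://guest:guest@rabbitmq.default.svc:5672/   # hypothetical broker
        queueName: orders          # hypothetical queue
        mode: QueueLength
        value: "20"                # target messages per replica
```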
KEDA Concepts
What is KEDA? KEDA is a tool that helps Kubernetes scale applications based on real-world events. It was created by Microsoft and Red Hat. With KEDA, you can adjust the size of your …
KEDA | Getting Started
Welcome to the documentation for KEDA, the Kubernetes Event-driven Autoscaler. Use the navigation bar on the left to learn more about KEDA’s architecture and how to deploy and use …
Deploying KEDA
This command installs KEDA in a dedicated namespace (keda). You can customize the installation by passing additional configuration values with --set, allowing you to adjust …
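The command the snippet refers to is not shown; assuming it means the Helm-based method from the KEDA install instructions, a typical install matching the description (dedicated `keda` namespace, customizable via `--set`) would look like this sketch.

```bash
# Add the KEDA Helm repository and refresh the local chart index.
helm repo add kedacore https://kedacore.github.io/charts
helm repo update

# Install KEDA into a dedicated "keda" namespace; extra chart values can be
# overridden with --set key=value (see the chart's values.yaml).
helm install keda kedacore/keda \
  --namespace keda \
  --create-namespace
```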
Scaling Deployments, StatefulSets & Custom Resources - KEDA
With KEDA you can scale any workload defined as a Custom Resource (for example, an ArgoRollout resource). The scaling behaves the same way as scaling for arbitrary Kubernetes …
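A sketch of what targeting a Custom Resource can look like, assuming a hypothetical Argo Rollout named `my-rollout` and a Prometheus trigger; the target kind must expose the `/scale` subresource for KEDA to drive it.

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledObject
metadata:
  name: rollout-scaledobject           # hypothetical name
spec:
  scaleTargetRef:
    apiVersion: argoproj.io/v1alpha1   # API group/version of the custom resource
    kind: Rollout                      # e.g. an Argo Rollout
    name: my-rollout                   # hypothetical resource name
  minReplicaCount: 1
  maxReplicaCount: 20
  triggers:
    - type: prometheus                 # any scaler behaves the same against a CR target
      metadata:
        serverAddress: http://prometheus.monitoring.svc:9090   # hypothetical endpoint
        query: sum(rate(http_requests_total{app="my-rollout"}[2m]))
        threshold: "100"
```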
Scalers - KEDA
A KEDA External Scaler that can obtain metrics from an OTel Collector and use them for autoscaling.
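As a sketch, an external scaler is wired in through the `external` trigger type, which only needs the scaler's gRPC address; the remaining metadata keys are passed through to that scaler, so the address and metric keys below are assumptions that depend on the specific OTel external scaler deployment.

```yaml
# Trigger entry inside a ScaledObject spec (sketch).
triggers:
  - type: external
    metadata:
      scalerAddress: otel-scaler.keda.svc:9090   # hypothetical gRPC endpoint of the external scaler
      metricQuery: "avg(queue_depth)"            # hypothetical, scaler-specific key
      targetValue: "10"                          # hypothetical, scaler-specific key
```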
Scaling Jobs - KEDA
It can be useful to instruct KEDA to pause the autoscaling of objects, to do cluster maintenance or to avoid resource starvation by removing non-mission-critical workloads.
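Pausing is typically done with annotations on the ScaledObject (or ScaledJob); a sketch, with a hypothetical object name:

```yaml
metadata:
  name: worker-scaledobject                      # hypothetical ScaledObject
  annotations:
    autoscaling.keda.sh/paused: "true"           # stop autoscaling activity entirely
    autoscaling.keda.sh/paused-replicas: "0"     # optionally pin the target at a fixed replica count
```

Removing the annotations resumes normal autoscaling.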
Setup Autoscaling with KEDA
Follow the KEDA installation guide carefully, including any prerequisites specific to your Kubernetes setup. The installation guide provides instructions for different installation methods …
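Once the installation method of your choice has run, a quick sanity check (a sketch, assuming the default `keda` namespace) is to confirm the KEDA pods and CRDs exist:

```bash
# Verify the KEDA operator and metrics API server pods are running.
kubectl get pods -n keda

# Verify the KEDA CRDs (ScaledObject, ScaledJob, TriggerAuthentication, ...) are installed.
kubectl get crd | grep keda.sh
```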
AWS SQS Queue - KEDA
When identityOwner is set to operator, the only requirement is that the KEDA operator has the correct IAM permissions on the SQS queue. Additional Authentication Parameters are not …
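A sketch of an `aws-sqs-queue` trigger using the operator's identity, with a hypothetical queue URL and region; because `identityOwner` is `operator`, no `TriggerAuthentication` is attached.

```yaml
triggers:
  - type: aws-sqs-queue
    metadata:
      queueURL: https://sqs.eu-west-1.amazonaws.com/123456789012/orders-queue   # hypothetical queue
      queueLength: "5"              # target messages per replica
      awsRegion: eu-west-1          # hypothetical region
      identityOwner: operator       # the KEDA operator's own IAM permissions are used
```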
Metrics API - KEDA
When the metric provided by the API is equal to or higher than this value, KEDA will start scaling out. When the metric is 0 or less, KEDA will scale down to 0. (This value can be a float.) …
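A sketch of a `metrics-api` trigger, with a hypothetical endpoint and JSON path; KEDA polls the URL, reads the number at `valueLocation`, and compares it to `targetValue`.

```yaml
triggers:
  - type: metrics-api
    metadata:
      url: http://orders-api.default.svc:8080/api/pending   # hypothetical HTTP endpoint returning JSON
      valueLocation: "queue.size"                           # path to the numeric value in the response
      targetValue: "25"                                      # scale out when the metric reaches this value
```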