fix: disable HPA downscaling by default #58

Merged: 1 commit into caas-team:main, Jun 20, 2024

Conversation

@pedroapero (Contributor) commented on Jun 3, 2024

Motivation

The default downtime replica count is 0, but the minReplicas field of a HorizontalPodAutoscaler cannot be 0.
Hence the default installation crashes when there are HorizontalPodAutoscalers without the downscaler/downtime-replicas annotation.
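
For context, a minimal HPA sketch (names are hypothetical) showing the constraint: the API requires minReplicas to be at least 1, and without a downscaler/downtime-replicas annotation the downscaler falls back to its default downtime replica count of 0 for this object:

```yaml
# Hypothetical HPA without a downscaler/downtime-replicas annotation.
# The API rejects minReplicas: 0 by default, so the downscaler's default
# downtime replica count of 0 cannot be applied to it.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: example-hpa            # hypothetical name
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: example-app          # hypothetical target
  minReplicas: 1               # must be >= 1
  maxReplicas: 5
```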

Changes

Disable downscaling of HorizontalPodAutoscalers by default; it cannot work out of the box, because each HPA requires a manual downscaler/downtime-replicas annotation before it can be downscaled.
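
A hedged sketch of what the changed default could look like in the chart's Helm values; the key name includedResources and the exact resource list are assumptions modeled on kube-downscaler's --include-resources option, not a verbatim excerpt of this chart's values.yaml:

```yaml
# Sketch of the default Helm values after this change (key name and list are assumptions):
# horizontalpodautoscalers is dropped from the kinds handled by default, so HPAs are
# only downscaled when an operator adds the kind back and annotates each HPA.
includedResources:
  - deployments
  - statefulsets
  # - horizontalpodautoscalers   # opt-in only; needs downscaler/downtime-replicas annotations
```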

Tests done

  1. deploy an HPA with no downscaler/downtime-replicas annotation
  2. deploy py-kube-downscaler with the default Helm values
  3. verify that py-kube-downscaler does not crash when downscaling happens

The default downtime minimum replicas is 0, and the minReplicas field
can't be 0 in a HorizontalPodAutoscaler.

This prevents the default installation from crashing when there are
HorizontalPodAutoscalers without the downscaler/downtime-replicas
annotation.
@pedroapero (Contributor, Author) commented on Jun 4, 2024

Note: an HPA does not scale a Deployment whose replica count is 0, so even with HPAs in place, all Deployments will still be downscaled by default.

Rebased to include the cobertura workflow fix.
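
For operators who still want HPAs downscaled, a hedged sketch of the opt-in: re-enable the resource kind in the Helm values and annotate each HPA with downscaler/downtime-replicas set to at least 1 (the value 1 and all names here are illustrative):

```yaml
# Hypothetical HPA explicitly opted in to downscaling.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: example-hpa                      # hypothetical name
  annotations:
    downscaler/downtime-replicas: "1"    # must be >= 1, since minReplicas cannot be 0
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: example-app                    # hypothetical target
  minReplicas: 1
  maxReplicas: 5
```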

@eumel8 (Member) left a comment

LGTM

@JTaeuber JTaeuber merged commit 7448852 into caas-team:main Jun 20, 2024
1 check passed