Rename storageAutoscaling to storageScaling in values and templates#813

Merged
jvpasinatto merged 4 commits into percona:main from amerello:patch-1
Mar 9, 2026

Conversation

@amerello
Contributor

@amerello amerello commented Mar 9, 2026

The psmdb-db chart was rendering the field storageAutoscaling in the PerconaServerMongoDB custom resource, but the operator expects storageScaling. This caused autoscaling configurations to be silently ignored.

Solution:
- Updated charts/psmdb-db/templates/cluster.yaml to render the correct storageScaling field.
- Renamed .Values.storageAutoscaling to .Values.storageScaling in values.yaml and the template for consistency.
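The rename can be sketched as follows. The surrounding keys and values are assumptions for illustration; only the field rename itself is taken from the PR description:

```yaml
# values.yaml (sketch) -- the field the operator actually reads.
# Before this PR the same block was named `storageAutoscaling`,
# which the operator does not recognize.
storageScaling:
  enabled: true
```

A plausible shape for the corresponding template change in charts/psmdb-db/templates/cluster.yaml, assuming the values block is passed through verbatim into the CR spec:

```yaml
# cluster.yaml template (sketch, not the exact chart source):
# render the renamed key under the PerconaServerMongoDB spec
{{- if .Values.storageScaling }}
  storageScaling:
{{ .Values.storageScaling | toYaml | indent 4 }}
{{- end }}
```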

@it-percona-cla

it-percona-cla commented Mar 9, 2026

CLA assistant check
All committers have signed the CLA.

@amerello
Contributor Author

amerello commented Mar 9, 2026

This should fix #812

@egegunes
Contributor

egegunes commented Mar 9, 2026

@amerello could you please update README as well?

@amerello
Contributor Author

amerello commented Mar 9, 2026

> @amerello could you please update README as well?

@egegunes thank you for your feedback. I had missed that.

@jvpasinatto
Contributor

Thanks @amerello. Could you also bump the version field in Chart.yaml to 1.22.1? You can keep appVersion as it is.
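The requested bump touches only the chart's own version; appVersion stays as it is. A minimal sketch of charts/psmdb-db/Chart.yaml, where every field except the version values discussed here is an assumption:

```yaml
# Chart.yaml (sketch) -- bump the chart version for this fix,
# leave appVersion untouched per the reviewer's request
apiVersion: v2
name: psmdb-db            # assumed chart name
version: 1.22.1           # chart version bumped to 1.22.1
appVersion: "1.22.0"      # assumed value; unchanged by this PR
```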

@jvpasinatto jvpasinatto merged commit 1aa5e0b into percona:main Mar 9, 2026
2 checks passed
@jvpasinatto
Contributor

Thanks for the contribution @amerello!

@amerello amerello deleted the patch-1 branch March 9, 2026 11:45
@amerello
Contributor Author

amerello commented Mar 9, 2026

> Thanks for the contribution @amerello!

Thank you for the quick review!
