
Add Actor and Provider Daemonscalers #157

Merged: 2 commits into main from feat/daemonscaler, Sep 5, 2023
Conversation

@brooksmtownsend (Member) commented on Aug 14, 2023:

Feature or Problem

This PR adds a new scaler type, the daemonscaler, a drop-in replacement for the spreadscaler trait. Instead of spreading a fixed number of replicas across a specified set of constraints, the daemonscaler runs replicas on every host in the lattice that matches the requirements of the spread.

Kubernetes developers can think of a spreadscaler as a k8s Deployment with label selectors, and the daemonscaler as a k8s DaemonSet (with optional label selectors). A manifest sketch of each follows below.
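
As a baseline, here is roughly what a spreadscaler trait looks like in a wadm manifest. This is a minimal sketch assuming wadm's standard OAM-style manifest layout; the component name, image reference, spread name, and requirement labels are placeholders rather than values taken from this PR.

    # Sketch of a wadm manifest fragment with a spreadscaler trait: a
    # fixed total replica count is divided across hosts matching the
    # spread requirements.
    spec:
      components:
        - name: echo                                # placeholder component
          type: actor
          properties:
            image: wasmcloud.azurecr.io/echo:0.3.8  # placeholder image
          traits:
            - type: spreadscaler
              properties:
                replicas: 4                         # total count across the lattice
                spread:
                  - name: eastcoast                 # placeholder spread name
                    requirements:
                      zone: us-east-1               # placeholder host label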

More documentation will be needed here (PR forthcoming), but in the meantime, note the following (a daemonscaler sketch follows the list):

  1. The daemonscaler uses exactly the same config as the spreadscaler.
  2. Both daemonscalers ignore weight entirely.
  3. The provider daemonscaler ignores the replicas field, as you can only run one copy of a provider per host.
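
For comparison, here is the daemonscaler version of the same trait. This is likewise a sketch, assuming (per the DaemonSet analogy above) that replicas is interpreted per matching host rather than as a lattice-wide total.

    # Sketch: swapping the trait type in the manifest above turns the
    # spreadscaler into a daemonscaler. The configuration is otherwise
    # identical; any `weight` on a spread is ignored.
    traits:
      - type: daemonscaler
        properties:
          replicas: 1              # assumed per-host count: one copy on each matching host
          spread:
            - name: eastcoast
              requirements:
                zone: us-east-1    # runs on every host carrying this label

For a provider daemonscaler, replicas would be ignored entirely, consistent with item 3 above.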

Related Issues

N/A

Release Information

v0.6.0

Consumer Impact

Testing

Built on platform(s)

  • x86_64-linux
  • aarch64-linux
  • x86_64-darwin
  • aarch64-darwin
  • x86_64-windows

Tested on platform(s)

  • x86_64-linux
  • aarch64-linux
  • x86_64-darwin
  • aarch64-darwin
  • x86_64-windows

Unit Test(s)

Added unit tests for the actor daemonscaler to ensure the spreading logic works.

Acceptance or Integration

Manual Verification

I manually verified that this works both with and without constraints.

@brooksmtownsend (Member, Author) commented on Aug 22, 2023:

  • Note to self: check to make sure this actually works when no spreads are defined

@brooksmtownsend changed the title from "wip: daemonscaler impl" to "Add Actor and Provider Daemonscalers" on Sep 1, 2023
@brooksmtownsend marked this pull request as ready for review on September 1, 2023 at 18:30
@thomastaylor312 (Contributor) left a comment:

Just one minor question around tests. Others are nits

Review threads:

  • src/scaler/daemonscaler/mod.rs (outdated, resolved)
  • src/scaler/daemonscaler/mod.rs (resolved)
  • src/scaler/daemonscaler/provider.rs (resolved)
@thomastaylor312 previously approved these changes on Sep 5, 2023
@brooksmtownsend force-pushed the feat/daemonscaler branch 2 times, most recently from 1b6e3c0 to 17baae7 on September 5, 2023 at 20:17
@brooksmtownsend changed the base branch from main to fix/shared-provider-usage on September 5, 2023 at 20:18
Base automatically changed from fix/shared-provider-usage to main on September 5, 2023 at 20:31
Commits (each signed off by Brooks Townsend <brooks@cosmonic.com>):

  • refactored for correct logic
  • fixed issue with no spread requirements
  • cleanup
  • allow cleanup for provider
  • added test, fix pr comment
  • removed wait for washboard
  • so no head?
@brooksmtownsend merged commit c2b189a into main on Sep 5, 2023 (5 checks passed)
@brooksmtownsend deleted the feat/daemonscaler branch on September 5, 2023 at 21:18