
Feature/SK-1081 | Use stores in Combiner + ModelPredict #718

Open
wants to merge 18 commits into base: master

Conversation

@Wrede Wrede commented Oct 2, 2024

This pull request refactors the terminology from "inference" to "prediction" across the codebase, and switches the combiner from mongostatestore.py to the new stores. It also adds a ModelPredict message and a SendModelPrediction method to the proto definitions to enable prediction (inference) workflows. The most important changes are listed below:

Breaking change:
endpoint /api/v1/infer -> /api/v1/predict
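Since this endpoint rename is breaking, downstream clients must update their calls. A minimal, hypothetical migration helper (not part of FEDn; names are illustrative) sketches the rewrite:

```python
# Hypothetical migration helper -- not FEDn code; shown only to
# illustrate the breaking endpoint rename in this PR.
OLD_ENDPOINT = "/api/v1/infer"
NEW_ENDPOINT = "/api/v1/predict"

def migrate_endpoint(path: str) -> str:
    """Rewrite the deprecated infer path to the new predict path."""
    return NEW_ENDPOINT if path == OLD_ENDPOINT else path

print(migrate_endpoint("/api/v1/infer"))   # /api/v1/predict
print(migrate_endpoint("/api/v1/models"))  # unchanged
```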

Terminology Refactor (Inference to Prediction)

API Changes

Minor Improvements and Bug Fixes

These changes ensure consistency in terminology and improve the clarity and functionality of the codebase.

@github-actions github-actions bot added feature New feature or request minor labels Oct 2, 2024
@ahellander ahellander left a comment

Looks good! But should we not call them "ModelPrediction" ?


Wrede commented Oct 9, 2024

Looks good! But should we not call them "ModelPrediction" ?

hmm yes maybe

@Wrede Wrede changed the title Feature/SK-1081 | Use stores in Combiner + ModelInference Feature/SK-1081 | Use stores in Combiner + ModelPredict Oct 11, 2024
@Wrede Wrede left a comment


had some comments

-    def _process_inference_request(self, model_id: str, session_id: str, presigned_url: str):
-        """Process an inference request.
+    def _process_prediction_request(self, model_id: str, session_id: str, presigned_url: str):
+        """Process an prediction request.

"a" prediction

@@ -7,7 +7,7 @@ class ClientState(Enum):
     idle = 1
     training = 2
     validating = 3
-    inferencing = 4
+    predicting = 4

We can also add the ClientStateToString for "predicting"
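A sketch of what that helper could look like, assuming the ClientState enum from the diff above (the function name follows the existing ClientStateToString convention, but the exact mapping and fallback are assumptions):

```python
from enum import Enum

class ClientState(Enum):
    idle = 1
    training = 2
    validating = 3
    predicting = 4

def ClientStateToString(state: ClientState) -> str:
    # Sketch only -- the real FEDn helper may use different casing
    # or a different fallback for unknown states.
    names = {
        ClientState.idle: "IDLE",
        ClientState.training: "TRAINING",
        ClientState.validating: "VALIDATING",
        ClientState.predicting: "PREDICTING",
    }
    return names.get(state, "UNKNOWN")

print(ClientStateToString(ClientState.predicting))  # PREDICTING
```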


try:
_ = self.combinerStub.SendModelPrediction(prediction, metadata=self.metadata)
status_type = fedn.StatusType.INFERENCE

We can change StatusType.INFERENCE to StatusType.MODEL_PREDICTION in fedn.proto
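Until the proto enum is renamed, code that must work against both old and new generated stubs could use a small compatibility lookup. This shim is purely illustrative, not FEDn code:

```python
from types import SimpleNamespace

# Hypothetical compatibility shim: prefer the new MODEL_PREDICTION
# status if the generated fedn module defines it, otherwise fall
# back to the old INFERENCE value.
def prediction_status_type(status_type_enum):
    return getattr(status_type_enum, "MODEL_PREDICTION",
                   getattr(status_type_enum, "INFERENCE", None))

# Example with stand-ins for the generated enum (values are made up):
old_stub = SimpleNamespace(INFERENCE=5)
new_stub = SimpleNamespace(MODEL_PREDICTION=9, INFERENCE=5)
print(prediction_status_type(old_stub))  # 5
print(prediction_status_type(new_stub))  # 9
```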

:type model_id: str
:param config: the model configuration to send to clients
:type config: dict
:param clients: the clients to send the request to
:type clients: list

"""
-        clients = self._send_request_type(fedn.StatusType.INFERENCE, session_id, model_id, clients)
+        clients = self._send_request_type(fedn.StatusType.INFERENCE, prediction_id, model_id, clients)

see other comment about StatusType

Labels
feature New feature or request github minor

3 participants