Add Oncoclear project focused on FastAPI logic #194
Conversation
Check out this pull request on ReviewNB: see visual diffs and provide feedback on Jupyter Notebooks. Powered by ReviewNB.
```python
else:
    logger.info(f"Model promoted to {stage}!")
    is_promoted = True

    # Get the model in the current context
    current_model = get_step_context().model

    # Get the model that is in the production stage
    client = Client()
    try:
        stage_model = client.get_model_version(
            current_model.name, stage
        )
        # We compare their metrics
        prod_accuracy = (
            stage_model.get_artifact("sklearn_classifier")
            .run_metadata["test_accuracy"]
        )
        if float(accuracy) > float(prod_accuracy):
            # If current model has better metrics, we promote it
            is_promoted = True
            current_model.set_stage(stage, force=True)
    except KeyError:
        # If no such model exists, current one is promoted
        is_promoted = True
        current_model.set_stage(stage, force=True)
return is_promoted
```
nitpick comment: you're already reassigning is_promoted to True at the beginning of the else block, so the identical reassignments below are redundant.
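For illustration, a minimal sketch of the same else block with those redundant reassignments dropped, reusing the helpers and variables from the snippet above (a sketch of the suggestion, not the PR's final code):

```python
else:
    logger.info(f"Model promoted to {stage}!")
    is_promoted = True

    # Get the model in the current context
    current_model = get_step_context().model

    # Get the model that currently holds the target stage
    client = Client()
    try:
        stage_model = client.get_model_version(current_model.name, stage)
        # Compare the candidate's accuracy against the staged model's accuracy
        prod_accuracy = (
            stage_model.get_artifact("sklearn_classifier")
            .run_metadata["test_accuracy"]
        )
        if float(accuracy) > float(prod_accuracy):
            # Better metrics: take over the stage (is_promoted is already True)
            current_model.set_stage(stage, force=True)
    except KeyError:
        # No model in that stage yet: promote the current one
        current_model.set_stage(stage, force=True)
return is_promoted
```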
I actually like it that way to make it explicit
oncoclear/steps/model_trainer.py (Outdated)
```python
    model_type: str = "sgd",
    target: Optional[str] = "target",
) -> Annotated[
    ClassifierMixin, ArtifactConfig(name="sklearn_classifier", is_model_artifact=True)
```
since is_model_artifact will be deprecated soon, you can update the function's return type to use the new annotation style:
```python
Annotated[
    ClassifierMixin, ArtifactConfig(name="sklearn_classifier", artifact_type=ArtifactType.MODEL)
]:
```
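For context, a sketch of how the full signature could look with that annotation style, including the imports it relies on (import paths assume a recent ZenML release and may differ slightly in the project; the omitted parameters are left elided):

```python
from typing import Optional

from sklearn.base import ClassifierMixin
from typing_extensions import Annotated
from zenml import ArtifactConfig, step
from zenml.enums import ArtifactType


@step
def model_trainer(
    # ... dataset and other parameters omitted here ...
    model_type: str = "sgd",
    target: Optional[str] = "target",
) -> Annotated[
    ClassifierMixin,
    ArtifactConfig(name="sklearn_classifier", artifact_type=ArtifactType.MODEL),
]:
    ...
```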
marwan37 left a comment:
Other than the 2 minor comments, everything else looks great 👍
Just need to add images now
How to Run:
Detailed instructions are available in the `README.md`, covering setup, dependency installation, ZenML integration installation, and commands to execute each pipeline.

Goal:
To establish the foundational codebase for the `oncoclear` project, providing a robust example of a ZenML-powered MLOps workflow for a standard classification task.
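For readers new to ZenML, a minimal, hypothetical sketch of the kind of pipeline such a project wires together; the step and pipeline names below are illustrative only, not the actual oncoclear code:

```python
from zenml import pipeline, step


@step
def load_data() -> dict:
    """Illustrative step: load a tiny classification dataset."""
    return {"features": [[0.0, 1.0], [1.0, 0.0]], "labels": [0, 1]}


@step
def train_model(data: dict) -> None:
    """Illustrative step: train a classifier on the loaded data."""
    print(f"Training on {len(data['labels'])} samples")


@pipeline
def training_pipeline():
    # Steps are chained by passing artifacts between them
    data = load_data()
    train_model(data)


if __name__ == "__main__":
    # Requires an initialized ZenML repository (`zenml init`)
    training_pipeline()
```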