Update the FAQ docs. #1003
Conversation
jspeerless
left a comment
Some minor changes - but otherwise LGTM!
Comment on the diff line:

```rst
evaluators = project.predictor_evaluations.default(predictor_id=predictor.uid)
```

On the line "Even if the predictor hasn't been registered to platform yet:", suggested change:

```diff
-Even if the predictor hasn't been registered to platform yet:
+You can evaluate your predictor even if it hasn't been registered to platform yet:
```
Comment on the diff context:

```rst
Working With Evaluators
=======================

You can also run your predictor against a list of specific evaluators:
```
I think maybe we should just add a note that Evaluators (such as CrossValidationEvaluators) can be configured the same way they always have been. The only difference is that they can now be executed directly against a predictor to get evaluation results, without the need for a registered Predictor Evaluation Workflow.
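The note suggested above could be sketched like this. This is a minimal stand-in illustration only: the class and method here are hypothetical placeholders meant to show the shape of the change (same configuration, direct execution), not the real citrine-python API.

```python
# Hypothetical stand-in class (NOT the real citrine-python API) sketching
# the reviewer's point: an evaluator is configured exactly as before, but
# can now be executed directly against a predictor, without registering a
# Predictor Evaluation Workflow first.

class CrossValidationEvaluator:
    """Configured the same way it always has been."""

    def __init__(self, name, n_folds=5):
        self.name = name
        self.n_folds = n_folds

    def evaluate(self, predictor_name):
        # Placeholder: a real evaluator would cross-validate the predictor
        # and return per-response metrics.
        return {"evaluator": self.name,
                "predictor": predictor_name,
                "folds": self.n_folds}

# Same configuration step as before:
evaluator = CrossValidationEvaluator(name="cv", n_folds=3)

# New: run directly against a predictor; no workflow registration needed.
result = evaluator.evaluate("predictor-that-evaluates-y")
```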
Force-pushed from 518e98e to 9dd3074
Drop the data manager and v3.0 migration docs, as they shouldn't be needed anymore. Add a FAQ to clearly demonstrate the usage of predictor_evaluators.
Force-pushed from 9dd3074 to 5f24de8
jspeerless
left a comment
Looks Great!
kroenlein
left a comment
What's written looks good.
The example of working with the AI Engine still references

```python
workflow = PredictorEvaluationWorkflow(
    name='workflow that evaluates y',
    evaluators=[evaluator]
)
```

rather than the new workflow. But maybe that's a different PR.
Good catch! I'll knock that out in a quick follow-up, so James can release earlier if he wishes.
Never mind. Those docs are only published on release, and I've already made those (and many more) docs changes in another PR. So we're all set.
PR Type: Adherence to team decisions