[ENH] Parameter Fitter: Basic implementation#6921
Conversation
Codecov Report

Attention: Patch coverage is

Additional details and impacted files

```
@@            Coverage Diff             @@
##           master    #6921      +/-   ##
==========================================
+ Coverage   88.37%   88.45%   +0.07%
==========================================
  Files         329      331       +2
  Lines       72447    73063     +616
==========================================
+ Hits        64028    64625     +597
- Misses       8419     8438      +19
```
janezd
left a comment
I haven't really reviewed the PR - I haven't even checked it out from the repo. Here are just two things I incidentally noticed. Feel free to ignore.
Orange/base.py (Outdated)

```python
    ("type", Type),
    ("min", Union[int, NoneType]),
    ("max", Union[int, NoneType]),
]
```
What I had in mind was

```python
class FittedParameter(NamedTuple):
    name: str
    label: str
    tick_label: str = ""
    type_: type = int
    min: Optional[int] = None
    max: Optional[int] = None
```

No need to change it; I find this easier on the eye, but it's a matter of taste.
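For illustration, a minimal sketch of how such a `NamedTuple` with defaults behaves (field names taken from the snippet above; the example call with `"n_estimators"` is hypothetical, not from the PR):

```python
from typing import NamedTuple, Optional

class FittedParameter(NamedTuple):
    name: str
    label: str
    tick_label: str = ""
    type_: type = int
    min: Optional[int] = None
    max: Optional[int] = None

# Defaults keep call sites short; only the missing fields need naming:
p = FittedParameter("n_estimators", "Number of trees", min=1)
print(p.min, p.max)  # 1 None
```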
```diff
     fitted_parameter_props: Learner.FittedParameter,
-    initial_parameters: dict,
+    initial_parameters: dict[str, Any],
     steps: Sized,
```
Currently, we'd have dict[str, int], but eventually we'd have dict[str, Union[int, float]]. On the other hand, what if the learner provides a list of string arguments? So I put dict[str, Any]. We can put a humble int there, though, for the time being.
I'd put an int just to emphasize that other types are not supported and should be carefully reconsidered.
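For context, a sketch of the two annotation options under discussion (function names hypothetical; annotations document intent but are not enforced at runtime):

```python
from typing import Any

# Stricter: signals that only int parameter values are supported for now.
def set_parameters_strict(initial_parameters: dict[str, int]) -> None:
    pass

# Looser: leaves room for float or string arguments later.
def set_parameters_loose(initial_parameters: dict[str, Any]) -> None:
    pass

# Both accept any dict at runtime; the difference is what type checkers
# and readers are told about supported value types.
set_parameters_strict({"n_estimators": 10})
set_parameters_loose({"alpha": 0.5})
```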
```diff
     @property
     def initial_parameters(self) -> dict:
-        if not self._learner or not self._data:
+        if not self._learner:
```
I focused on the GUI layout and changed a few minor things I encountered. In the future, we probably want to support:
So, I thought about making some other changes, but decided not to, because they would probably be changed again in that future.
```diff
-    initial_parameters: dict,
-    steps: Sized,
+    initial_parameters: dict[str, Any],
+    steps: Collection[Any],
```
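For context (a sketch, not from the PR): `Sized` only guarantees `len()`, while `Collection` additionally guarantees iteration and membership tests, which makes it the more informative annotation for `steps`:

```python
from typing import Collection

def describe(steps: Collection[int]) -> str:
    # Collection guarantees __len__, __iter__ and __contains__,
    # so we may both measure and iterate over the steps.
    return f"{len(steps)} steps, last={max(steps)}"

print(describe([10, 20, 30]))  # 3 steps, last=30
```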
```python
rect.adjust(style.pixelMetric(style.PM_IndicatorWidth)
            + style.pixelMetric(style.PM_CheckBoxLabelSpacing), 0, 0, 0)
```

```python
last_text = f", {self.__steps[-1]}"
```
I think I've addressed all comments except for
This looks like a problem in
You likely noticed this already, but this PR is not Python 3.9 compatible. Or are we planning to drop support for 3.9? NumPy already did, according to https://numpy.org/neps/nep-0029-deprecation_policy.html.
I added rudimentary documentation. The widget is bound to change significantly in the future, so it's not worth spending too much time on this.
lanzagar
left a comment
- The widget crashes when given a dataset without a class.
- I think the controls in Parameter Fitter should be tied to the Learner input. One should be able to select a parameter to fit and the range of values before connecting the data input (since connecting data already starts the fitting, and I would like it to use the correct settings). Also, refreshing the data, or even inputting a completely different dataset, should not reset the settings in Parameter Fitter. I think it is acceptable for them to be reset when the Learner input is refreshed, but ideally the widget would even try to keep the settings when, e.g., I just change the name of the learner or some of its other parameters (maybe we don't reset to defaults if the class of the learner stays the same?).
```python
def set_learner(self, learner: Optional[Learner]):
    if self._learner:
        self.__initialize_settings = \
            not isinstance(self._learner, type(learner))
```
If you need an exact type, do not use isinstance (it also matches subclasses).
I would do it differently, though: just compare whether the two learners have the same fitted_parameters.
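A small sketch of the difference (class names hypothetical, not from the PR): `isinstance` accepts subclasses, while comparing types directly does not, so the subclass-tolerant check may skip resetting settings when the learner actually changed:

```python
class Learner:
    pass

class TreeLearner(Learner):  # hypothetical subclass
    pass

old, new = Learner(), TreeLearner()

# isinstance is True for subclasses, so settings would NOT be re-initialized:
print(isinstance(new, type(old)))  # True

# An exact-type comparison distinguishes the two learners:
print(type(new) is type(old))      # False
```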


Issue
Implement a Parameter Fitter widget with a `fitted_parameters` method. Includes