
Conversation


@till-m till-m commented Nov 1, 2024

Resolves #93, #191, #308, #376.

Implement typed optimization as described in ref. For an example, see the notebook and also examples/typed_hyperparameter_tuning.py. Some evidence that this can be valuable:
[Figure: discrete_vs_continuous]
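To illustrate the idea behind typed optimization (a minimal sketch only, not the merged API; the real interface is in the notebook and examples/typed_hyperparameter_tuning.py, and all names below are illustrative): when a parameter is declared as an integer, candidate points are coerced to the integer grid before the target is evaluated, instead of the optimizer treating that axis as purely continuous. A toy random search makes the mechanism visible:

```python
import random

# Illustrative sketch only: a toy random search showing how a declared
# integer parameter is snapped to the grid before the target is called.

def target(x: int, y: float) -> float:
    # x is genuinely discrete: the target only ever sees integer values
    return -(x - 3) ** 2 - (y - 0.5) ** 2

def suggest(bounds, types, rng):
    """Draw one candidate, coercing each parameter to its declared type."""
    point = {}
    for name, (lo, hi) in bounds.items():
        val = rng.uniform(lo, hi)
        if types[name] is int:
            val = round(val)  # snap the continuous draw to the integer grid
        point[name] = val
    return point

bounds = {"x": (0, 10), "y": (0.0, 1.0)}
types = {"x": int, "y": float}
rng = random.Random(42)

# Keep the best of 200 typed candidates.
best = max((suggest(bounds, types, rng) for _ in range(200)),
           key=lambda p: target(**p))
print(best["x"], isinstance(best["x"], int))
```

The point of the sketch is the `types` mapping: because coercion happens before evaluation, the surrogate model (here, the random search) can never propose a fractional value for a discrete parameter, which is the behavior the plot above compares against a naive continuous treatment.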


@bwheelz36 bwheelz36 left a comment

Great work! This is very cool.

  • minor: in the new notebook, it might be nice to add type hints, particularly to show that target_function_1d(x) -> int
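The suggested annotation might look like the following (a sketch only: the real target_function_1d lives in the PR's notebook, and the body here is invented for illustration; only the signature reflects the review comment):

```python
# Illustrative stand-in for the notebook's target_function_1d.
# The return annotation makes the integer-valued output explicit.
def target_function_1d(x: float) -> int:
    # Toy integer-valued target; the actual definition is in the notebook.
    return -round((x - 2) ** 2)

print(target_function_1d(2.0), isinstance(target_function_1d(2.0), int))
```

Annotating the return type this way documents, at a glance, that the target is discrete-valued, which is exactly the property the typed-optimization machinery in this PR is designed to exploit.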


till-m commented Nov 24, 2024

Thanks for the review! Once I have some time to finish the custom parameter example, this will be ready for merging.

NB: I might also handle #530 with this PR, as it's technically already implemented here.


till-m commented Dec 10, 2024

@bwheelz36 I added an example with a custom parameter; if you get the chance, it'd be great if you could have a look.


till-m commented Dec 10, 2024

@phi-friday could you have another look over this PR and check if the typing is implemented correctly?


till-m commented Dec 23, 2024

@bwheelz36 @fmfn

I think from my side this is done. I'm looking to merge this pretty soon and make a release afterwards. Apparently you can do beta releases, which might make sense here.

@till-m till-m merged commit 0ef608f into bayesian-optimization:master Dec 27, 2024
13 checks passed
@till-m till-m deleted the parameter-types branch May 21, 2025 09:58

Development

Successfully merging this pull request may close these issues.

Support different data types for optimization parameters

3 participants