Respect predefined modes in get_default_mode
#1166
Conversation
Force-pushed from 70eec3c to 3aa4a4d
Force-pushed from 4bb1be8 to 837f98e
Codecov Report
Attention: Patch coverage is

Additional details and impacted files

@@           Coverage Diff           @@
##              main    #1166   +/- ##
=======================================
  Coverage   82.27%   82.27%
=======================================
  Files         186      186
  Lines       48004    48000       -4
  Branches     8624     8621       -3
=======================================
- Hits        39493    39490       -3
- Misses       6352     6353       +1
+ Partials     2159     2157       -2
Force-pushed from 837f98e to 530ddf6
left small comments, but looks great
pytensor/compile/mode.py (Outdated)

    else:
        default_mode_class = string
        # FIXME: This is flawed, we should use proper object comparison.
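To illustrate why the FIXME calls the name-based comparison flawed, here is a minimal, hypothetical sketch (not PyTensor's actual code): two distinct Mode instances share the class name "Mode", so a string match on the class name cannot tell them apart, whereas an identity comparison can.

```python
# Hypothetical sketch: comparing modes by class name vs. by object identity.
class Mode:
    def __init__(self, linker):
        self.linker = linker


default_mode = Mode(linker="cvm")
numba_mode = Mode(linker="numba")

# Name-based check: both instances look identical.
assert type(default_mode).__name__ == type(numba_mode).__name__

# Proper object comparison distinguishes them.
assert default_mode is not numba_mode
```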
What needs to happen for this? Do we need a __hash__ method in the mode class?
I'm not sure, I think I would just stop caching the debug modes and avoid the concern
Ah, we would need to check that the class matches.
Is an isinstance check enough?
It's a string-to-class match. I'll think about it, but let's get the fix merged already.
It can be merged as-is IMO, I'm just asking : )
I am always worried about rabbit holes when I touch some of these arcane parts of the codebase. It makes complete sense to question them!
Cleaned it up by just caching the three possible modes and avoiding the silly check. Please re-review.
Also allow arbitrary capitalization of the modes. Also make linker and optimizer non-mutable config, as the mode is cached after they are first used.
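A hypothetical sketch (assumed names, not PyTensor's actual implementation) of what "arbitrary capitalization" plus caching could look like: the mode name is normalized before it reaches the cache, so "numba", "Numba" and "NUMBA" all resolve to the same cached Mode object.

```python
# Hypothetical sketch of a case-insensitive, cached mode lookup.
from functools import lru_cache


class Mode:
    def __init__(self, name: str, linker: str):
        self.name = name
        self.linker = linker


@lru_cache(maxsize=None)
def _predefined_mode(key: str) -> Mode:
    # key is already upper-cased, so each mode has exactly one cache entry.
    linkers = {"NUMBA": "numba", "JAX": "jax", "PYTORCH": "pytorch"}
    if key not in linkers:
        raise ValueError(f"Unknown mode: {key}")
    return Mode(key, linkers[key])


def get_mode(name: str) -> Mode:
    # Normalize capitalization *before* the cache lookup.
    return _predefined_mode(name.upper())


assert get_mode("numba") is get_mode("NUMBA") is get_mode("Numba")
```

Normalizing before the cached call matters: if `lru_cache` saw the raw name, each capitalization would create a separate Mode instance.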
Force-pushed from 530ddf6 to ff68d22
Due to some silly string-based logic for the comparison, PyTensor was not respecting config.mode changes for NUMBA/JAX/PYTORCH, since these linkers' mode class name is Mode, just like the default.

📚 Documentation preview 📚: https://pytensor--1166.org.readthedocs.build/en/1166/
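An illustrative sketch (assumed names, not the real get_default_mode) of the fix's direction: key the cache on the config string rather than on the mode's class name. The NUMBA, JAX and PYTORCH modes are all plain Mode instances, so a class-name comparison cannot distinguish them, but a string key can.

```python
# Illustrative sketch: cache predefined modes by config string, not class name.
_predefined_modes: dict = {}


class Mode:
    def __init__(self, linker: str):
        self.linker = linker


def get_default_mode(config_mode: str) -> Mode:
    key = config_mode.upper()
    if key not in _predefined_modes:
        _predefined_modes[key] = Mode(linker=key.lower())
    return _predefined_modes[key]


numba_mode = get_default_mode("NUMBA")
jax_mode = get_default_mode("JAX")

# Identical class names, yet distinct cached objects.
assert type(numba_mode).__name__ == type(jax_mode).__name__ == "Mode"
assert numba_mode is not jax_mode
```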