Summary:
Pull Request resolved: #999
# Context
Applying `torch.compile` recursively on submodules (rather than once at the top-level module) is a common pattern, especially when targeting Llama architectures where only the self-attention layer(s) should be compiled.
# This Diff
Adds a `recursive_module_types` flag to `TorchCompileParams`. When set, `torch.compile` is applied recursively to any submodule whose type matches one of the given names.
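The behavior described above can be sketched roughly as follows. This is a minimal illustration, not the actual TorchTNT implementation; the helper name `compile_matching_submodules` and the toy `SelfAttention`/`Block` modules are assumptions, and `backend="eager"` is used only to keep the example lightweight.

```python
import torch
import torch.nn as nn


class SelfAttention(nn.Module):
    """Toy stand-in for an attention layer (hypothetical example)."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        return self.proj(x)


class Block(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.attn = SelfAttention(dim)
        self.mlp = nn.Linear(dim, dim)

    def forward(self, x):
        return self.mlp(self.attn(x))


def compile_matching_submodules(module, recursive_module_types):
    # Walk the module tree; compile any child whose type name matches,
    # otherwise keep recursing into its children.
    for name, child in module.named_children():
        if type(child).__name__ in recursive_module_types:
            setattr(module, name, torch.compile(child, backend="eager"))
        else:
            compile_matching_submodules(child, recursive_module_types)
    return module


model = compile_matching_submodules(Block(8), {"SelfAttention"})
out = model(torch.randn(2, 8))
```

Only `model.attn` is wrapped by `torch.compile`; the sibling `mlp` and the top-level `Block` are left uncompiled, which is the point of scoping compilation to specific submodule types.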
Reviewed By: galrotem
Differential Revision: D74410717
fbshipit-source-id: 319d15a109f132a216915d200bbdd04dd2c35871