Operator {op.__module__}.{op.__name__} is not in Core ATen opset (https://pytorch.org/docs/stable/torch.compiler_ir.html#core-aten-ir)."
There are a few things to try:
1. You can proceed with `to_edge(compile_config=EdgeCompileConfig(_core_aten_ops_exception_list=[torch.ops.{str(op)}]))`.
- Please make sure that the backends you are planning to lower to is able to handle {str(op)}, or you have a corresponding kernel linked to your runtime.
+ Please make sure that the backend(s) you are planning to lower to are able to handle {str(op)}, or you have a corresponding kernel linked to your runtime.

- 2. Sometimes inference and training gives slightly different ops. Try adding `with torch.no_grad():` context manager if you don't care about training.
+ 2. Sometimes inference and training give slightly different op sets. Try adding the `with torch.no_grad():` context manager if you are exporting for inference only.

- 3. If the error persists after 2, this is likely caused by torch.export() + core ATen decomposition produce unexpected operator set on your model.
+ 3. If the error persists after 2, this is likely caused by torch.export() + core ATen decomposition producing unexpected operators for your model.
If you believe this operator should be included into core ATen opset, please create an issue in https://github.com/pytorch/pytorch/issues and add `module: core aten` tag.
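
Option 1 above suppresses the core-ATen check for specific operators at `to_edge()` time. A minimal sketch of how that call might look, assuming an `ExportedProgram` named `ep` produced by `torch.export.export()`; `torch.ops.aten.linalg_vector_norm.default` is purely a placeholder for the operator reported in the error:

```python
import torch
from executorch.exir import EdgeCompileConfig, to_edge

# `ep` is assumed to be an ExportedProgram from torch.export.export(model, example_inputs).
edge_program = to_edge(
    ep,
    compile_config=EdgeCompileConfig(
        # Placeholder op: substitute the operator named in the error message.
        # The target backend (or a kernel linked into your runtime) must still
        # be able to execute this op.
        _core_aten_ops_exception_list=[torch.ops.aten.linalg_vector_norm.default],
    ),
)
```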
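Option 2 applies at export time: running `torch.export.export()` under `torch.no_grad()` keeps training-only ops out of the captured graph. A minimal sketch, where `MyModel` and the example input shape are placeholders:

```python
import torch

model = MyModel().eval()  # MyModel is a placeholder for your nn.Module
example_inputs = (torch.randn(1, 3, 224, 224),)  # placeholder input shape

# Exporting inside no_grad avoids autograd-related ops leaking into the graph.
with torch.no_grad():
    ep = torch.export.export(model, example_inputs)
```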