Activate FP8 cases in test_ops #2144
Conversation
Pull Request Overview
This PR activates FP8 (8-bit floating point) test cases for the XPU backend by removing the global FP8 skip pattern from the test skip list. With FP8 support now available on XPU, the tests that were previously skipped due to the lack of an FP8 implementation can now be enabled.
Key changes:
- Removes the "float8" skip pattern that was blocking all FP8-related tests
- Removes associated comments explaining why FP8 was not supported
- Enables previously skipped FP8 test cases, including `_scaled_mm` and `_refs_eye` operations (a sketch of the skip-list change follows this list)
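The change boils down to deleting one substring pattern from the skip list. Below is a minimal sketch of that mechanism, assuming the skip list is a plain dict mapping test files to tuples of test-name substrings; the file layout, dict name, and placeholder entries are illustrative, not the actual torch-xpu-ops code.

```python
# Hypothetical skip-list layout: test file -> tuple of test-name substrings
# that are skipped on the XPU backend.
skip_dict = {
    "test_ops_xpu.py": (
        # Before this PR, a blanket pattern skipped every FP8 case:
        # "float8",
        # Unrelated entries (placeholder) stay in place:
        "some_unrelated_pattern",
    ),
}

def should_skip(test_name: str, test_file: str = "test_ops_xpu.py") -> bool:
    """Return True if any skip pattern is a substring of the test name."""
    return any(pattern in test_name for pattern in skip_dict.get(test_file, ()))

# With the "float8" entry removed, FP8 cases are no longer filtered out:
assert not should_skip("test_python_ref__refs_eye_xpu_float8_e4m3fn")
```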
@daisyden Please activate these cases that do not work on the XPU backend in stock PyTorch.
@CuiYifeng: Is FP8 already supported for the scope of the activated tests, or will it be supported soon? If the former, we should expect the FP8 tests to execute without failures; if the latter, we should see test failures. I guess the former is true, as I don't see FP8-specific test failures in the summary.
@dvrogozh Thanks for the reminder. The first interpretation is closer to the facts. I have updated "FP8 will be supported" to "more FP8 Ops will be supported".
Since more FP8 ops will soon be supported on XPU, basic FP8 cases should be activated. This PR removes the following cases from the skip list:

```
TestCommonXPU::test_compare_cpu_torch__scaled_mm_xpu_float8_e4m3fn SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_out_torch__scaled_mm_xpu_float8_e4m3fn SKIPPED (Skipped!)
TestCommonXPU::test_python_ref__refs_eye_xpu_float8_e4m3fn SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref__refs_eye_xpu_float8_e4m3fnuz SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref__refs_eye_xpu_float8_e5m2 SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref__refs_eye_xpu_float8_e5m2fnuz SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_executor__refs_eye_executor_aten_xpu_float8_e4m3fn PASSED
TestCommonXPU::test_python_ref_executor__refs_eye_executor_aten_xpu_float8_e4m3fnuz PASSED
TestCommonXPU::test_python_ref_executor__refs_eye_executor_aten_xpu_float8_e5m2 PASSED
TestCommonXPU::test_python_ref_executor__refs_eye_executor_aten_xpu_float8_e5m2fnuz PASSED
TestCommonXPU::test_python_ref_meta__refs_eye_xpu_float8_e4m3fn SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_meta__refs_eye_xpu_float8_e4m3fnuz SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_meta__refs_eye_xpu_float8_e5m2 SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_meta__refs_eye_xpu_float8_e5m2fnuz SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_torch_fallback__refs_eye_xpu_float8_e4m3fn SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_torch_fallback__refs_eye_xpu_float8_e4m3fnuz SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_torch_fallback__refs_eye_xpu_float8_e5m2 SKIPPED (test doesn't work on XPU backend)
TestCommonXPU::test_python_ref_torch_fallback__refs_eye_xpu_float8_e5m2fnuz SKIPPED (test doesn't work on XPU backend)
```
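As a rough illustration of what the re-enabled eye cases above exercise, here is a hedged smoke check, not taken from the test suite; it assumes a PyTorch build with XPU support and FP8 dtypes such as `torch.float8_e4m3fn`.

```python
import torch

# Hypothetical check, not the actual test code: the re-enabled _refs_eye cases
# build identity matrices in FP8 dtypes, which can be mimicked directly.
if torch.xpu.is_available():
    fp8_eye = torch.eye(4, dtype=torch.float8_e4m3fn, device="xpu")
    # Upcast to float32 for comparison; 0.0 and 1.0 are exactly representable in FP8.
    assert torch.equal(fp8_eye.float().cpu(), torch.eye(4))
```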