Commit 6aa3f2b

fix(gpu): fix broken link to H100 SXM (#5519)
1 parent 05f7eee

File tree

1 file changed: 1 addition, 1 deletion


pages/gpu/reference-content/choosing-gpu-instance-type.mdx

Lines changed: 1 addition & 1 deletion
```diff
@@ -56,7 +56,7 @@ Remember that there is no one-size-fits-all answer, and the right GPU Instance t
 | Better used for | Image / Video encoding (4K) | 7B LLM Fine-Tuning / Inference | 70B LLM Fine-Tuning / Inference |
 | What they are not made for | Large models (especially LLM) | Graphic or video encoding use cases | Graphic or video encoding use cases |
 
-| | **[H100-SXM-2-80G](https://www.scaleway.com/en/)** | **[H100-SXM-4-80G](https://www.scaleway.com/en/)** | **[H100-SXM-8-80G](https://www.scaleway.com/en/)** |
+| | **[H100-SXM-2-80G](https://www.scaleway.com/en/h100-pcie-try-it-now/)** | **[H100-SXM-4-80G](https://www.scaleway.com/en/h100-pcie-try-it-now/)** | **[H100-SXM-8-80G](https://www.scaleway.com/en/h100-pcie-try-it-now/)** |
 |--------------------------------------------------------------------|-------------------------------------------------------------------|-------------------------------------------------------------------|-------------------------------------------------------------------|
 | GPU Type | 2x [H100-SXM](https://www.nvidia.com/en-us/data-center/h100/) SXM | 4x [H100-SXM](https://www.nvidia.com/en-us/data-center/h100/) SXM | 8x [H100-SXM](https://www.nvidia.com/en-us/data-center/h100/) SXM |
 | NVIDIA architecture | Hopper 2022 | Hopper 2022 | Hopper 2022 |
```
