This repository was archived by the owner on May 20, 2025. It is now read-only.
**docs/guides/python/ai-podcast-part-1.mdx** (+3 −3)
````diff
@@ -478,9 +478,9 @@ You should get a similiar result to before. The main difference is that the mode
 
 ## Defining our service docker images
 
-So that the AI workload can use GPUs in the cloud we'll need to make sure it ships with drivers and libraries to support that. We can do this by specifying a custom Dockerfile for our batch service under `docker/torch.dockerfile`.
+So that the AI workload can use GPUs in the cloud we'll need to make sure it ships with drivers and libraries to support that. We can do this by specifying a custom Dockerfile for our batch service under `torch.dockerfile`.
 
-```dockerfile title: docker/torch.dockerfile
+```dockerfile title: torch.dockerfile
 # The python version must match the version in .python-version
 FROM ghcr.io/astral-sh/uv:python3.11-bookworm-slim AS builder
````
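The diff only shows the first line of the builder stage. For context, a multi-stage `uv` image of this shape typically follows the pattern below; this is a sketch of that general pattern, not the guide's actual file, and the stage layout, `COPY` paths, and runtime base image are assumptions (only the builder image and the `$HANDLER` entrypoint appear in the diff).

```dockerfile
# Sketch of a typical uv multi-stage build; not the guide's exact torch.dockerfile.
# The python version must match the version in .python-version
FROM ghcr.io/astral-sh/uv:python3.11-bookworm-slim AS builder
WORKDIR /app
COPY . .
# Install locked dependencies (including the CUDA-enabled torch wheels) into .venv
RUN uv sync --frozen --no-dev

# Runtime stage with a matching Python, without the build tooling
FROM python:3.11-slim-bookworm
WORKDIR /app
COPY --from=builder /app /app
# Put the project's virtual environment on PATH
ENV PATH="/app/.venv/bin:$PATH"
# HANDLER is expected to be set to the service's entrypoint script
ENTRYPOINT python -u $HANDLER
```

Shipping the CUDA libraries via the PyTorch wheels (rather than a full CUDA base image) keeps the runtime stage small while still letting the workload use GPUs in the cloud.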
```diff
@@ -549,7 +549,7 @@ ENTRYPOINT python -u $HANDLER
 
 We'll also add a dockerignore file to try and keep the image size down.
```
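The surrounding text mentions a dockerignore file for keeping the image small. A minimal example of what such a file might contain is shown below; the specific entries are assumptions based on a typical Python project layout, not taken from the guide.

```
# Keep the build context (and resulting image) small
.git/
.venv/
__pycache__/
*.pyc
```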