src/content/docs/aws/services/sagemaker.md (25 additions, 26 deletions)
@@ -1,6 +1,5 @@
 ---
 title: "SageMaker"
-linkTitle: "SageMaker"
 description: Get started with SageMaker on LocalStack
 tags: ["Ultimate"]
 ---
@@ -11,13 +10,13 @@ Amazon SageMaker is a fully managed service provided by Amazon Web Services (AWS)
 It streamlines the machine learning development process, reduces the time and effort required to build and deploy models, and offers the scalability and flexibility needed for large-scale machine learning projects in the AWS cloud.

 LocalStack provides a local version of the SageMaker API, which allows running jobs to create machine learning models (e.g., using PyTorch) and to deploy them.
-The supported APIs are available on our [API coverage page]({{< ref "coverage_sagemaker" >}}), which provides information on the extent of Sagemaker's integration with LocalStack.
+The supported APIs are available on our [API coverage page](), which provides information on the extent of SageMaker's integration with LocalStack.

-{{< callout >}}
+:::note
 LocalStack supports custom-built models in SageMaker.
 You can push your Docker image to LocalStack's Elastic Container Registry (ECR) and use it in SageMaker.
 LocalStack will use the local ECR image to create a SageMaker model.
-{{< /callout >}}
+:::

 ## Getting started

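To illustrate the custom-model flow described in the note in the hunk above, here is a minimal, hedged Boto3 sketch of registering a locally built inference image with LocalStack's ECR and referencing it from a SageMaker model. The endpoint URL, repository name `my-model`, local image tag, model name, and execution role ARN are illustrative assumptions, not values taken from the sample application.

```python
import subprocess
import boto3

# All clients point at the LocalStack edge endpoint (assumed default: http://localhost:4566).
# Dummy credentials are sufficient for LocalStack.
endpoint_url = "http://localhost:4566"
session = boto3.Session(
    aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1"
)
ecr = session.client("ecr", endpoint_url=endpoint_url)
sagemaker = session.client("sagemaker", endpoint_url=endpoint_url)

# Create a local ECR repository and read its URI from the response,
# rather than hard-coding LocalStack's registry address.
repo_uri = ecr.create_repository(repositoryName="my-model")["repository"]["repositoryUri"]

# Tag and push a locally built inference image (hypothetical local tag "my-model:latest").
subprocess.run(["docker", "tag", "my-model:latest", f"{repo_uri}:latest"], check=True)
subprocess.run(["docker", "push", f"{repo_uri}:latest"], check=True)

# Create a SageMaker model that points at the image in the local registry.
sagemaker.create_model(
    ModelName="my-custom-model",
    PrimaryContainer={"Image": f"{repo_uri}:latest"},
    ExecutionRoleArn="arn:aws:iam::000000000000:role/sagemaker-role",  # placeholder role
)
```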
@@ -29,46 +28,46 @@ We will demonstrate an application illustrating running a machine learning job u
 - Creates a SageMaker Endpoint for accessing the model
 - Invokes the endpoint directly on the container via Boto3

-{{< callout >}}
+:::note
 SageMaker is a fairly comprehensive API.
 Currently, a subset of the functionality is provided locally, but new features are being added on a regular basis.
-{{< /callout >}}
+:::

 ### Download the sample application

 You can download the sample application from [GitHub](https://github.com/localstack/localstack-pro-samples/tree/master/sagemaker-inference) or by running the following commands:

-{{< command >}}
-$ mkdir localstack-samples && cd localstack-samples
…
 Start your LocalStack container using your preferred method.
 Run the sample application by executing the following command:

-{{< command >}}
-$ python3 main.py
-{{< /command >}}
+```bash
+python3 main.py
+```

 You should see the following output:

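The walkthrough in the hunk above creates a SageMaker Endpoint and invokes it via Boto3. The following hedged sketch shows what those two steps can look like against LocalStack, assuming the model registered in the earlier example; the resource names, instance type, and JSON payload are illustrative, and the actual payload format depends on the model served by `main.py`.

```python
import json
import boto3

# Clients target LocalStack's edge endpoint (assumed default: http://localhost:4566).
endpoint_url = "http://localhost:4566"
session = boto3.Session(
    aws_access_key_id="test", aws_secret_access_key="test", region_name="us-east-1"
)
sagemaker = session.client("sagemaker", endpoint_url=endpoint_url)
runtime = session.client("sagemaker-runtime", endpoint_url=endpoint_url)

# Endpoint configuration referencing the model created earlier (assumed name).
sagemaker.create_endpoint_config(
    EndpointConfigName="my-endpoint-config",
    ProductionVariants=[{
        "VariantName": "AllTraffic",
        "ModelName": "my-custom-model",
        "InitialInstanceCount": 1,
        "InstanceType": "ml.m5.large",
    }],
)

# Create the endpoint from the configuration.
sagemaker.create_endpoint(
    EndpointName="my-endpoint",
    EndpointConfigName="my-endpoint-config",
)

# Invoke the endpoint once the serving container is up; the payload is a placeholder.
response = runtime.invoke_endpoint(
    EndpointName="my-endpoint",
    ContentType="application/json",
    Body=json.dumps({"inputs": [1, 2, 3]}),
)
print(response["Body"].read().decode())
```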
@@ -92,19 +91,19 @@ You can also invoke a serverless endpoint, by navigating to `main.py` and uncomm

 ## Resource Browser

-The LocalStack Web Application provides a [Resource Browser]({{< ref "resource-browser" >}}) for managing Lambda resources.
+The LocalStack Web Application provides a [Resource Browser](/aws/capabilities/web-app/resource-browser) for managing SageMaker resources.
 You can access the Resource Browser by opening the LocalStack Web Application in your browser, navigating to the **Resources** section, and then clicking on **SageMaker** under the **Compute** section.

 The Resource Browser displays Models, Endpoint Configurations, and Endpoints.
 You can click on individual resources to view their details.