# Changelog
## [1.0.5] - 11/29/2023
### Features
* (**applicationautoscaling**) Amazon SageMaker customers can now use Application Auto Scaling to automatically scale the number of Inference Component copies across an endpoint to meet the varying demand of their workloads.
* (**cleanrooms**) AWS Clean Rooms now provides differential privacy to protect against user-identification attempts, as well as machine learning modeling that lets two parties identify similar users in their data.
* (**cleanroomsml**) Public Preview SDK release of the AWS Clean Rooms ML APIs.
* (**opensearch**) Launching Amazon OpenSearch Service support for a new zero-ETL integration with Amazon S3. Customers can now manage their direct query data sources to Amazon S3 programmatically.
* (**opensearchserverless**) Amazon OpenSearch Serverless collections support an additional attribute called standby-replicas. This allows you to specify whether a collection should have redundancy enabled.
* (**sagemaker**) This release adds the following support: 1/ improved SDK tooling for model deployment, 2/ new Inference Component-based features to lower inference costs and latency, 3/ SageMaker HyperPod management, and 4/ additional parameters for FM fine-tuning in Autopilot.
* (**sagemakerruntime**) This release adds InferenceComponentName to the InvokeEndpoint and InvokeEndpointWithResponseStream APIs to get inferences from the deployed Inference Components (see the Kotlin sketch after this list).
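
A minimal sketch of the **sagemakerruntime** addition with the AWS SDK for Kotlin, assuming an endpoint with a deployed inference component already exists; the region, endpoint name, component name, and JSON payload are illustrative placeholders, not values from this release:

```kotlin
import aws.sdk.kotlin.services.sagemakerruntime.SageMakerRuntimeClient
import aws.sdk.kotlin.services.sagemakerruntime.model.InvokeEndpointRequest
import kotlinx.coroutines.runBlocking

fun main(): Unit = runBlocking {
    // Region and resource names below are placeholder assumptions.
    val client = SageMakerRuntimeClient { region = "us-east-1" }
    try {
        val response = client.invokeEndpoint(
            InvokeEndpointRequest {
                endpointName = "my-endpoint"                       // existing SageMaker endpoint (placeholder)
                inferenceComponentName = "my-inference-component"  // new parameter in this release
                contentType = "application/json"
                body = """{"inputs": "hello"}""".encodeToByteArray()
            }
        )
        // The inference result is returned as bytes; decode it for display.
        println(response.body?.decodeToString())
    } finally {
        client.close()
    }
}
```

The same InferenceComponentName field is also available on the InvokeEndpointWithResponseStream request for streaming responses, per the release note above.
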
### Documentation
* (**sts**) Documentation updates for AWS Security Token Service.