articles/ai-services/content-moderator/includes/tool-deprecation.md (+1 -1)

@@ -12,7 +12,7 @@ ms.author: pafarley
 > [!IMPORTANT]
-> Azure Content Moderator is deprecated as of February 2024 and will be retired by February 2027. It is replaced by [Azure AI Content Safety](/azure/ai-services/content-safety/overview), which offers advanced AI features and enhanced performance.
+> Azure Content Moderator is deprecated as of February 2024 and will be retired on March 15, 2027. It is replaced by [Azure AI Content Safety](/azure/ai-services/content-safety/overview), which offers advanced AI features and enhanced performance.
 >
 > Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers. Here's an overview of its features and capabilities:
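As a point of reference for migration, Azure AI Content Safety exposes a REST text-analysis endpoint. The call below is a minimal sketch, not text from the changed file; the `2023-10-01` api-version and the exact request shape are assumptions to confirm against the Content Safety documentation.

```bash
# Sketch: screen a text snippet for harm categories with Azure AI Content Safety.
curl -X POST "https://<CONTENT_SAFETY_RESOURCE>.cognitiveservices.azure.com/contentsafety/text:analyze?api-version=2023-10-01" \
  -H "Ocp-Apim-Subscription-Key: <KEY>" \
  -H "Content-Type: application/json" \
  -d '{ "text": "Sample user-generated text to screen." }'
```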
articles/ai-services/openai/how-to/fine-tuning-deploy.md (+7 -61)

@@ -366,7 +366,9 @@ Azure OpenAI fine-tuning supports the following deployment types.
 |GPT-35-Turbo-1106-finetune|East US2, North Central US, Sweden Central, Switzerland West|
 |GPT-35-Turbo-0125-finetune|East US2, North Central US, Sweden Central, Switzerland West|
 
-### Global Standard (preview)
+### Global Standard
+
+[Global standard](./deployment-types.md#global-standard) fine-tuned deployments offer [cost savings](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/), but custom model weights may temporarily be stored outside the geography of your Azure OpenAI resource.
 
 | Models | Region |
 |--|--|
@@ -375,72 +377,16 @@
 |GPT-4o-finetune|East US2, North Central US, and Sweden Central|
 |GPT-4o-mini-finetune|East US2, North Central US, and Sweden Central|
 
-[Global standard](./deployment-types.md#global-standard) fine-tuned deployments offer [cost savings](https://azure.microsoft.com/pricing/details/cognitive-services/openai-service/), but custom model weights may temporarily be stored outside the geography of your Azure OpenAI resource.
-
 :::image type="content" source="../media/fine-tuning/global-standard.png" alt-text="Screenshot of the global standard deployment user experience with a fine-tuned model." lightbox="../media/fine-tuning/global-standard.png":::
 
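As an aside, a fine-tuned model can be published to one of these Global Standard regions from the Azure CLI. The command below is a sketch; the `GlobalStandard` sku name, the fine-tuned model name format, and the capacity value are assumptions rather than values from the changed file.

```bash
# Sketch: deploy a fine-tuned model with the Global Standard sku (placeholder values throughout).
az cognitiveservices account deployment create \
    --resource-group <RESOURCE_GROUP> \
    --name <AOAI_RESOURCE_NAME> \
    --deployment-name my-gpt-4o-mini-ft-global \
    --model-name "gpt-4o-mini-2024-07-18.ft-<JOB_ID>" \
    --model-version "1" \
    --model-format OpenAI \
    --sku-name GlobalStandard \
    --sku-capacity 1
```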
-### Provisioned Managed (preview)
+### Provisioned Managed
 
 | Models | Region |
 |--|--|
-|GPT-4o-finetune|North Central US, Switzerland West|
-|GPT-4o-mini-finetune|North Central US, Switzerland West|
+|GPT-4o-finetune|North Central US, Sweden Central|
+|GPT-4o-mini-finetune|North Central US, Sweden Central|
-
-`gpt-4o-mini-2024-07-18`
-
-`gpt-4o-2024-08-06`
-
-[Provisioned managed](./deployment-types.md#provisioned) fine-tuned deployments offer [predictable performance](../concepts/provisioned-throughput.md) for fine-tuned deployments. As part of public preview, provisioned managed deployments may be created regionally via the data-plane [REST API](../reference.md#data-plane-inference) version `2024-10-01` or newer. See below for examples.
+[Provisioned managed](./deployment-types.md#provisioned) fine-tuned deployments offer [predictable performance](../concepts/provisioned-throughput.md) for latency-sensitive agents and applications. They use the same regional provisioned throughput (PTU) capacity as base models, so if you already have regional PTU quota you can deploy your fine-tuned model in supported regions.
-
-#### Creating a Provisioned Managed deployment
-
-To create a new deployment, make an HTTP PUT call via the [Deployments - Create or Update REST API](/rest/api/aiservices/accountmanagement/deployments/create-or-update?view=rest-aiservices-accountmanagement-2024-10-01&tabs=HTTP&preserve-view=true). The approach is similar to performing [cross region deployment](#cross-region-deployment) with the following exceptions:
-
-- You must provide a `sku` name of `ProvisionedManaged`.
-- The capacity must be declared in PTUs.
-- The `api-version` must be `2024-10-01` or newer.
-- The HTTP method should be `PUT`.
-
-For example, to deploy a gpt-4o-mini model:
-
-```bash
-curl -X PUT "https://management.azure.com/subscriptions/<SUBSCRIPTION>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.CognitiveServices/accounts/<RESOURCE_NAME>/deployments/<MODEL_DEPLOYMENT_NAME>?api-version=2024-10-01" \
-```
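The removed example above is cut off in this view. A rough sketch of what a complete call along these lines could look like, built on the bullet points above (the request body shape, headers, model name, and the 25 PTU starting capacity are illustrative assumptions, not text from the changed file):

```bash
# Sketch only: create a fine-tuned deployment with the ProvisionedManaged sku and capacity in PTUs.
# <ARM_ACCESS_TOKEN> can come from: az account get-access-token --query accessToken -o tsv
curl -X PUT "https://management.azure.com/subscriptions/<SUBSCRIPTION>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.CognitiveServices/accounts/<RESOURCE_NAME>/deployments/<MODEL_DEPLOYMENT_NAME>?api-version=2024-10-01" \
  -H "Authorization: Bearer <ARM_ACCESS_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
        "sku": { "name": "ProvisionedManaged", "capacity": 25 },
        "properties": {
          "model": {
            "format": "OpenAI",
            "name": "gpt-4o-mini-2024-07-18.ft-<JOB_ID>",
            "version": "1"
          }
        }
      }'
```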
-#### Scaling a fine-tuned model on Provisioned Managed
-
-To scale a fine-tuned provisioned managed deployment to increase or decrease PTU capacity, perform the same `PUT` REST API call as you did when [creating the deployment](#creating-a-provisioned-managed-deployment) and provide an updated `capacity` value for the `sku`. Keep in mind, provisioned deployments must scale in [minimum increments](../how-to/provisioned-throughput-onboarding.md#how-much-throughput-per-ptu-you-get-for-each-model).
-
-For example, to scale the model deployed in the previous section from 25 to 40 PTU, make another `PUT` call and increase the capacity:
-
-```bash
-curl -X PUT "https://management.azure.com/subscriptions/<SUBSCRIPTION>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.CognitiveServices/accounts/<RESOURCE_NAME>/deployments/<MODEL_DEPLOYMENT_NAME>?api-version=2024-10-01" \
-```
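That scaling call is also truncated above. A hedged sketch of the full request, reusing the assumed body shape from the creation sketch with `capacity` raised to 40:

```bash
# Sketch only: same PUT call, with the sku capacity raised from 25 to 40 PTU.
curl -X PUT "https://management.azure.com/subscriptions/<SUBSCRIPTION>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.CognitiveServices/accounts/<RESOURCE_NAME>/deployments/<MODEL_DEPLOYMENT_NAME>?api-version=2024-10-01" \
  -H "Authorization: Bearer <ARM_ACCESS_TOKEN>" \
  -H "Content-Type: application/json" \
  -d '{
        "sku": { "name": "ProvisionedManaged", "capacity": 40 },
        "properties": {
          "model": {
            "format": "OpenAI",
            "name": "gpt-4o-mini-2024-07-18.ft-<JOB_ID>",
            "version": "1"
          }
        }
      }'
```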
articles/ai-services/speech-service/custom-neural-voice-lite.md (+4 -4)

@@ -6,16 +6,16 @@ author: eric-urban
 manager: nitinme
 ms.service: azure-ai-speech
 ms.topic: how-to
-ms.date: 3/10/2025
+ms.date: 5/12/2025
 ms.author: eur
 ---
 
-# Custom neural voice lite (preview)
+# Custom neural voice lite
 
 Azure AI Speech provides two custom neural voice (CNV) project types: CNV lite and CNV professional.
 
 - Custom neural voice (CNV) professional allows you to upload your training data collected through professional recording studios and create a higher-quality voice that is nearly indistinguishable from its human samples. CNV professional access is limited based on eligibility and usage criteria. Request access on the [intake form](https://aka.ms/customneural).
-- Custom neural voice (CNV) lite is a project type in public preview. You can demo and evaluate custom neural voice before investing in professional recordings to create a higher-quality voice. No application is required for demo and evaluation purposes. However, Microsoft restricts and selects the recording and testing samples for use with CNV lite. You must apply for full access to CNV professional in order to deploy and use the CNV lite model for business purpose. In that case, request access on the [intake form](https://aka.ms/customneural).
+- Custom neural voice (CNV) lite is a project type where you can demo and evaluate custom neural voice before investing in professional recordings to create a higher-quality voice. No application is required for demo and evaluation purposes. However, Microsoft restricts and selects the recording and testing samples for use with CNV lite. You must apply for full access to CNV professional in order to deploy and use the CNV lite model for business purposes. In that case, request access on the [intake form](https://aka.ms/customneural).
 
 With a CNV lite project, you record your voice online by reading 20-50 pre-defined scripts provided by Microsoft. After you've recorded at least 20 samples, you can start to train a model. Once the model is trained successfully, you can review the model and check out 20 output samples produced with another set of pre-defined scripts.
@@ -25,7 +25,7 @@ See the [supported languages](language-support.md?tabs=tts) for custom neural vo
 
 The following table summarizes key differences between the CNV lite and CNV professional project types.
 
-|**Items**|**Lite (Preview)**|**Pro**|
+|**Items**|**Lite**|**Pro**|
 |---------------|---------------|---------------|
 |Target scenarios |Demonstration or evaluation |Professional scenarios like brand and character voices for chat bots, or audio content reading.|
 |Training data |Record online using Speech Studio |Bring your own data. Recording in a professional studio is recommended. |
articles/machine-learning/azure-machine-learning-release-notes-cli-v2.md (+15 -13)

@@ -24,43 +24,45 @@ __RSS feed__: Get notified when this page is updated by copying and pasting the
 
 ## 2025-05-09
 
-## Azure Machine Learning CLI (v2) v 2.37.0
+### Azure Machine Learning CLI (v2) v 2.37.0
 - `az ml workspace create`
 - Hub and Project workspace marked as GA.
 
-## 2025-04-23
+## 2025-04-24
 
-## Azure Machine Learning CLI (v2) v 2.36.5
+### Azure Machine Learning CLI (v2) v 2.36.5
 - Pin major version of external dependencies in SDK.
 
-## Azure Machine Learning CLI (v2) v 2.36.4
+## 2025-04-18
+
+### Azure Machine Learning CLI (v2) v 2.36.4
 - Updated marshmallow dependency to restrict versions to >=3.5,<4.0.0 to ensure compatibility.
 
-## 2025-04-17
+## 2025-04-16
 
-## Azure Machine Learning CLI (v2) v 2.36.3
+### Azure Machine Learning CLI (v2) v 2.36.3
 - Removing reference of deprecated package distutils.
 
-## 2025-04-09
+## 2025-04-10
 
-## Azure Machine Learning CLI (v2) v 2.36.2
+### Azure Machine Learning CLI (v2) v 2.36.2
 - `az ml capability-host create`
 - Made AI Search connections property optional.
 
-## 2025-04-01
+## 2025-04-02
 
-## Azure Machine Learning CLI (v2) v 2.36.1
+### Azure Machine Learning CLI (v2) v 2.36.1
 - Handle missing duration value in deployment poller result.
 
 ## 2025-03-14
 
-## Azure Machine Learning CLI (v2) v 2.36.0
+### Azure Machine Learning CLI (v2) v 2.36.0
 - `az ml compute update`
 - Fix updating compute when ssh is enabled.
 
 ## 2025-01-08
 
-## Azure Machine Learning CLI (v2) v 2.34.0
+### Azure Machine Learning CLI (v2) v 2.34.0
 - `az ml workspace update --network-acls`
 - Added `--network-acls` property to allow user to specify IPs or IP ranges in CIDR notation for workspace access.
 - `az ml capability-host`
@@ -70,7 +72,7 @@ __RSS feed__: Get notified when this page is updated by copying and pasting the
 
 ## 2024-12-17
 
-## Azure Machine Learning CLI (v2) v 2.33.0
+### Azure Machine Learning CLI (v2) v 2.33.0
 - `az ml workspace create --provision-network-now`
 - Added `--provision-network-now` property to trigger the provisioning of the managed network when creating a workspace with the managed network enabled, or else it does nothing.
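For orientation, the two workspace flags called out in these notes might be used along these lines. This is a sketch; the value syntax for `--network-acls`, whether `--provision-network-now` takes an explicit value, and the `--managed-network` mode shown are assumptions to check against `az ml workspace --help`.

```bash
# Sketch: restrict workspace access to specific IPs or CIDR ranges (per the --network-acls note above).
az ml workspace update --name <WORKSPACE_NAME> --resource-group <RESOURCE_GROUP> \
    --network-acls "203.0.113.24,198.51.100.0/24"

# Sketch: provision the managed network up front at workspace creation (per the --provision-network-now note above).
az ml workspace create --name <WORKSPACE_NAME> --resource-group <RESOURCE_GROUP> \
    --managed-network allow_internet_outbound --provision-network-now
```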