
Commit 6c09135

Merge pull request #1298 from MicrosoftDocs/main
11/5/2024 PM Publish
2 parents 81377f7 + fe7c99a commit 6c09135


5 files changed: 34 additions and 12 deletions


articles/ai-services/language-service/personally-identifiable-information/concepts/conversations-entity-categories.md

Lines changed: 0 additions & 2 deletions
@@ -154,8 +154,6 @@ This category contains the following entities:
 Any numeric or alphanumeric identifier that could contain any PII information.
 Examples:
 * Case Number
-* Driver's license
-* Medicare Beneficiary Identifier (MBI)
 * Member Number
 * Ticket number
 * Bank account number

articles/ai-services/language-service/whats-new.md

Lines changed: 10 additions & 2 deletions
@@ -15,9 +15,17 @@ ms.author: jboback
 
 Azure AI Language is updated on an ongoing basis. To stay up-to-date with recent developments, this article provides you with information about new releases and features.
 
+## November 2024
+
+* [Native document support](native-document-support/use-native-documents.md) is now available in public preview `2024-11-15-preview` without gated preview limitations.
+
+## October 2024
+
+* Custom language service features enable you to deploy your project to multiple [resources within a single region](concepts/custom-features/multi-region-deployment.md) via the API, so that you can use your custom model wherever you need.
+
 ## September 2024
 
-* PII detection now has container support. See more details in the Azure Update post: [Announcing Text PII Redaction Container Release](https://techcommunity.microsoft.com/t5/ai-azure-ai-services-blog/announcing-text-pii-redaction-container-release/ba-p/4264655).
+* PII detection now has container support. See more details in the Azure Update post: [Announcing Text PII Redaction Container Release](https://techcommunity.microsoft.com/blog/azure-ai-services-blog/announcing-text-pii-redaction-container-release/4264655).
 * Custom sentiment analysis (preview) will be retired on January 10th, 2025. Please transition to other custom model training services, such as custom text classification in Azure AI Language, by that date. See more details in the Azure Update post: [Retirement: Announcing upcoming retirement of custom sentiment analysis (preview) in Azure AI Language (microsoft.com)](https://azure.microsoft.com/updates/v2/custom-sentiment-analysis-retirement).
 * Custom text analytics for health (preview) will be retired on January 10th, 2025. Please transition to other custom model training services, such as custom named entity recognition in Azure AI Language, by that date. See more details in the Azure Update post: [Retirement: Announcing upcoming retirement of custom text analytics for health (preview) in Azure AI Language (microsoft.com)](https://azure.microsoft.com/updates/v2/custom-text-analytics-for-health-retirement).
 
@@ -27,7 +35,7 @@ Azure AI Language is updated on an ongoing basis. To stay up-to-date with recent
 
 ## July 2024
 
-* [Conversational PII redaction](https://techcommunity.microsoft.com/t5/ai-azure-ai-services-blog/announcing-conversational-pii-detection-service-s-general/ba-p/4162881) service in English-language contexts is now Generally Available (GA).
+* [Conversational PII redaction](https://techcommunity.microsoft.com/blog/ai-azure-ai-services-blog/announcing-conversational-pii-detection-service-s-general/4162881) service in English-language contexts is now Generally Available (GA).
 * Conversation Summarization now supports 12 additional languages in preview as listed [here](summarization/language-support.md).
 * Summarization Meeting or Conversation Chapter titles features will now support reduced length to focus on the key topics.
 * Enable support for data augmentation for diacritics to generate variations of training data for diacritic variations used in some natural languages which is especially useful for Germanic and Slavic languages.
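The September 2024 container announcement above corresponds to an ordinary Language SDK call. The following is a minimal sketch, assuming placeholder environment variable names (`LANGUAGE_ENDPOINT`, `LANGUAGE_KEY`); the same `recognize_pii_entities` call targets a Language resource in Azure and, per the container release post, should also work against a self-hosted text PII redaction container that exposes the same endpoint.

```python
import os

from azure.ai.textanalytics import TextAnalyticsClient
from azure.core.credentials import AzureKeyCredential

# Placeholder environment variables; point the endpoint at a Language resource
# (https://<resource>.cognitiveservices.azure.com/) or at a locally hosted
# text PII redaction container.
client = TextAnalyticsClient(
    endpoint=os.environ["LANGUAGE_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["LANGUAGE_KEY"]),
)

documents = ["Call me at 555-010-1234 or email jane.doe@contoso.com."]
result = client.recognize_pii_entities(documents)

for doc in result:
    if not doc.is_error:
        print(doc.redacted_text)  # input text with detected PII masked
        for entity in doc.entities:
            print(entity.text, entity.category, entity.confidence_score)
```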

articles/ai-services/openai/includes/gpt-4-turbo.md

Lines changed: 3 additions & 0 deletions
@@ -27,6 +27,9 @@ This is the replacement for the following preview models:
 
 - Azure AI specific Vision enhancements integration with GPT-4 Turbo with Vision isn't supported for `gpt-4` **Version:** `turbo-2024-04-09`. This includes Optical Character Recognition (OCR), object grounding, video prompts, and improved handling of your data with images.
 
+> [!IMPORTANT]
+> Vision enhancements preview features including Optical Character Recognition (OCR), object grounding, video prompts will be retired and no longer available once `gpt-4` Version: `vision-preview` is upgraded to `turbo-2024-04-09`. If you are currently relying on any of these preview features, this automatic model upgrade will be a breaking change.
+
 ### GPT-4 Turbo provisioned managed availability
 
 - `gpt-4` **Version:** `turbo-2024-04-09` is available for both standard and provisioned deployments. Currently the provisioned version of this model **doesn't support image/vision inference requests**. Provisioned deployments of this model only accept text input. Standard model deployments accept both text and image/vision inference requests.
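For the deployment note above (standard deployments of `gpt-4` `turbo-2024-04-09` accept text and image input, provisioned deployments are text-only), here is a minimal sketch of a text-plus-image chat completion using the `openai` Python package; the endpoint, deployment name, API version, and image URL are placeholders.

```python
import os

from openai import AzureOpenAI

# Sketch of a text + image request against a *standard* deployment of
# gpt-4 turbo-2024-04-09 (provisioned deployments accept text only).
# Endpoint, deployment name, and image URL are placeholders.
client = AzureOpenAI(
    azure_endpoint="https://<resource>.openai.azure.com",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

response = client.chat.completions.create(
    model="<your-gpt-4-turbo-deployment>",  # deployment name, not the base model name
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this chart."},
                {"type": "image_url", "image_url": {"url": "https://example.com/chart.png"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)
```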

articles/ai-studio/how-to/concept-data-privacy.md

Lines changed: 2 additions & 2 deletions
@@ -14,6 +14,8 @@ author: s-polly
 ---
 # Data, privacy, and security for use of models through the model catalog in AI Studio
 
+[!INCLUDE [feature-preview](../includes/feature-preview.md)]
+
 This article describes how the data that you provide is processed, used, and stored when you deploy models from the model catalog. Also see the [Microsoft Products and Services Data Protection Addendum](https://aka.ms/DPA), which governs data processing by Azure services.
 
 > [!IMPORTANT]
@@ -43,8 +45,6 @@ When you deploy a model from the model catalog (base or fine-tuned) by using ser
 
 The model processes your input prompts and generates outputs based on its functionality, as described in the model details. Your use of the model (along with the provider's accountability for the model and its outputs) is subject to the license terms for the model. Microsoft provides and manages the hosting infrastructure and API endpoint. The models hosted in this *model as a service* (MaaS) scenario are subject to Azure data, privacy, and security commitments. [Learn more about Azure compliance offerings applicable to Azure AI Studio](https://servicetrust.microsoft.com/DocumentPage/7adf2d9e-d7b5-4e71-bad8-713e6a183cf3).
 
-[!INCLUDE [feature-preview](../includes/feature-preview.md)]
-
 Microsoft acts as the data processor for prompts and outputs sent to, and generated by, a model deployed for pay-as-you-go inferencing (MaaS). Microsoft doesn't share these prompts and outputs with the model provider. Also, Microsoft doesn't use these prompts and outputs to train or improve Microsoft models, the model provider's models, or any third party's models.
 
 Models are stateless, and they don't store any prompts or outputs. If content filtering (preview) is enabled, the Azure AI Content Safety service screens prompts and outputs for certain categories of harmful content in real time. [Learn more about how Azure AI Content Safety processes data](/legal/cognitive-services/content-safety/data-privacy).

articles/ai-studio/how-to/develop/llama-index.md

Lines changed: 19 additions & 6 deletions
@@ -83,7 +83,7 @@ llm = AzureAICompletionsModel(
 ```
 
 > [!TIP]
-> If your model is an OpenAI model deployed to Azure OpenAI service or AI services resource, configure the client as indicated at [Azure OpenAI models and Azure AI model inference service](#azure-openai-models-and-azure-ai-model-infernece-service).
+> If your model deployment is hosted in Azure OpenAI service or Azure AI Services resource, configure the client as indicated at [Azure OpenAI models and Azure AI model inference service](#azure-openai-models-and-azure-ai-model-inference-service).
 
 If your endpoint is serving more than one model, like with the [Azure AI model inference service](../../ai-services/model-inference.md) or [GitHub Models](https://github.com/marketplace/models), you have to indicate `model_name` parameter:
 
@@ -128,23 +128,36 @@ llm = AzureAICompletionsModel(
 )
 ```
 
-### Azure OpenAI models and Azure AI model infernece service
+### Azure OpenAI models and Azure AI model inference service
 
-If you are using Azure OpenAI models or [Azure AI model inference service](../../ai-services/model-inference.md), ensure you have at least version `0.2.4` of the LlamaIndex integration. Use `api_version` parameter in case you need to select a specific `api_version`. For the [Azure AI model inference service](../../ai-services/model-inference.md), you need to pass `model_name` parameter:
+If you are using Azure OpenAI service or [Azure AI model inference service](../../ai-services/model-inference.md), ensure you have at least version `0.2.4` of the LlamaIndex integration. Use `api_version` parameter in case you need to select a specific `api_version`.
+
+For the [Azure AI model inference service](../../ai-services/model-inference.md), you need to pass `model_name` parameter:
 
 ```python
 from llama_index.llms.azure_inference import AzureAICompletionsModel
 
 llm = AzureAICompletionsModel(
-    endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
+    endpoint="https://<resource>.services.ai.azure.com/models",
+    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
+    model_name="mistral-large-2407",
+)
+```
+
+For Azure OpenAI service:
+
+```python
+from llama_index.llms.azure_inference import AzureAICompletionsModel
+
+llm = AzureAICompletionsModel(
+    endpoint="https://<resource>.openai.azure.com/openai/deployments/<deployment-name>",
     credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
-    model_name="gpt-4o",
     api_version="2024-05-01-preview",
 )
 ```
 
 > [!TIP]
-> Using a wrong `api_version` or one not supported by the model results in a `ResourceNotFound` exception.
+> Check which is the API version that your deployment is using. Using a wrong `api_version` or one not supported by the model results in a `ResourceNotFound` exception.
 
 ### Inference parameters
 
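The LlamaIndex changes above only show client construction; a short usage sketch may help connect them to the rest of an application. The following assumes the placeholder endpoint, model name, and `AZURE_INFERENCE_CREDENTIAL` variable from the diff, and uses the standard `complete` call and `Settings` registration from `llama-index-core`.

```python
import os

from llama_index.core import Settings
from llama_index.llms.azure_inference import AzureAICompletionsModel

# Construct the client as in the diff above (placeholder endpoint and model name).
llm = AzureAICompletionsModel(
    endpoint="https://<resource>.services.ai.azure.com/models",
    credential=os.environ["AZURE_INFERENCE_CREDENTIAL"],
    model_name="mistral-large-2407",
)

# Direct completion call.
response = llm.complete("Summarize the benefits of retrieval-augmented generation.")
print(response.text)

# Register as the default LLM used by LlamaIndex indexes and query engines.
Settings.llm = llm
```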

0 commit comments
