Commit 167461c

Merge branch 'MicrosoftDocs:main' into main
2 parents a047a01 + a3da78f commit 167461c

26 files changed: +450 −493 lines

articles/ai-services/openai/concepts/models.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -170,7 +170,7 @@ GPT-3.5 Turbo version 0301 is the first version of the model released. Version
 See [model versions](../concepts/model-versions.md) to learn about how Azure OpenAI Service handles model version upgrades, and [working with models](../how-to/working-with-models.md) to learn how to view and configure the model version settings of your GPT-3.5 Turbo deployments.
 
 > [!NOTE]
-> Version `0613` of `gpt-35-turbo` and `gpt-35-turbo-16k` will be retired no earlier than June 13, 2024. Version `0301` of `gpt-35-turbo` will be retired no earlier than July 5, 2024. See [model updates](../how-to/working-with-models.md#model-updates) for model upgrade behavior.
+> Version `0613` of `gpt-35-turbo` and `gpt-35-turbo-16k` will be retired no earlier than July 13, 2024. Version `0301` of `gpt-35-turbo` will be retired no earlier than June 13, 2024. See [model updates](../how-to/working-with-models.md#model-updates) for model upgrade behavior.
 
 | Model ID | Max Request (tokens) | Training Data (up to) |
 | --------- |:------:|:----:|
```

articles/azure-arc/servers/billing-extended-security-updates.md

Lines changed: 8 additions & 6 deletions

```diff
@@ -1,28 +1,30 @@
 ---
 title: Billing service for Extended Security Updates for Windows Server 2012 through Azure Arc
 description: Learn about billing services for Extended Security Updates for Windows Server 2012 enabled by Azure Arc.
-ms.date: 12/19/2023
+ms.date: 04/10/2023
 ms.topic: conceptual
 ---
 
 # Billing service for Extended Security Updates for Windows Server 2012 enabled by Azure Arc
 
-Billing for Extended Security Updates (ESUs) is impacted by three factors:
+Three factors impact billing for Extended Security Updates (ESUs):
 
-- The number of cores you've provisioned
+- The number of cores provisioned
 - The edition of the license (Standard vs. Datacenter)
 - The application of any eligible discounts
 
-Billing is monthly. Decrementing, deactivating, or deleting a license will result in charges for up to five more calendar days from the time of decrement, deactivation, or deletion. Reduction in billing isn't immediate. This is an Azure-billed service and can be used to decrement a customer's Microsoft Azure Consumption Commitment (MACC) and be eligible for Azure Consumption Discount (ACD).
+Billing is monthly. Decrementing, deactivating, or deleting a license results in charges for up to five more calendar days from the time of decrement, deactivation, or deletion. Reduction in billing isn't immediate. This is an Azure-billed service and can be used to decrement a customer's Microsoft Azure Consumption Commitment (MACC) and be eligible for Azure Consumption Discount (ACD).
 
 > [!NOTE]
 > Licenses or additional cores provisioned after End of Support are subject to a one-time back-billing charge during the month in which the license was provisioned. This isn't reflective of the recurring monthly bill.
 
 ## Back-billing for ESUs enabled by Azure Arc
 
-Licenses that are provisioned after the End of Support (EOS) date of October 10, 2023 are charged a back bill for the time elapsed since the EOS date. For example, an ESU license provisioned in December 2023 will be back-billed for October and November upon provisioning. Enrolling late in WS2012 ESUs makes you eligible for all the critical security patches up to that point. The back-billing charge reflects the value of these critical security patches.
+Licenses that are provisioned after the End of Support (EOS) date of October 10, 2023 are charged a back bill for the time elapsed since the EOS date. For example, an ESU license provisioned in December 2023 is back-billed for October and November upon provisioning. Enrolling late in WS2012 ESUs makes you eligible for all the critical security patches up to that point. The back-billing charge reflects the value of these critical security patches.
 
-If you deactivate and then later reactivate a license, you'll be billed for the window during which the license was deactivated. It isn't possible to evade charges by deactivating a license before a critical security patch and reactivating it shortly before.
+If you deactivate and then later reactivate a license, you're billed for the window during which the license was deactivated. It isn't possible to evade charges by deactivating a license before a critical security patch and reactivating it shortly before.
+
+If the region of an ESU license is changed, this will be subject to back-billing charges.
 
 > [!NOTE]
 > The back-billing cost appears as a separate line item in invoicing. If you acquired a discount for your core WS2012 ESUs enabled by Azure Arc, the same discount may or may not apply to back-billing. You should verify that the same discounting, if applicable, has been applied to back-billing charges as well.
```
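The back-billing window described in this article's example is simple month arithmetic. A minimal sketch, assuming month-granularity billing and using a hypothetical helper (not part of any Azure billing API):

```python
from datetime import date

EOS_DATE = date(2023, 10, 10)  # Windows Server 2012 End of Support

def back_billed_months(provisioned: date) -> list[str]:
    """Return the months back-billed for a license provisioned after EOS.

    Per the article, back-billing covers the time elapsed since EOS, up to
    (but not including) the month in which the license is provisioned.
    """
    months = []
    year, month = EOS_DATE.year, EOS_DATE.month
    while (year, month) < (provisioned.year, provisioned.month):
        months.append(f"{year}-{month:02d}")
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return months

# A license provisioned in December 2023 is back-billed for October and November.
print(back_billed_months(date(2023, 12, 5)))  # ['2023-10', '2023-11']
```

This matches the article's example: provisioning in December 2023 yields back-billing for October and November 2023.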

articles/azure-resource-manager/templates/deployment-script-template.md

Lines changed: 7 additions & 6 deletions

```diff
@@ -3,7 +3,7 @@ title: Use deployment scripts in templates | Microsoft Docs
 description: Use deployment scripts in Azure Resource Manager templates.
 ms.custom: devx-track-arm-template
 ms.topic: conceptual
-ms.date: 12/12/2023
+ms.date: 04/09/2024
 ---
 
 # Use deployment scripts in ARM templates
@@ -660,7 +660,7 @@ The identity that your deployment script uses needs to be authorized to work wit
 With Microsoft.Resources/deploymentScripts version 2023-08-01, you can run deployment scripts in private networks with some additional configurations.
 
 - Create a user-assigned managed identity, and specify it in the `identity` property. To assign the identity, see [Identity](#identity).
-- Create a storage account, and specify the deployment script to use the existing storage account. To specify an existing storage account, see [Use existing storage account](#use-existing-storage-account). Some additional configuration is required for the storage account.
+- Create a storage account with [`allowSharedKeyAccess`](/azure/templates/microsoft.storage/storageaccounts) set to `true`, and specify the deployment script to use the existing storage account. To specify an existing storage account, see [Use existing storage account](#use-existing-storage-account). Some additional configuration is required for the storage account.
 
 1. Open the storage account in the [Azure portal](https://portal.azure.com).
 1. From the left menu, select **Access Control (IAM)**, and then select the **Role assignments** tab.
@@ -708,7 +708,7 @@ The following ARM template shows how to configure the environment for running a
 "resources": [
   {
     "type": "Microsoft.Network/virtualNetworks",
-    "apiVersion": "2023-05-01",
+    "apiVersion": "2023-09-01",
     "name": "[parameters('vnetName')]",
     "location": "[parameters('location')]",
     "properties": {
@@ -761,15 +761,16 @@ The following ARM template shows how to configure the environment for running a
       }
     ],
     "defaultAction": "Deny"
-  }
+  },
+  "allowSharedKeyAccess": true
 },
 "dependsOn": [
   "[resourceId('Microsoft.Network/virtualNetworks', parameters('vnetName'))]"
 ]
 },
 {
 "type": "Microsoft.ManagedIdentity/userAssignedIdentities",
-"apiVersion": "2023-01-31",
+"apiVersion": "2023-07-31-preview",
 "name": "[parameters('userAssignedIdentityName')]",
 "location": "[parameters('location')]"
 },
@@ -779,7 +780,7 @@ The following ARM template shows how to configure the environment for running a
 "scope": "[format('Microsoft.Storage/storageAccounts/{0}', parameters('storageAccountName'))]",
 "name": "[guid(tenantResourceId('Microsoft.Authorization/roleDefinitions', '69566ab7-960f-475b-8e7c-b3118f30c6bd'), resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', parameters('userAssignedIdentityName')), resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccountName')))]",
 "properties": {
-"principalId": "[reference(resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', parameters('userAssignedIdentityName')), '2023-01-31').principalId]",
+"principalId": "[reference(resourceId('Microsoft.ManagedIdentity/userAssignedIdentities', parameters('userAssignedIdentityName')), '2023-07-31-preview').principalId]",
 "roleDefinitionId": "[tenantResourceId('Microsoft.Authorization/roleDefinitions', '69566ab7-960f-475b-8e7c-b3118f30c6bd')]",
 "principalType": "ServicePrincipal"
 },
```
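The key change in this file is enabling `allowSharedKeyAccess` on the storage account used by deployment scripts in private networks. A minimal sketch of that resource fragment, expressed as a Python dict for illustration (property names follow the Microsoft.Storage/storageAccounts schema; the apiVersion and values are assumptions, not taken from the commit):

```python
import json

# Illustrative storage-account fragment with shared key access enabled,
# as the updated doc requires for deployment scripts in private networks.
storage_account = {
    "type": "Microsoft.Storage/storageAccounts",
    "apiVersion": "2023-01-01",  # assumed version for illustration
    "name": "[parameters('storageAccountName')]",
    "properties": {
        "networkAcls": {
            "virtualNetworkRules": [
                # subnet rule elided; see the template in the article
            ],
            "defaultAction": "Deny",
        },
        "allowSharedKeyAccess": True,
    },
}

# A deployment pipeline could sanity-check the flag before submitting:
assert storage_account["properties"]["allowSharedKeyAccess"] is True
print(json.dumps(storage_account["properties"]["allowSharedKeyAccess"]))  # true
```

The `defaultAction: Deny` plus a virtual network rule restricts traffic to the private network, while `allowSharedKeyAccess: true` lets the deployment script authenticate to the account with its access key.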

articles/cosmos-db/vector-database.md

Lines changed: 2 additions & 2 deletions

```diff
@@ -149,8 +149,8 @@ The natively integrated vector database in our NoSQL API will become available i
 > [!div class="nextstepaction"]
 > [Use the Azure Cosmos DB lifetime free tier](free-tier.md)
 
-## More Vector Databases
+## More Vector Stores
 
 - [Azure PostgreSQL Server pgvector Extension](../postgresql/flexible-server/how-to-use-pgvector.md)
-- [Azure AI Search](../search/search-what-is-azure-search.md)
+- [Azure AI Search](../search/vector-store.md)
 - [Open Source Vector Databases](mongodb/vcore/vector-search-ai.md)
```

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
 
 > [!IMPORTANT]
 >
-> As of August 1, customers with an existing subscription to Defender for DNS can continue to use the service, but new subscribers will receive alerts about suspicious DNS activity as part of Defender for Servers P2.
+> As of August 1 2023, customers with an existing subscription to Defender for DNS can continue to use the service, but new subscribers will receive alerts about suspicious DNS activity as part of Defender for Servers P2.
```

articles/hdinsight-aks/flink/azure-databricks.md

Lines changed: 36 additions & 36 deletions

````diff
@@ -1,9 +1,9 @@
 ---
 title: Incorporate Apache Flink® DataStream into Azure Databricks Delta Lake Table
-description: Learn about incorporate Apache Flink® DataStream into Azure Databricks Delta Lake Table
+description: Learn about incorporate Apache Flink® DataStream into Azure Databricks Delta Lake Table.
 ms.service: hdinsight-aks
 ms.topic: how-to
-ms.date: 10/27/2023
+ms.date: 04/10/2024
 ---
 
 # Incorporate Apache Flink® DataStream into Azure Databricks Delta Lake Tables
@@ -12,9 +12,9 @@ This example shows how to sink stream data in Azure ADLS Gen2 from Apache Flink
 
 ## Prerequisites
 
-- [Apache Flink 1.16.0 on HDInsight on AKS](../flink/flink-create-cluster-portal.md)
+- [Apache Flink 1.17.0 on HDInsight on AKS](../flink/flink-create-cluster-portal.md)
 - [Apache Kafka 3.2 on HDInsight](../../hdinsight/kafka/apache-kafka-get-started.md)
-- [Azure Databricks](/azure/databricks/getting-started/) in the same VNET as HDInsight on AKS
+- [Azure Databricks](/azure/databricks/getting-started/) in the same virtual network as HDInsight on AKS
 - [ADLS Gen2](/azure/databricks/getting-started/connect-to-azure-storage/) and Service Principal
 
 ## Azure Databricks Auto Loader
@@ -25,9 +25,9 @@ Here are the steps how you can use data from Flink in Azure Databricks delta liv
 
 ### Create Apache Kafka® table on Apache Flink® SQL
 
-In this step, you can create Kafka table and ADLS Gen2 on Flink SQL. For the purpose of this document, we are using a airplanes_state_real_time table, you can use any topic of your choice.
+In this step, you can create Kafka table and ADLS Gen2 on Flink SQL. In this document, we're using a `airplanes_state_real_time table`. You can use any article of your choice.
 
-You are required to update the broker IPs with your Kafka cluster in the code snippet.
+You need to update the broker IPs with your Kafka cluster in the code snippet.
 
 ```SQL
 CREATE TABLE kafka_airplanes_state_real_time (
@@ -68,34 +68,34 @@ Update the container-name and storage-account-name in the code snippet with your
 
 ```SQL
 CREATE TABLE adlsgen2_airplanes_state_real_time (
-`date` STRING,
-`geo_altitude` FLOAT,
-`icao24` STRING,
-`latitude` FLOAT,
-`true_track` FLOAT,
-`velocity` FLOAT,
-`spi` BOOLEAN,
-`origin_country` STRING,
-`minute` STRING,
-`squawk` STRING,
-`sensors` STRING,
-`hour` STRING,
-`baro_altitude` FLOAT,
-`time_position` BIGINT,
-`last_contact` BIGINT,
-`callsign` STRING,
-`event_time` STRING,
-`on_ground` BOOLEAN,
-`category` STRING,
-`vertical_rate` FLOAT,
-`position_source` INT,
-`current_time` STRING,
-`longitude` FLOAT
-) WITH (
-'connector' = 'filesystem',
-'path' = 'abfs://<container-name>@<storage-account-name>/flink/airplanes_state_real_time/',
-'format' = 'json'
-);
+`date` STRING,
+`geo_altitude` FLOAT,
+`icao24` STRING,
+`latitude` FLOAT,
+`true_track` FLOAT,
+`velocity` FLOAT,
+`spi` BOOLEAN,
+`origin_country` STRING,
+`minute` STRING,
+`squawk` STRING,
+`sensors` STRING,
+`hour` STRING,
+`baro_altitude` FLOAT,
+`time_position` BIGINT,
+`last_contact` BIGINT,
+`callsign` STRING,
+`event_time` STRING,
+`on_ground` BOOLEAN,
+`category` STRING,
+`vertical_rate` FLOAT,
+`position_source` INT,
+`current_time` STRING,
+`longitude` FLOAT
+) WITH (
+'connector' = 'filesystem',
+'path' = 'abfs://<container-name>@<storage-account-name>.dfs.core.windows.net/data/airplanes_state_real_time/flink/airplanes_state_real_time/',
+'format' = 'json'
+);
```
 
 Further, you can insert Kafka table into ADLSgen2 table on Flink SQL.
@@ -114,9 +114,9 @@ Further, you can insert Kafka table into ADLSgen2 table on Flink SQL.
 
 ADLS Gen2 provides OAuth 2.0 with your Microsoft Entra application service principal for authentication from an Azure Databricks notebook and then mount into Azure Databricks DBFS.
 
-**Let's get service principle appid, tenant id and secret key.**
+**Let's get service principle appid, tenant ID, and secret key.**
 
-:::image type="content" source="media/azure-databricks/service-id.png" alt-text="Screenshot shows get service principle appid, tenant ID and secret key." lightbox="media/azure-databricks/service-id.png":::
+:::image type="content" source="media/azure-databricks/service-id.png" alt-text="Screenshot shows get service principle appid, tenant ID, and secret key." lightbox="media/azure-databricks/service-id.png":::
 
 **Grant service principle the Storage Blob Data Owner on Azure portal**
 
````
Binary media files changed (not rendered): +134 KB, −15.3 KB, −59.9 KB.
