articles/ai-services/speech-service/captioning-concepts.md (1 addition, 1 deletion)
@@ -27,7 +27,7 @@ The following are aspects to consider when using captioning:
* Let your audience know that captions are generated by an automated service.
* Center captions horizontally on the screen, in a large and prominent font.
* Consider whether to use partial results, when to start displaying captions, and how many words to show at a time.
- * Learn about captioning protocols such as [SMPTE-TT](https://ieeexplore.ieee.org/document/7291854).
+ * Learn about captioning protocols such as [SMPTE-TT](https://pub.smpte.org/doc/st2052-1/20101203-pub/st2052-1-2010.pdf).
* Consider output formats such as SRT (SubRip Text) and WebVTT (Web Video Text Tracks). These can be loaded onto most video players such as VLC, automatically adding the captions on to your video.
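For reference, a minimal caption-file sketch follows; the cue text and timings are made up. SRT numbers each cue and uses `HH:MM:SS,mmm` timestamps, while WebVTT starts with a `WEBVTT` header and uses `.` as the millisecond separator.

```text
1
00:00:01,000 --> 00:00:04,000
Captions are generated by an automated service.

2
00:00:04,100 --> 00:00:07,000
Center captions horizontally on the screen.
```

The equivalent WebVTT file begins with a `WEBVTT` line followed by the same cues written as `00:00:01.000 --> 00:00:04.000`.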
|Enable upon creation| Select for the alert rule to start running as soon as you're done creating it.|
- |Automatically resolve alerts (preview) |Select to make the alert stateful. When an alert is stateful, the alert is resolved when the condition is no longer met for a specific time range. The time range differs based on the frequency of the alert:<br>**1 minute**: The alert condition isn't met for 10 minutes.<br>**5-15 minutes**: The alert condition isn't met for three frequency periods.<br>**15 minutes - 11 hours**: The alert condition isn't met for two frequency periods.<br>**11 to 12 hours**: The alert condition isn't met for one frequency period. <br><br>Note that stateful log search alerts have these [limitations](https://learn.microsoft.com/azure/azure-monitor/service-limits#alerts).|
+ |Automatically resolve alerts (preview) |Select to make the alert stateful. When an alert is stateful, the alert is resolved when the condition is no longer met for a specific time range. The time range differs based on the frequency of the alert:<br>**1 minute**: The alert condition isn't met for 10 minutes.<br>**5-15 minutes**: The alert condition isn't met for three frequency periods.<br>**15 minutes - 11 hours**: The alert condition isn't met for two frequency periods.<br>**11 to 12 hours**: The alert condition isn't met for one frequency period. <br><br>Note that stateful log search alerts have these [limitations](/azure/azure-monitor/service-limits#alerts).|
|Mute actions |Select to set a period of time to wait before alert actions are triggered again. If you select this checkbox, the **Mute actions for** field appears to select the amount of time to wait after an alert is fired before triggering actions again.|
|Check workspace linked storage|Select if logs workspace linked storage for alerts is configured. If no linked storage is configured, the rule isn't created.|
articles/azure-monitor/app/api-custom-events-metrics.md (1 addition, 1 deletion)
@@ -679,7 +679,7 @@ The function is asynchronous for the [server telemetry channel](https://www.nuge
> [!NOTE]
> - The Java and JavaScript SDKs automatically flush on application shutdown.
- > - **Review Autoflush configuration**: [Enabling autoflush](https://learn.microsoft.com/dotnet/api/system.diagnostics.trace.autoflush) in your `web.config` file can lead to performance degradation in .NET applications instrumented with Application Insights. With autoflush enabled, every invocation of `System.Diagnostics.Trace.Trace*` methods results in individual telemetry items being sent as separate distinct web requests to the ingestion service. This can potentially cause network and storage exhaustion on your web servers. For enhanced performance, it’s recommended to disable autoflush and also, utilize the [ServerTelemetryChannel](https://learn.microsoft.com/azure/azure-monitor/app/telemetry-channels#built-in-telemetry-channels), designed for a more effective telemetry data transmission.
+ > - **Review Autoflush configuration**: [Enabling autoflush](/dotnet/api/system.diagnostics.trace.autoflush) in your `web.config` file can lead to performance degradation in .NET applications instrumented with Application Insights. With autoflush enabled, every invocation of `System.Diagnostics.Trace.Trace*` methods results in individual telemetry items being sent as separate distinct web requests to the ingestion service. This can potentially cause network and storage exhaustion on your web servers. For enhanced performance, it’s recommended to disable autoflush and also, utilize the [ServerTelemetryChannel](/azure/azure-monitor/app/telemetry-channels#built-in-telemetry-channels), designed for a more effective telemetry data transmission.
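For reference, autoflush for `System.Diagnostics` tracing is controlled in `web.config`; the following is a minimal sketch of the relevant section with autoflush turned off, not a complete configuration.

```xml
<configuration>
  <system.diagnostics>
    <!-- autoflush="false" buffers Trace.* output instead of flushing on every call,
         so telemetry can be batched by the telemetry channel rather than sent as
         individual requests to the ingestion service. -->
    <trace autoflush="false" />
  </system.diagnostics>
</configuration>
```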
articles/azure-vmware/deploy-disaster-recovery-using-jetstream.md (3 additions, 3 deletions)
@@ -140,13 +140,13 @@ For full details, refer to the article: [Disaster Recovery with Azure NetApp Fil
- [Attach Azure NetApp Files datastores to Azure VMware Solution hosts](attach-azure-netapp-files-to-azure-vmware-solution-hosts.md)
- [Disaster Recovery with Azure NetApp Files, JetStream DR, and Azure VMware Solution](https://www.jetstreamsoft.com/portal/jetstream-knowledge-base/disaster-recovery-with-azure-netapp-files-jetstream-dr-and-avs-azure-vmware-solution/)
- For more on-premises JetStream DR prerequisites, see the [JetStream Pre-Installation Guide](https://jetstreamsoft.com/portal/online-docs/jsdr-admin_4.2/Pre-Installation.html).
+ For more on-premises JetStream DR prerequisites, see the [JetStream Pre-Installation Guide](https://jetstreamsoft.com/portal/online-docs/jsdr-admin_4.2/preinstallation.html).
## Install JetStream DR on Azure VMware Solution
You can follow these steps for both supported scenarios.
- 1. In your on-premises data center, install JetStream DR following the [JetStream documentation](https://jetstreamsoft.com/portal/online-docs/jsdr-admin_4.2/Installation.html).
+ 1. In your on-premises data center, install JetStream DR following the [JetStream documentation](https://jetstreamsoft.com/portal/online-docs/jsdr-admin_4.2/installation.html).
1. In your Azure VMware Solution private cloud, install JetStream DR using a Run command. From the [Azure portal](https://portal.azure.com), select **Run command** > **Packages** > **JSDR.Configuration**.
@@ -260,7 +260,7 @@ Once JetStream DR MSA and JetStream VIB are installed on the Azure VMware Soluti
1. [Add an external storage site](https://www.jetstreamsoft.com/portal/jetstream-knowledge-base/add-a-storage-site/).
- 1. [Deploy a JetStream DRVA appliance](https://jetstreamsoft.com/portal/online-docs/jsdr-admin_4.2/DeployaDRVA.html).
+ 1. [Deploy a JetStream DRVA appliance](https://jetstreamsoft.com/portal/online-docs/jsdr-admin_4.2/deploydrva.html).
1. Create a JetStream replication log store volume using one of the datastores available to the Azure VMware Solution cluster.
articles/azure-vmware/license-sql-windows-in-avs.md (7 additions, 7 deletions)
@@ -75,36 +75,36 @@ Register your licenses in Azure VMware Solution.
You can register and manage your licenses with SQL Server.
##### License the host by using unlimited virtualization
- You can enable Azure Hybrid Benefit for SQL Server and achieve unlimited virtualization through an Azure VMware Solution placement policy. For information on how to create the VM-Host placement policy, see [Enable unlimited virtualization with Azure Hybrid Benefit for SQL Server in Azure VMware Solution](https://learn.microsoft.com/azure/azure-vmware/enable-sql-azure-hybrid-benefit).
+ You can enable Azure Hybrid Benefit for SQL Server and achieve unlimited virtualization through an Azure VMware Solution placement policy. For information on how to create the VM-Host placement policy, see [Enable unlimited virtualization with Azure Hybrid Benefit for SQL Server in Azure VMware Solution](/azure/azure-vmware/enable-sql-azure-hybrid-benefit).
##### License a virtual machine
You can register SQL Server licenses and apply them to VMs running SQL Server in Azure VMware Solution by registering through Azure Arc:
- 1. Azure VMware Solution must be Azure Arc-enabled. For more information, see [Deploy Azure Arc-enabled VMware vSphere for Azure VMware Solution](https://learn.microsoft.com/azure/azure-vmware/deploy-arc-for-azure-vmware-solution?tabs=windows). You can Azure Arc-enable the VMs and install extensions to that VM by following the steps provided in the section titled "Enable guest management and extension installation."
+ 1. Azure VMware Solution must be Azure Arc-enabled. For more information, see [Deploy Azure Arc-enabled VMware vSphere for Azure VMware Solution](/azure/azure-vmware/deploy-arc-for-azure-vmware-solution). You can Azure Arc-enable the VMs and install extensions to that VM by following the steps provided in the section titled "Enable guest management and extension installation."
1. When **Guest Management** is configured, the Azure Extension for SQL Server should be installed on that VM. The extension installation enables you to configure the license type for the SQL Server instance running in the VM.
1. Now you can configure the license type and other SQL Server configuration settings by using the Azure portal, PowerShell, or the Azure CLI for a specific Azure Arc-enabled server. To configure from the Azure portal with VMware vSphere in the Azure VMware Solution experience, follow these steps:
1. In the Azure VMware Solution portal, go to **vCenter Server Inventory** and **Virtual Machines** by clicking through one of the Azure Arc-enabled VMs. The **Machine-Azure Arc (AVS)** page appears.
1. On the left pane, under **Operations**, select **SQL Server Configuration**.
1. You can now apply and save your license type for the VM along with other server configurations.
- You can also configure these settings within the Azure Arc portal experience and by using PowerShell or the Azure CLI. To access the Azure Arc portal experience and code to update the configuration values, see [Configure SQL Server enabled by Azure Arc](https://learn.microsoft.com/sql/sql-server/azure-arc/manage-configuration?view=sql-server-ver16&tabs=azure).
+ You can also configure these settings within the Azure Arc portal experience and by using PowerShell or the Azure CLI. To access the Azure Arc portal experience and code to update the configuration values, see [Configure SQL Server enabled by Azure Arc](/sql/sql-server/azure-arc/manage-configuration).
- For available license types, see [License types](https://learn.microsoft.com/sql/sql-server/azure-arc/manage-license-billing?view=sql-server-ver16#license-types).
+ For available license types, see [License types](/sql/sql-server/azure-arc/manage-license-billing).
> [!NOTE]
> At this time, Azure VMware Solution doesn't have support for the new `SQLServerLicense` resource type.
##### Manage the environment
- After the Azure Extension for SQL Server is installed, you can query the SQL Server configuration settings and track your SQL Server license inventory for each VM. For sample queries, see [Query SQL Server configuration](https://learn.microsoft.com/sql/sql-server/azure-arc/manage-configuration?view=sql-server-ver16&tabs=azure#query-sql-server-configuration).
+ After the Azure Extension for SQL Server is installed, you can query the SQL Server configuration settings and track your SQL Server license inventory for each VM. For sample queries, see [Query SQL Server configuration](/sql/sql-server/azure-arc/manage-configuration#query-sql-server-configuration).
#### Windows Server
Currently, there's no way to register your Windows licenses in Azure VMware Solution.
### Other cost savings for SQL Server and Windows Server
For more cost savings with Azure VMware Solution, see:
- - [Extended Security Updates (ESUs) for Windows Server and SQL Server - Azure VMware Solution](https://learn.microsoft.com/azure/azure-vmware/extended-security-updates-windows-sql-server)
- - [Save costs with a reserved instance](https://learn.microsoft.com/azure/azure-vmware/reserved-instance)
+ - [Extended Security Updates (ESUs) for Windows Server and SQL Server - Azure VMware Solution](/azure/azure-vmware/extended-security-updates-windows-sql-server)
+ - [Save costs with a reserved instance](/azure/azure-vmware/reserved-instance)
articles/confidential-computing/partner-pages/habu.md (1 addition, 1 deletion)
@@ -19,7 +19,7 @@ Data clean rooms allow organizations to share data and collaborate on analytics
Collaboration partners can now participate in cross-cloud, cross-region data sharing - with protections against unauthorized access to data across partners, cloud providers, and even Habu. You can hear more from Habu’s Chief Product Officer, Matthew Karasick, on their [partnership with Azure here](https://build.microsoft.com/en-US/home?source=partnerdetail).
- You can also get started on their [Azure Marketplace solution](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/habuinc1663874067667.habu?tab=Overview), today.
+ You can also get started on their [Azure Marketplace solution](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/habuinc1663874067667.habu), today.
articles/hdinsight-aks/flink/flink-job-orchestration.md (3 additions, 6 deletions)
@@ -103,8 +103,7 @@ You can read more details about DAGs, Control Flow, SubDAGs, TaskGroups, etc. di
Example code is available on the [git](https://github.com/Azure-Samples/hdinsight-aks/blob/main/flink/airflow-python-sample-code); download the code locally on your computer and upload the wordcount.py to a blob storage. Follow the [steps](/azure/data-factory/how-does-workflow-orchestration-manager-work#steps-to-import) to import DAG into your workflow created during setup.
- The wordcount.py is an example of orchestrating a Flink job submission using Apache Airflow with HDInsight on AKS. The example is based on the wordcount example provided on [Apache Flink](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/dataset/examples/).
+ The wordcount.py is an example of orchestrating a Flink job submission using Apache Airflow with HDInsight on AKS.
The DAG has two tasks:
- get `OAuth Token`
@@ -164,11 +163,9 @@ The DAG expects to have setup for the Service Principal, as described during the
## Example code
- This is an example of orchestrating data pipeline using Airflow with HDInsight on AKS
- The example is based on wordcount example provided on [Apache Flink](https://nightlies.apache.org/flink/flink-docs-master/docs/dev/dataset/examples/)
+ This is an example of orchestrating data pipeline using Airflow with HDInsight on AKS.
- The DAG expects to have setup for Service Principal for the OAuth Client credential and pass following input configuration for the execution
+ The DAG expects to have setup for Service Principal for the OAuth Client credential and pass following input configuration for the execution:
- - [Hive Dialect in Apache Flink](https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/hive/hive_dialect/#hive-dialect)
+ - [Hive Dialect in Apache Flink](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/table/hive-compatibility/hive-dialect/overview/)
- Apache, Apache Flink, Flink, and associated open source project names are [trademarks](../trademarks.md) of the [Apache Software Foundation](https://www.apache.org/) (ASF).
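The DAG described above pairs an OAuth token fetch with a Flink job submission over REST. A minimal Airflow sketch of that two-task shape follows; the token scope, job endpoint, and payload fields are illustrative placeholders, not the exact values used by the sample wordcount.py.

```python
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

TENANT_ID = "<tenant-id>"          # Service Principal created during setup
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
FLINK_JOB_API = "https://<cluster-endpoint>/<job-api-path>"  # placeholder REST endpoint


def get_oauth_token(ti):
    """Task 1: fetch a token for the Service Principal using the client-credentials flow."""
    resp = requests.post(
        f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": CLIENT_ID,
            "client_secret": CLIENT_SECRET,
            "scope": "<resource-scope>/.default",  # assumed; use the scope your cluster expects
        },
        timeout=30,
    )
    resp.raise_for_status()
    ti.xcom_push(key="access_token", value=resp.json()["access_token"])


def submit_flink_job(ti):
    """Task 2: submit the wordcount job to the cluster's job REST API."""
    token = ti.xcom_pull(key="access_token", task_ids="get_oauth_token")
    resp = requests.post(
        FLINK_JOB_API,
        headers={"Authorization": f"Bearer {token}"},
        json={"jobName": "wordcount"},  # illustrative payload, not the real schema
        timeout=60,
    )
    resp.raise_for_status()


with DAG(
    dag_id="flink_wordcount_orchestration",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    token_task = PythonOperator(task_id="get_oauth_token", python_callable=get_oauth_token)
    submit_task = PythonOperator(task_id="submit_flink_job", python_callable=submit_flink_job)

    token_task >> submit_task
```

Substitute the real job-management endpoint and request body documented for your HDInsight on AKS cluster; the sketch only shows the task ordering and token hand-off via XCom.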
articles/iot-operations/connect-to-cloud/howto-configure-mqtt-bridge.md (1 addition, 1 deletion)
@@ -407,7 +407,7 @@ If using managed identity isn't possible, keep the per-connection limits for Eve
## Bridge from another broker to Azure IoT MQ Preview
- Azure IoT MQ is a compliant MQTT broker and other brokers can bridge to it with the appropriate authentication and authorization credentials. For example, see MQTT bridge documentation for [HiveMQ](https://www.hivemq.com/docs/bridge/4.8/enterprise-bridge-extension/bridge-extension.html), [VerneMQ](https://docs.vernemq.com/configuring-vernemq/bridge), [EMQX](https://www.emqx.io/docs/en/v5/data-integration/data-bridge-mqtt.html), and [Mosquitto](https://mosquitto.org/man/mosquitto-conf-5.html).
+ Azure IoT MQ is a compliant MQTT broker and other brokers can bridge to it with the appropriate authentication and authorization credentials. For example, see MQTT bridge documentation for [HiveMQ](https://www.hivemq.com/docs/bridge/4.8/enterprise-bridge-extension/bridge-extension.html), [VerneMQ](https://docs.vernemq.com/configuring-vernemq/bridge), [EMQX](https://docs.emqx.com/en/enterprise/latest/data-integration/data-bridge-mqtt.html), and [Mosquitto](https://mosquitto.org/man/mosquitto-conf-5.html).
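As a sketch of what the Mosquitto side of such a bridge can look like, the snippet below uses standard `mosquitto.conf` bridge directives; the hostname, topic, certificate path, and credentials are placeholders and must match how the Azure IoT MQ listener is actually configured.

```conf
# Bridge from a local Mosquitto broker to a remote MQTT listener (placeholder values).
connection iotmq-bridge
address my-mq-endpoint.example.com:8883

# Forward the sensors/ subtree in both directions at QoS 1.
topic sensors/# both 1

# TLS trust and client credentials for the remote broker.
bridge_cafile /etc/mosquitto/certs/ca.crt
remote_username client1
remote_password <password>
```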
articles/postgresql/single-server/concepts-aks.md (1 addition, 1 deletion)
@@ -46,7 +46,7 @@ az network nic list --resource-group nodeResourceGroup -o table
A connection pooler minimizes the cost and time associated with creating and closing new connections to the database. The pool is a collection of connections that can be reused.
- There are multiple connection poolers you can use with PostgreSQL. One of these is [PgBouncer](https://pgbouncer.github.io/). In the Microsoft Container Registry, we provide a lightweight containerized PgBouncer that can be used in a sidecar to pool connections from AKS to Azure Database for PostgreSQL. Visit the [docker hub page](https://hub.docker.com/r/microsoft/azureossdb-tools-pgbouncer/) to learn how to access and use this image.
+ There are multiple connection poolers you can use with PostgreSQL. One of these is [PgBouncer](https://pgbouncer.github.io/). In the Microsoft Container Registry, we provide a lightweight containerized PgBouncer that can be used in a sidecar to pool connections from AKS to Azure Database for PostgreSQL. Visit the [docker hub page](https://hub.docker.com/_/microsoft-azure-oss-db-tools-pgbouncer-sidecar) to learn how to access and use this image.
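To illustrate the sidecar pattern described above, here is a minimal, hypothetical Kubernetes sketch: a PgBouncer container beside the application container, configured through a `pgbouncer.ini` supplied from a ConfigMap. The image reference, resource names, and server hostname are placeholders; use the image and settings documented on the linked Docker Hub page.

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: pgbouncer-config            # hypothetical name
data:
  pgbouncer.ini: |
    [databases]
    * = host=myserver.postgres.database.azure.com port=5432
    [pgbouncer]
    listen_addr = 0.0.0.0
    listen_port = 5432
    auth_type = md5
    auth_file = /etc/pgbouncer/userlist.txt
    pool_mode = transaction
    max_client_conn = 200
    default_pool_size = 20
---
apiVersion: v1
kind: Pod
metadata:
  name: app-with-pgbouncer          # hypothetical name
spec:
  containers:
  - name: app
    image: myregistry.azurecr.io/myapp:latest   # your application image
    env:
    - name: PGHOST
      value: "127.0.0.1"            # the app connects to the local PgBouncer sidecar
    - name: PGPORT
      value: "5432"
  - name: pgbouncer
    image: <pgbouncer-sidecar-image>            # placeholder; see the linked Docker Hub page
    ports:
    - containerPort: 5432
    volumeMounts:
    - name: pgbouncer-config
      mountPath: /etc/pgbouncer
  volumes:
  - name: pgbouncer-config
    configMap:
      name: pgbouncer-config
```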