
Commit cef4449

Merge branch 'main' of https://github.com/MicrosoftDocs/azure-docs-pr into heidist-gh
2 parents: c62b3be + 2cdeaec

File tree: 39 files changed (+235, −174 lines)

articles/app-service/toc.yml

Lines changed: 2 additions & 2 deletions

```diff
@@ -29,7 +29,7 @@
     href: quickstart-wordpress.md
   - name: Deploy Go (experimental)
     href: quickstart-golang.md
-  - name: Deployment
+  - name: Deployment and configuration
    items:
    - name: Deployment best practices
      href: deploy-best-practices.md
@@ -55,7 +55,7 @@
      href: configure-language-java.md
    - name: Configure Ruby
      href: configure-language-ruby.md
-    - name: Deploy an app
+    - name: Deployment methods
      items:
      - name: Use ZIP or WAR
        href: deploy-zip.md
```

articles/azure-resource-manager/bicep/deployment-stacks.md

Lines changed: 3 additions & 2 deletions

```diff
@@ -2,7 +2,7 @@
 title: Create & deploy deployment stacks in Bicep
 description: Describes how to create deployment stacks in Bicep.
 ms.topic: conceptual
-ms.date: 07/10/2023
+ms.date: 07/12/2023
 ---

 # Deployment stacks (Preview)
@@ -35,8 +35,9 @@ Deployment stacks provide the following benefits:
 - [What-if](./deploy-what-if.md) isn't available in the preview.
 - Management group scoped deployment stacks can only deploy the template to the subscription scope.
 - When using the Azure CLI create command to modify an existing stack, the deployment process continues regardless of whether you choose _n_ at a prompt. To halt the procedure, use _[CTRL] + C_.
+- There is an issue with the Azure CLI create command when the value `none` is passed to the `deny-settings-mode` parameter. Until the issue is fixed, use `denyDelete` instead of `none`.
 - If you create or modify a deployment stack in the Azure portal, deny settings will be overwritten (support for deny settings in the Azure portal is currently in progress).
-- Management group deployment stacks not yet available in the Azure portal.
+- Management group deployment stacks are not yet available in the Azure portal.


 ## Create deployment stacks
```
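
As a workaround for the `none` issue noted in the limitations above, passing `denyDelete` when creating a subscription-scoped stack might look like the following sketch. The stack name, template file, and location are hypothetical placeholders, and the preview CLI surface may change:

```azurecli
az stack sub create \
  --name myStack \
  --location eastus \
  --template-file ./main.bicep \
  --deny-settings-mode denyDelete
```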

articles/communication-services/concepts/telephony/trial-phone-numbers-faq.md

Lines changed: 4 additions & 1 deletion

```diff
@@ -49,7 +49,10 @@ While the trial phone number itself is provided at no cost during the trial peri
 Verifying the recipient phone number is a security measure that ensures the trial phone number can only make calls to the verified number. This helps protect against misuse and unauthorized usage of trial phone numbers.

 ### How is the recipient phone number verified?
-The verification process involves sending a one-time passcode via SMS to the recipient phone number. The recipient needs to enter this code in the Azure portal to complete the verification.
+The verification process involves sending a one-time passcode via SMS to the recipient phone number. The recipient needs to enter this code in the Azure portal to complete the verification.
+
+### From where can I verify phone numbers?
+Currently, only phone numbers that originate from the United States (that is, numbers with a +1 prefix) can be verified for use with trial phone numbers.

 ### Can I verify multiple recipient phone numbers for the same trial phone number?
 Currently, the trial phone number can be verified for up to three recipient phone numbers. If you need to make calls to more numbers, you'll need to [purchase a phone number](../../quickstarts/telephony/get-phone-number.md).
```

articles/communication-services/concepts/troubleshooting-info.md

Lines changed: 5 additions & 5 deletions

````diff
@@ -162,22 +162,22 @@ Console.WriteLine($"Email operation id = {emailSendOperation.Id}");
 # [JavaScript](#tab/javascript)

 The Azure Communication Services Calling SDK relies internally on the [@azure/logger](https://www.npmjs.com/package/@azure/logger) library to control logging.
-Use the `setLogLevel` method from the `@azure/logger` package to configure the log output:
+Use the `setLogLevel` method from the `@azure/logger` package to configure the log output level. Create a logger and pass it into the CallClient constructor:

 ```javascript
-import { setLogLevel } from '@azure/logger';
+import { setLogLevel, createClientLogger, AzureLogger } from '@azure/logger';
 setLogLevel('verbose');
-const callClient = new CallClient();
+let logger = createClientLogger('ACS');
+const callClient = new CallClient({ logger });
 ```

 You can use AzureLogger to redirect the logging output from Azure SDKs by overriding the `AzureLogger.log` method.
 This may be useful if you want to redirect logs to a location other than the console.

 ```javascript
-import { AzureLogger } from '@azure/logger';
 // redirect log output
 AzureLogger.log = (...args) => {
-  console.log(...args); // to console, file, buffer, REST API..
+  console.log(...args); // to console, file, buffer, REST API, etc.
 };
 ```
````
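
Putting the two updated fragments above together, a complete logging setup reads as follows. This is a minimal sketch, assuming `CallClient` is imported from the `@azure/communication-calling` package and that a plain console sink is enough for the redirected output:

```javascript
import { setLogLevel, createClientLogger, AzureLogger } from '@azure/logger';
import { CallClient } from '@azure/communication-calling';

// Emit everything up to verbose-level output.
setLogLevel('verbose');

// Create a named client logger and hand it to the CallClient.
const logger = createClientLogger('ACS');
const callClient = new CallClient({ logger });

// Redirect all Azure SDK log lines to a custom sink.
AzureLogger.log = (...args) => {
  console.log(...args); // replace with a file writer, buffer, or REST call
};
```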

articles/communication-services/samples/chat-hero-sample.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -26,7 +26,7 @@ In this Sample quickstart, we'll learn how the sample works before we run the sa

 ## Overview

-The sample has both a client-side application and a server-side application. The **client-side application** is a React/Redux web application that uses Microsoft's Fluent UI framework. This application sends requests to an ASP.NET Core **server-side application** that helps the client-side application connect to Azure.
+The sample has both a client-side application and a server-side application. The **client-side application** is a React/Redux web application that uses Microsoft's Fluent UI framework. This application sends requests to a Node.js **server-side application** that helps the client-side application connect to Azure.

 Here's what the sample looks like:
```

articles/data-factory/connector-salesforce.md

Lines changed: 52 additions & 1 deletion

````diff
@@ -164,6 +164,57 @@ The following properties are supported for the Salesforce linked service.
 }
 ```

+**Example: Store credentials in Key Vault, along with environmentUrl and username**
+
+Note that by doing so, you'll no longer be able to use the UI to edit settings. The ***Specify dynamic contents in JSON format*** checkbox is selected, and you have to edit this configuration entirely by hand. The advantage is that you can derive all configuration settings from the key vault instead of parameterizing anything here.
+
+```json
+{
+    "name": "SalesforceLinkedService",
+    "properties": {
+        "type": "Salesforce",
+        "typeProperties": {
+            "environmentUrl": {
+                "type": "AzureKeyVaultSecret",
+                "secretName": "<secret name of environment URL in AKV>",
+                "store": {
+                    "referenceName": "<Azure Key Vault linked service>",
+                    "type": "LinkedServiceReference"
+                }
+            },
+            "username": {
+                "type": "AzureKeyVaultSecret",
+                "secretName": "<secret name of username in AKV>",
+                "store": {
+                    "referenceName": "<Azure Key Vault linked service>",
+                    "type": "LinkedServiceReference"
+                }
+            },
+            "password": {
+                "type": "AzureKeyVaultSecret",
+                "secretName": "<secret name of password in AKV>",
+                "store": {
+                    "referenceName": "<Azure Key Vault linked service>",
+                    "type": "LinkedServiceReference"
+                }
+            },
+            "securityToken": {
+                "type": "AzureKeyVaultSecret",
+                "secretName": "<secret name of security token in AKV>",
+                "store": {
+                    "referenceName": "<Azure Key Vault linked service>",
+                    "type": "LinkedServiceReference"
+                }
+            }
+        },
+        "connectVia": {
+            "referenceName": "<name of Integration Runtime>",
+            "type": "IntegrationRuntimeReference"
+        }
+    }
+}
+```
+
 ## Dataset properties

 For a full list of sections and properties available for defining datasets, see the [Datasets](concepts-datasets-linked-services.md) article. This section provides a list of properties supported by the Salesforce dataset.
@@ -381,4 +432,4 @@ To learn details about the properties, check [Lookup activity](control-flow-look


 ## Next steps
-For a list of data stores supported as sources and sinks by the copy activity, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
+For a list of data stores supported as sources and sinks by the copy activity, see [Supported data stores](copy-activity-overview.md#supported-data-stores-and-formats).
````

articles/data-factory/continuous-integration-delivery-sample-script.md

Lines changed: 39 additions & 3 deletions

````diff
@@ -24,7 +24,7 @@ Install the latest Azure PowerShell modules by following instructions in [How to
 >[!WARNING]
 >Make sure to use **PowerShell Core** in the ADO task to run the script

-## Pre- and post-deployment script 
+## Pre- and post-deployment script
 The sample scripts to stop/start triggers and update global parameters during the release process (CI/CD) are located in the [Azure Data Factory Official GitHub page](https://github.com/Azure/Azure-DataFactory/tree/main/SamplesV2/ContinuousIntegrationAndDelivery).

 > [!NOTE]
@@ -36,12 +36,12 @@ The sample scripts to stop/ start triggers and update global parameters during r
 The following sample script can be used to stop triggers before deployment and restart them afterward. The script also includes code to delete resources that have been removed. Save the script in an Azure DevOps git repository and reference it via an Azure PowerShell task, using the latest Azure PowerShell version.


-When running a pre-deployment script, you will need to specify a variation of the following parameters in the **Script Arguments** field.
+When running a predeployment script, you need to specify a variation of the following parameters in the **Script Arguments** field.

 `-armTemplate "$(System.DefaultWorkingDirectory)/<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $true -deleteDeployment $false`


-When running a post-deployment script, you will need to specify a variation of the following parameters in the **Script Arguments** field.
+When running a postdeployment script, you need to specify a variation of the following parameters in the **Script Arguments** field.

 `-armTemplate "$(System.DefaultWorkingDirectory)/<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $false -deleteDeployment $true`

@@ -50,6 +50,42 @@ When running a post-deployment script, you will need to specify a variation of t

 :::image type="content" source="media/continuous-integration-delivery/continuous-integration-image11.png" alt-text="Azure PowerShell task":::

+## Script execution and parameters - YAML Pipelines
+The following YAML code executes a script that can be used to stop triggers before deployment and restart them afterward. The script also includes code to delete resources that have been removed. If you're following the steps outlined in [New CI/CD Flow](continuous-integration-delivery-improvements.md), this script is exported as part of the artifact created via the npm publish package.
+
+### Stop ADF Triggers
+```yaml
+- task: AzurePowerShell@5
+  displayName: Stop ADF Triggers
+  inputs:
+    scriptType: 'FilePath'
+    ConnectedServiceNameARM: AzureDevServiceConnection
+    scriptPath: ../ADFTemplates/PrePostDeploymentScript.ps1
+    ScriptArguments: -armTemplate "<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $true -deleteDeployment $false
+    errorActionPreference: stop
+    FailOnStandardError: False
+    azurePowerShellVersion: OtherVersion
+    preferredAzurePowerShellVersion: 3.1.0
+    pwsh: False
+    workingDirectory: ../
+```
+
+### Start ADF Triggers
+```yaml
+- task: AzurePowerShell@5
+  displayName: Start ADF Triggers
+  inputs:
+    scriptType: 'FilePath'
+    ConnectedServiceNameARM: AzureDevServiceConnection
+    scriptPath: ../ADFTemplates/PrePostDeploymentScript.ps1
+    ScriptArguments: -armTemplate "<your-arm-template-location>" -ResourceGroupName <your-resource-group-name> -DataFactoryName <your-data-factory-name> -predeployment $false -deleteDeployment $true
+    errorActionPreference: stop
+    FailOnStandardError: False
+    azurePowerShellVersion: OtherVersion
+    preferredAzurePowerShellVersion: 3.1.0
+    pwsh: False
+    workingDirectory: ../
+```

 ## Next steps
````

articles/data-factory/how-to-create-event-trigger.md

Lines changed: 1 addition & 4 deletions

```diff
@@ -66,7 +66,7 @@ This section shows you how to create a storage event trigger within the Azure Da

 1. Select whether or not your trigger ignores blobs with zero bytes.

-1. After you configure you trigger, click on **Next: Data preview**. This screen shows the existing blobs matched by your storage event trigger configuration. Make sure you've specific filters. Configuring filters that are too broad can match a large number of files created/deleted and may significantly impact your cost. Once your filter conditions have been verified, click **Finish**.
+1. After you configure your trigger, click **Next: Data preview**. This screen shows the existing blobs matched by your storage event trigger configuration. Make sure you have specific filters. Configuring filters that are too broad can match a large number of files created/deleted and may significantly impact your cost. Once your filter conditions have been verified, click **Finish**.

 :::image type="content" source="media/how-to-create-event-trigger/event-based-trigger-image-3.png" alt-text="Screenshot of storage event trigger preview page.":::

@@ -78,9 +78,6 @@

 In the preceding example, the trigger is configured to fire when a blob path ending in .csv is created in the folder _event-testing_ in the container _sample-data_. The **folderPath** and **fileName** properties capture the location of the new blob. For example, when MoviesDB.csv is added to the path sample-data/event-testing, `@triggerBody().folderPath` has a value of `sample-data/event-testing` and `@triggerBody().fileName` has a value of `moviesDB.csv`. These values are mapped, in the example, to the pipeline parameters `sourceFolder` and `sourceFile`, which can be used throughout the pipeline as `@pipeline().parameters.sourceFolder` and `@pipeline().parameters.sourceFile` respectively (see the sketch after this diff).

-> [!NOTE]
-> If you are creating your pipeline and trigger in [Azure Synapse Analytics](../synapse-analytics/overview-what-is.md), you must use `@trigger().outputs.body.fileName` and `@trigger().outputs.body.folderPath` as parameters. Those two properties capture blob information. Use those properties instead of using `@triggerBody().fileName` and `@triggerBody().folderPath`.
-
 1. Click **Finish** once you are done.

 ## JSON schema
```
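
To make the parameter mapping above concrete, the trigger's pipeline reference carries the two expressions in its `parameters` block. A minimal sketch of that portion of the trigger JSON — the pipeline name `CopyNewBlobPipeline` is hypothetical, and the trigger's `type` and `typeProperties` are omitted here (the full schema appears in the JSON schema section of the article):

```json
{
    "properties": {
        "pipelines": [
            {
                "pipelineReference": {
                    "type": "PipelineReference",
                    "referenceName": "CopyNewBlobPipeline"
                },
                "parameters": {
                    "sourceFolder": "@triggerBody().folderPath",
                    "sourceFile": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```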

articles/data-factory/how-to-send-notifications-to-teams.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -329,7 +329,7 @@ Before you can send notifications to Teams from your pipelines, you must create
 "targets": [
     {
         "os": "default",
-        "uri": "@{concat('https://synapse.azure.com/monitoring/pipelineruns/',pipeline().parameters.runId,'?factory=/subscriptions/',pipeline().parameters.subscription,'/resourceGroups/',pipeline().parameters.resourceGroup,'/providers/Microsoft.DataFactory/factories/',pipeline().DataFactory)}"
+        "uri": "@{concat('https://web.azuresynapse.net/monitoring/pipelineruns/',pipeline().parameters.runId,'?workspace=%2Fsubscriptions%2F',pipeline().parameters.subscription,'%2FresourceGroups%2F',pipeline().parameters.resourceGroup,'%2Fproviders%2FMicrosoft.Synapse%2Fworkspaces%2F',pipeline().DataFactory)}"
     }
 ]
 }
```

articles/data-factory/tutorial-incremental-copy-change-data-capture-feature-portal.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -45,7 +45,7 @@ In this tutorial, you create a pipeline that performs the following operations:
 If you don't have an Azure subscription, create a [free](https://azure.microsoft.com/free/) account before you begin.

 ## Prerequisites
-* **Azure SQL Database Managed Instance**. You use the database as the **source** data store. If you don't have an Azure SQL Database Managed Instance, see the [Create an Azure SQL Database Managed Instance](/azure/azure-sql/managed-instance/instance-create-quickstart) article for steps to create one.
+* **Azure SQL Managed Instance**. You use the database as the **source** data store. If you don't have an Azure SQL Managed Instance, see the [Create an Azure SQL Database Managed Instance](/azure/azure-sql/managed-instance/instance-create-quickstart) article for steps to create one.
 * **Azure Storage account**. You use the blob storage as the **sink** data store. If you don't have an Azure storage account, see the [Create a storage account](../storage/common/storage-account-create.md) article for steps to create one. Create a container named **raw**.

 ### Create a data source table in Azure SQL Database
@@ -76,7 +76,7 @@
     EXEC sys.sp_cdc_enable_table
     @source_schema = 'dbo',
     @source_name = 'customers',
-    @role_name = 'null',
+    @role_name = NULL,
     @supports_net_changes = 1
 ```
 5. Insert data into the customers table by running the following command:
````
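
Because the table is enabled with `@supports_net_changes = 1`, the captured changes can later be read back through the generated net-changes function. A minimal sketch, assuming the default capture instance name `dbo_customers` that `sys.sp_cdc_enable_table` creates for this table:

```sql
DECLARE @from_lsn binary(10), @to_lsn binary(10);

-- LSN range covering everything captured so far for this instance
SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_customers');
SET @to_lsn = sys.fn_cdc_get_max_lsn();

-- One net row per changed primary key within the interval
SELECT * FROM cdc.fn_cdc_get_net_changes_dbo_customers(@from_lsn, @to_lsn, 'all');
```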
