
Commit cfc4e2a

Merge pull request #99558 from MicrosoftDocs/master
12/20 AM Publish
2 parents 7f24885 + 71bb905 commit cfc4e2a

58 files changed, +1152 −159 lines changed


articles/active-directory/manage-apps/methods-for-assigning-users-and-groups.md

Lines changed: 4 additions & 3 deletions
@@ -30,10 +30,11 @@ This article shows you how to assign users or groups to an application in Azure
 
 The availability of group-based assignment is determined by your license agreement. Group-based assignment is supported for Security groups only. Nested group memberships and O365 groups are not currently supported.
 
-## Prerequisites
-Before you can assign users and groups to an application, you must require user assignment. To require user assignment:
+## Configure the application to require assignment
 
-1. Log in to the Azure portal with an administrator account.
+An application can be configured to require assignment before it can be accessed. To require assignment:
+
+1. Log in to the Azure portal with an administrator account, or as an owner of the app under **Enterprise apps**.
 2. Click on the **All services** item in the main menu.
 3. Choose the directory you are using for the application.
 4. Click on the **Enterprise applications** tab.
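The steps in this hunk use the portal. For reference, a minimal PowerShell sketch of the same "require assignment" setting, assuming the AzureAD module and a hypothetical app display name of "MyApp":

```powershell
# Requires the AzureAD PowerShell module; "MyApp" is a placeholder display name.
Connect-AzureAD

# Find the service principal (the enterprise application) for the app.
$sp = Get-AzureADServicePrincipal -SearchString "MyApp"

# Require that users and groups be assigned before they can access the application.
Set-AzureADServicePrincipal -ObjectId $sp.ObjectId -AppRoleAssignmentRequired $true
```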

articles/advisor/advisor-cost-recommendations.md

Lines changed: 6 additions & 4 deletions
@@ -16,7 +16,7 @@ Advisor helps you optimize and reduce your overall Azure spend by identifying id
 
 ## Optimize virtual machine spend by resizing or shutting down underutilized instances
 
-Although certain application scenarios can result in low utilization by design, you can often save money by managing the size and number of your virtual machines. Advisor advanced evaluation models considers a virtual machines for shut-down when P95th of the max of max value of CPU utilization is less than 3% and network utilization is less than 2% over a 7 day period. Virtual machines are considered for right size when it is possible to fit the current load in a smaller SKU (within the same SKU family) or a smaller number # of instance such that the current load doesn’t go over 80% utilization when non-user facing workloads and not above 40% when user-facing workload. Here, the type of workload is determined by analyzing the CPU utilization characteristics of the workload.
+Although certain application scenarios can result in low utilization by design, you can often save money by managing the size and number of your virtual machines. Advisor advanced evaluation models considers virtual machines for shut-down when P95th of the max of max value of CPU utilization is less than 3% and network utilization is less than 2% over a 7 day period. Virtual machines are considered for right size when it is possible to fit the current load in a smaller SKU (within the same SKU family) or a smaller number # of instance such that the current load doesn’t go over 80% utilization when non-user facing workloads and not above 40% when user-facing workload. Here, the type of workload is determined by analyzing the CPU utilization characteristics of the workload.
 
 The recommended actions are shut-down or resize, specific to resource being recommended for. Advisor shows you the estimated cost savings for either recommended actions - resize or shut-down. Also, for resize recommended action, Advisor provides current and target SKU information.
 
@@ -45,13 +45,15 @@ Advisor identifies public IP addresses that are not currently associated to Azure
 Azure Advisor will detect Azure Data Factory pipelines that repeatedly fail and recommend that you resolve the issues or delete the failing pipelines if they are no longer needed. You will be billed for these pipelines even if though they are not serving you while they are failing.
 
 ## Use Standard Snapshots for Managed Disks
-To save 60% of cost, we recommend storing your snapshots in Standard Storage, regardless of the storage type of the parent disk. This is the default option for Managed Disks snapshots. Azure Advisor will identify snapshots that are stored Premium Storage and recommend migrating your snapshot from Premium to Standard Storage. [Learn more about Managed Disk pricing](https://aka.ms/aa_manageddisksnapshot_learnmore)
+To save 60% of cost, we recommend storing your snapshots in Standard Storage, regardless of the storage type of the parent disk. This option is the default option for Managed Disks snapshots. Azure Advisor will identify snapshots that are stored Premium Storage and recommend migrating your snapshot from Premium to Standard Storage. [Learn more about Managed Disk pricing](https://aka.ms/aa_manageddisksnapshot_learnmore)
 
 ## How to access Cost recommendations in Azure Advisor
 
-1. Sign in to the [Azure portal](https://portal.azure.com), and then open [Advisor](https://aka.ms/azureadvisordashboard).
+1. Sign in to the [Azure portal](https://portal.azure.com).
 
-2. On the Advisor dashboard, click the **Cost** tab.
+1. Search for and select [**Advisor**](https://aka.ms/azureadvisordashboard) from any page.
+
+1. On the **Advisor** dashboard, select the **Cost** tab.
 
 ## Next steps
 
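The same cost recommendations can also be pulled programmatically. A minimal sketch, assuming the Az.Advisor PowerShell module and an authenticated Az session; the resource group name is a placeholder:

```powershell
# Sign in and select the subscription to inspect.
Connect-AzAccount

# List Advisor cost recommendations for the current subscription,
# including the shut-down/resize guidance described above.
Get-AzAdvisorRecommendation -Category Cost

# Optionally narrow the results to a single resource group (placeholder name).
Get-AzAdvisorRecommendation -Category Cost -ResourceGroupName "my-resource-group"
```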

articles/api-management/api-management-role-based-access-control.md

Lines changed: 8 additions & 8 deletions
@@ -24,17 +24,17 @@ Azure API Management relies on Azure Role-Based Access Control (RBAC) to enable
 
 ## Built-in roles
 
-API Management currently provides three built-in roles and will add two more roles in the near future. These roles can be assigned at different scopes, including subscription, resource group, and individual API Management instance. For instance, if you assign the "Azure API Management Service Reader" role to a user at the resource-group level, then the user has read access to all API Management instances inside the resource group.
+API Management currently provides three built-in roles and will add two more roles in the near future. These roles can be assigned at different scopes, including subscription, resource group, and individual API Management instance. For instance, if you assign the "API Management Service Reader" role to a user at the resource-group level, then the user has read access to all API Management instances inside the resource group.
 
 The following table provides brief descriptions of the built-in roles. You can assign these roles by using the Azure portal or other tools, including Azure [PowerShell](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-powershell), [Azure CLI](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-cli), and [REST API](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-rest). For details about how to assign built-in roles, see [Use role assignments to manage access to your Azure subscription resources](https://docs.microsoft.com/azure/role-based-access-control/role-assignments-portal).
 
 | Role | Read access<sup>[1]</sup> | Write access<sup>[2]</sup> | Service creation, deletion, scaling, VPN, and custom domain configuration | Access to the legacy publisher portal | Description
 | ------------- | ---- | ---- | ---- | ---- | ----
-| Azure API Management Service Contributor ||||| Super user. Has full CRUD access to API Management services and entities (for example, APIs and policies). Has access to the legacy publisher portal. |
-| Azure API Management Service Reader || | || Has read-only access to API Management services and entities. |
-| Azure API Management Service Operator || || | Can manage API Management services, but not entities.|
-| Azure API Management Service Editor<sup>*</sup> ||| | | Can manage API Management entities, but not services.|
-| Azure API Management Content Manager<sup>*</sup> || | || Can manage the developer portal. Read-only access to services and entities.|
+| API Management Service Contributor ||||| Super user. Has full CRUD access to API Management services and entities (for example, APIs and policies). Has access to the legacy publisher portal. |
+| API Management Service Reader || | || Has read-only access to API Management services and entities. |
+| API Management Service Operator || || | Can manage API Management services, but not entities.|
+| API Management Service Editor<sup>*</sup> ||| | | Can manage API Management entities, but not services.|
+| API Management Content Manager<sup>*</sup> || | || Can manage the developer portal. Read-only access to services and entities.|
 
 <sup>[1] Read access to API Management services and entities (for example, APIs and policies).</sup>
 
@@ -49,7 +49,7 @@ If none of the built-in roles meet your specific needs, custom roles can be created
 > [!NOTE]
 > To be able to see an API Management instance in the Azure portal, a custom role must include the ```Microsoft.ApiManagement/service/read``` action.
 
-When you create a custom role, it's easier to start with one of the built-in roles. Edit the attributes to add **Actions**, **NotActions**, or **AssignableScopes**, and then save the changes as a new role. The following example begins with the "Azure API Management Service Reader" role and creates a custom role called "Calculator API Editor." You can assign the custom role to a specific API. Consequently, this role only has access to that API.
+When you create a custom role, it's easier to start with one of the built-in roles. Edit the attributes to add **Actions**, **NotActions**, or **AssignableScopes**, and then save the changes as a new role. The following example begins with the "API Management Service Reader" role and creates a custom role called "Calculator API Editor." You can assign the custom role to a specific API. Consequently, this role only has access to that API.
 
 ```powershell
 $role = Get-AzRoleDefinition "API Management Service Reader Role"
@@ -79,4 +79,4 @@ To learn more about Role-Based Access Control in Azure, see the following articles:
 * [Get started with access management in the Azure portal](../role-based-access-control/overview.md)
 * [Use role assignments to manage access to your Azure subscription resources](../role-based-access-control/role-assignments-portal.md)
 * [Custom roles in Azure RBAC](../role-based-access-control/custom-roles.md)
-* [Azure Resource Manager resource provider operations](../role-based-access-control/resource-provider-operations.md#microsoftapimanagement)
+* [Azure Resource Manager resource provider operations](../role-based-access-control/resource-provider-operations.md#microsoftapimanagement)
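The PowerShell block in this file is cut off by the hunk boundary. A minimal sketch of how the "Calculator API Editor" custom role described above might be completed; the extra actions, description, and scope are illustrative, and the subscription ID, resource group, service name, and API ID are placeholders:

```powershell
# Start from the built-in reader role and reshape it into a custom role.
$role = Get-AzRoleDefinition "API Management Service Reader Role"
$role.Id = $null
$role.Name = "Calculator API Editor"
$role.Description = "Can edit the Calculator API"

# Add write permissions on APIs on top of the inherited read actions.
$role.Actions.Add("Microsoft.ApiManagement/service/apis/write")
$role.Actions.Add("Microsoft.ApiManagement/service/apis/*/write")

# Scope the role to a single API so it only has access to that API.
$role.AssignableScopes.Clear()
$role.AssignableScopes.Add("/subscriptions/<subscription ID>/resourceGroups/<resource group>/providers/Microsoft.ApiManagement/service/<service name>/apis/<API ID>")

# Create the custom role definition.
New-AzRoleDefinition -Role $role
```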

articles/azure-functions/functions-versions.md

Lines changed: 1 addition & 1 deletion
@@ -111,7 +111,7 @@ The version of the Functions runtime used by published apps in Azure is dictated
 
 ### Locally developed application versions
 
-You can make the following updates function apps locally change targeted versions.
+You can make the following updates to function apps to locally change the targeted versions.
 
 #### Visual Studio runtime versions
 
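As a quick check of what a locally developed project currently targets, here is a small sketch; it assumes the Azure Functions Core Tools are installed and uses a hypothetical project path:

```powershell
# Show the version of the locally installed Azure Functions Core Tools.
func --version

# Show the Functions version a Visual Studio project targets (placeholder .csproj path).
Select-String -Path .\MyFunctionApp\MyFunctionApp.csproj -Pattern "<AzureFunctionsVersion>"
```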

articles/cognitive-services/Speech-Service/batch-transcription.md

Lines changed: 26 additions & 24 deletions
@@ -78,29 +78,26 @@ Configuration parameters are provided as JSON:
 }
 ```
 
-> [!NOTE]
-> The Batch Transcription API uses a REST service for requesting transcriptions, their status, and associated results. You can use the API from any language. The next section describes how the API is used.
-
 ### Configuration properties
 
 Use these optional properties to configure transcription:
 
 | Parameter | Description |
-|-----------|-------------|
-| `ProfanityFilterMode` | Specifies how to handle profanity in recognition results. Accepted values are `None`, which disables profanity filtering; `Masked`, which replaces profanity with asterisks; `Removed`, which removes all profanity from the result; `Tags`, which adds "profanity" tags. The default setting is `Masked`. |
-| `PunctuationMode` | Specifies how to handle punctuation in recognition results. Accepted values are `None`, which disables punctuation; `Dictated`, which implies explicit punctuation; `Automatic`, which lets the decoder deal with punctuation; `DictatedAndAutomatic`, which implies dictated punctuation marks or automatic (the default value). |
-| `AddWordLevelTimestamps` | Specifies if word level timestamps should be added to the output. Accepted values are `True`, which enables word level timestamps; `False`, to disable it (the default value). |
-| `AddSentiment` | Specifies sentiment should be added to the utterance. Accepted values are `True`, which enables sentiment per utterance; `False`, to disable it (the default value).|
-| `AddDiarization` | Specifies that diarization analysis should be carried out on the input that is expected to be mono channel containing two voices. Accepted values are `True`, which enables diarization; `False`, to disable it (the default value). It also requires `AddWordLevelTimestamps` to be set to true.|
-| `TranscriptionResultsContainerUrl` | An optional SAS token to a writeable container in Azure. The result will be stored in this container. |
+|-----------|------------|
+|`ProfanityFilterMode`|Specifies how to handle profanity in recognition results
+||**`Masked`** - Default. Replaces profanity with asterisks<br>`None` - Disables profanity filtering<br>`Removed` - Removes all profanity from the result<br>`Tags` - Adds profanity tags
+|`PunctuationMode`|Specifies to handle punctuation in recognition results
+||`Automatic` - The service inserts punctuation<br>`Dictated` - Dictated (spoken) punctuation<br>**`DictatedAndAutomatic`** - Default. Dictated and automatic punctuation<br>`None` - Disables punctuation
+|`AddWordLevelTimestamps`|Specifies if word level timestamps should be added to the output
+||`True` - Enables word level timestamps<br>**`False`** - Default. Disable word level timestamps
+|`AddSentiment`|Specifies if sentiment analysis is added to the utterance
+||`True` - Enables sentiment per utterance<br>**`False`** - Default. Disable sentiment
+|`AddDiarization`|Specifies if diarization analysis is carried out. If `true`, the input is expected to be mono channel audio containing a maximum of two voices. `AddWordLevelTimestamps` needs to be set to `true`
+||`True` - Enables diarization<br>**`False`** - Default. Disable diarization
+|`TranscriptionResultsContainerUrl`|Optional SAS token to a writeable container in Azure. The result will be stored in this container
 
 ### Storage
 
-Lexical` | The lexical form of the recognized text: the actual words recognized. |
-| `ITN` | The inverse-text-normalized ("canonical") form of the recognized text, with phone numbers, numbers, abbreviations ("doctor smith" to "dr smith"), and other transformations applied. |
-| `MaskedITN` | The ITN form with profanity masking applied, if requested. |
-| `Display` | The display form of the recognized text, with punctuation and capitalization added. This parameter is the same as `DisplayText` provided when format is set to `simple`. |
-
 Batch transcription supports [Azure Blob storage](https://docs.microsoft.com/azure/storage/blobs/storage-blobs-overview) for reading audio and writing transcriptions to storage.
 
 ## The batch transcription result
@@ -117,14 +114,10 @@ For mono input audio, one transcription result file is being created. For stereo
   "CombinedResults": [
     {
       "ChannelNumber": null 'always null'
-      "Lexical": string 'the actual words recognized'
-      "ITN": string 'inverse-text-normalized form
-                     of the recognized text, with phone numbers,
-                     abbreviations ("doctor smith" to "dr smith"),
-                     and other transformations applied'
-      "MaskedITN": string 'The ITN form with profanity masking applied'
-      "Display": string 'The display form of the recognized text
-                         with punctuation and capitalization added'
+      "Lexical": string
+      "ITN": string
+      "MaskedITN": string
+      "Display": string
     }
   ]
   SegmentResults:[ 'for each individual segment'
@@ -148,7 +141,7 @@ For mono input audio, one transcription result file is being created. For stereo
       "MaskedITN": string
       "Display": string
       "Sentiment":
-        { 'this is ommitted if sentiment is
+        { 'this is omitted if sentiment is
           not requested'
         "Negative": number 'between 0 and 1'
        "Neutral": number 'between 0 and 1'
@@ -172,6 +165,15 @@ For mono input audio, one transcription result file is being created. For stereo
 }
 ```
 
+The result contains these forms:
+
+|Form|Content|
+|-|-|
+|`Lexical`|The actual words recognized.
+|`ITN`|Inverse-text-normalized form of the recognized text. Abbreviations ("doctor smith" to "dr smith"), phone numbers, and other transformations are applied.
+|`MaskedITN`|The ITN form with profanity masking applied.
+|`Display`|The display form of the recognized text. This includes added punctuation and capitalization.
+
 ## Speaker separation (Diarization)
 
 Diarization is the process of separating speakers in a piece of audio. Our Batch pipeline supports diarization and is capable of recognizing two speakers on mono channel recordings. The feature is not available on stereo recordings.
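The configuration properties above are sent as part of the transcription request. A minimal PowerShell sketch of submitting such a request, assuming the v2.0 REST endpoint and a `properties` bag matching the configuration JSON referenced in the first hunk; the region, subscription key, and blob SAS URL are placeholders, so verify the exact endpoint and payload against the current Batch Transcription API reference:

```powershell
# Placeholder values - replace with your own region, Speech subscription key, and audio SAS URL.
$region = "westus"
$key    = "<your Speech subscription key>"

$body = @{
    recordingsUrl = "<SAS URL of the audio blob to transcribe>"
    locale        = "en-US"
    name          = "Batch transcription example"
    properties    = @{
        ProfanityFilterMode    = "Masked"
        PunctuationMode        = "DictatedAndAutomatic"
        AddWordLevelTimestamps = "True"
        AddDiarization         = "True"
    }
} | ConvertTo-Json -Depth 5

# POST the transcription request; the endpoint below is an assumption based on the v2.0 API.
Invoke-RestMethod -Method Post `
    -Uri "https://$region.cris.ai/api/speechtotext/v2.0/transcriptions" `
    -Headers @{ "Ocp-Apim-Subscription-Key" = $key } `
    -ContentType "application/json" `
    -Body $body
```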

articles/cosmos-db/how-to-use-stored-procedures-triggers-udfs.md

Lines changed: 2 additions & 2 deletions
@@ -139,7 +139,7 @@ The following example shows how to register a stored procedure by using the JavaScript SDK
 ```javascript
 const container = client.database("myDatabase").container("myContainer");
 const sprocId = "spCreateToDoItem";
-await container.storedProcedures.create({
+await container.scripts.storedProcedures.create({
   id: sprocId,
   body: require(`../js/${sprocId}`)
 });
@@ -156,7 +156,7 @@ const newItem = [{
 }];
 const container = client.database("myDatabase").container("myContainer");
 const sprocId = "spCreateToDoItem";
-const {body: result} = await container.storedProcedure(sprocId).execute(newItem, {partitionKey: newItem[0].category});
+const {body: result} = await container.scripts.storedProcedure(sprocId).execute(newItem, {partitionKey: newItem[0].category});
 ```
 
 ### Stored procedures - Python SDK

articles/cosmos-db/sql-query-order-by.md

Lines changed: 3 additions & 0 deletions
@@ -45,6 +45,9 @@ ORDER BY <sort_specification>
 ## Remarks
 
 The ORDER BY clause requires that the indexing policy include an index for the fields being sorted. The Azure Cosmos DB query runtime supports sorting against a property name and not against computed properties. Azure Cosmos DB supports multiple ORDER BY properties. In order to run a query with multiple ORDER BY properties, you should define a [composite index](index-policy.md#composite-indexes) on the fields being sorted.
+
+> [!Note]
+> When using the .NET SDK 3.4.0 or above, if the properties being sorted against might be undefined for some documents then you need to explicitly create an index on those properties. The default indexing policy will not allow for the retrieval of the documents where the sort property is undefined.
 
 ## Examples
 

articles/data-factory/concepts-data-flow-overview.md

Lines changed: 3 additions & 1 deletion
@@ -7,7 +7,7 @@ ms.reviewer: daperlov
 ms.service: data-factory
 ms.topic: conceptual
 ms.custom: seo-lt-2019
-ms.date: 10/7/2019
+ms.date: 12/19/2019
 ---
 
 # What are mapping data flows?
@@ -56,6 +56,8 @@ If you execute data flows in a pipeline in parallel, ADF will spin-up separate A
 
 Of these three options, this option will likely execute in the shortest amount of time. However, each parallel data flow will execute at the same time on separate clusters, so the ordering of events is non-deterministic.
 
+If you are executing your data flow activities in parallel inside your pipelines, it is recommended to not use TTL. This is because parallel executions of data flows simultaneously using the same Azure Integration Runtime will result in multiple warm pool instances for your data factory.
+
 ##### Overload single data flow
 
 If you put all of your logic inside a single data flow, ADF will all execute in that same job execution context on a single Spark cluster instance.
