Commit fa2f5c7

Merge pull request #220506 from MicrosoftDocs/repo_sync_working_branch
Confirm merge from repo_sync_working_branch to main to sync with https://github.com/MicrosoftDocs/azure-docs (branch main)
2 parents: 6d7b880 + e1cf6be

File tree

7 files changed

+10
-10
lines changed


articles/active-directory/conditional-access/workload-identity.md

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ ms.collection: M365-identity-device-management
 ---
 # Conditional Access for workload identities

-Conditional Access policies have histroically applied only to users when they access apps and services like SharePoint online or the Azure portal. We are now extending support for Conditional Access policies to be applied to service principals owned by the organization. We call this capability Conditional Access for workload identities.
+Conditional Access policies have historically applied only to users when they access apps and services like SharePoint online or the Azure portal. We are now extending support for Conditional Access policies to be applied to service principals owned by the organization. We call this capability Conditional Access for workload identities.

 A [workload identity](../develop/workload-identities-overview.md) is an identity that allows an application or service principal access to resources, sometimes in the context of a user. These workload identities differ from traditional user accounts as they:

articles/azure-functions/configure-monitoring.md

Lines changed: 1 addition & 1 deletion
@@ -416,7 +416,7 @@ To configure these values at App settings level (and avoid redeployment on just
 | Host.json path | App setting |
 |----------------|-------------|
 | logging.logLevel.default | AzureFunctionsJobHost__logging__logLevel__default |
-| logging.logLevel.Host.Aggregator | AzureFunctionsJobHost__logging__logLevel__Host__Aggregator |
+| logging.logLevel.Host.Aggregator | AzureFunctionsJobHost__logging__logLevel__Host.Aggregator |
 | logging.logLevel.Function | AzureFunctionsJobHost__logging__logLevel__Function |
 | logging.logLevel.Function.Function1 | AzureFunctionsJobHost__logging__logLevel__Function.Function1 |
 | logging.logLevel.Function.Function1.User | AzureFunctionsJobHost__logging__logLevel__Function.Function1.User |
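The corrected row in this diff reflects the general naming rule the table implies: the host.json hierarchy segments are joined with double underscores after the `AzureFunctionsJobHost` prefix, while dots *inside* a log-category name (such as `Host.Aggregator` or `Function.Function1.User`) are preserved. A minimal sketch of that rule, assuming the two-level `logging.logLevel` hierarchy shown in the table (the helper function name is hypothetical, for illustration only):

```python
def host_json_to_app_setting(path: str) -> str:
    """Map a host.json logging path to its app-setting name.

    Hierarchy separators become '__'; dots inside the log-category
    name itself (everything after logging.logLevel) are kept as-is.
    """
    parts = path.split(".")
    # First two segments are hierarchy levels; the rest is the category name.
    hierarchy, category = parts[:2], ".".join(parts[2:])
    pieces = ["AzureFunctionsJobHost", *hierarchy]
    if category:
        pieces.append(category)
    return "__".join(pieces)
```

For example, `logging.logLevel.Host.Aggregator` yields `AzureFunctionsJobHost__logging__logLevel__Host.Aggregator`, matching the corrected table row.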

articles/azure-functions/functions-develop-vs-code.md

Lines changed: 1 addition & 1 deletion
@@ -113,7 +113,7 @@ The Functions extension lets you create a function app project, along with your

 :::image type="content" source="./media/functions-develop-vs-code/create-function-auth.png" alt-text="Screenshot for creating function authorization.":::

-1. From the dropdown list, select **Add to workplace**.
+1. From the dropdown list, select **Add to workspace**.

 :::image type="content" source="./media/functions-develop-vs-code/add-to-workplace.png" alt-text=" Screenshot for selectIng Add to workplace.":::

articles/cosmos-db/introduction.md

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@ You can [Try Azure Cosmos DB for Free](https://azure.microsoft.com/try/cosmosdb/
 Gain unparalleled [SLA-backed](https://azure.microsoft.com/support/legal/sla/cosmos-db) speed and throughput, fast global access, and instant elasticity.

 - Real-time access with fast read and write latencies globally, and throughput and consistency all backed by [SLAs](https://azure.microsoft.com/support/legal/sla/cosmos-db)
-- Multi-region writes and data distribution to any Azure region with the just a button.
+- Multi-region writes and data distribution to any Azure region with just a button.
 - Independently and elastically scale storage and throughput across any Azure region – even during unpredictable traffic bursts – for unlimited scale worldwide.

 ### Simplified application development

articles/defender-for-cloud/attack-path-reference.md

Lines changed: 3 additions & 3 deletions
@@ -32,7 +32,7 @@ Prerequisite: For a list of prerequisites see the [Availability](how-to-manage-a
 | VM has high severity vulnerabilities and read permission to a Key Vault | Virtual machine '\[MachineName]' has high severity vulnerabilities \[RCE] and \[IdentityDescription] with read permission to Key Vault '\[KVName]' |
 | VM has high severity vulnerabilities and read permission to a data store | Virtual machine '\[MachineName]' has high severity vulnerabilities \[RCE] and \[IdentityDescription] with read permission to \[DatabaseType] '\[DatabaseName]' |

-### AWS VMs
+### AWS Instances

 Prerequisite: [Enable agentless scanning](enable-vulnerability-assessment-agentless.md).

@@ -43,7 +43,7 @@ Prerequisite: [Enable agentless scanning](enable-vulnerability-assessment-agentl
 | Internet exposed EC2 instance has high severity vulnerabilities and read permission to S3 bucket | Option 1 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[Rolepermission]' permission via IAM policy to S3 bucket '\[BucketName]' <br> <br> Option 2 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[S3permission]' permission via bucket policy to S3 bucket '\[BucketName]' <br> <br> Option 3 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[Rolepermission]' permission via IAM policy and '\[S3permission]' permission via bucket policy to S3 bucket '\[BucketName]'|
 | Internet exposed EC2 instance has high severity vulnerabilities and read permission to a S3 bucket with sensitive data | Option 1 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[Rolepermission]' permission via IAM policy to S3 bucket '\[BucketName]' containing sensitive data <br> <br> Option 2 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[S3permission]' permission via bucket policy to S3 bucket '\[BucketName]' containing sensitive data <br> <br> Option 3 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[Rolepermission]' permission via IAM policy and '\[S3permission] permission via bucket policy to S3 bucket '\[BucketName]' containing sensitive data <br><br> . For more details, you can learn how to [prioritize security actions by data sensitivity](./information-protection.md). |
 | Internet exposed EC2 instance has high severity vulnerabilities and read permission to a KMS | Option 1 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has high severity vulnerabilities\[RCE] and has IAM role attached with '\[Rolepermission]' permission via IAM policy to AWS Key Management Service (KMS) '\[KeyName]' <br> <br> Option 2 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has vulnerabilities allowing remote code execution and has IAM role attached with '\[Keypermission]' permission via AWS Key Management Service (KMS) policy to key '\[KeyName]' <br> <br> Option 3 <br> AWS EC2 instance '\[MachineName]' is reachable from the internet, has vulnerabilities allowing remote code execution and has IAM role attached with '\[Rolepermission]' permission via IAM policy and '\[Keypermission] permission via AWS Key Management Service (KMS) policy to key '\[KeyName]' |
-| Internet exposed EC2 instance has high severity vulnerabilities | AWS EC2 instance '\[EC2Name]' is reachable from the internet and has high severity vulnerabilities\[RCE] |
+| Internet exposed EC2 instance has high severity vulnerabilities | AWS EC2 instance '\[EC2Name]' is reachable from the internet and has high severity vulnerabilities\[RCE] |
+| EC2 instance with high severity vulnerabilities has high privileged permissions to an account | EC2 instance '\[EC2Name]' has high severity vulnerabilities\[RCE] and has '\[Permissions]' permissions to account '\[AccountName]' |
+| EC2 instance with high severity vulnerabilities has read permissions to a data store | Option 1 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has '\[Permissions]' permissions to database '\[DatabaseName]' <br> <br> Option 2 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[RolePermissions]' permissions through IAM policy to S3 bucket '\[BucketName]' <br><br> Option 3 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[Permissions]' permissions through bucket policy to S3 bucket '\[BucketName]' <br><br> Option 4 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[RolePermissions]' permissions through IAM policy and '\[Permissions]' permissions through bucket policy to S3 bucket '\[BucketName]' |
+| EC2 instance with high severity vulnerabilities has read permissions to a data store with sensitive data | Option 1 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[RolePermissions]' permissions through IAM policy to S3 bucket '\[BucketName]' containing sensitive data <br><br> Option 2 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[Permissions]' permissions through bucket policy to S3 bucket '\[BucketName]' containing sensitive data <br><br> Option 3 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[RolePermissions]' permissions through IAM policy and '\[Permissions]' permissions through bucket policy to S3 bucket '\[BucketName]' containing sensitive data |
+| EC2 instance with high severity vulnerabilities has read permissions to a KMS key | Option 1 <br> EC2 instance '\[MachineName]' has high severity vulnerabilities\[RCE] and has IAM role attached which is granted with '\[RolePermissions]' permissions through IAM policy to AWS Key Management Service (KMS) key '\[KeyName]' <br><br> Option 2 <br> EC2 instance '\[MachineName]' has vulnerabilities allowing remote code execution and has IAM role attached which is granted with '\[KeyPermissions]' permissions through AWS Key Management Service (KMS) policy to key '\[KeyName]' <br><br> Option 3 <br> EC2 instance '\[MachineName]' has vulnerabilities allowing remote code execution and has IAM role attached which is granted with '\[RolePermissions]' permissions through IAM policy and '\[KeyPermissions]' permissions through AWS Key Management Service (KMS) policy to key '\[KeyName]' |

 ### Azure data

@@ -118,4 +118,4 @@ This section lists all of the cloud security graph components (connections & in
 For related information, see the following:
 - [What are the cloud security graph, attack path analysis, and the cloud security explorer?](concept-attack-path.md)
 - [Identify and remediate attack paths](how-to-manage-attack-path.md)
-- [Cloud security explorer](how-to-manage-cloud-security-explorer.md)
+- [Cloud security explorer](how-to-manage-cloud-security-explorer.md)

articles/machine-learning/migrate-to-v2-execution-pipeline.md

Lines changed: 2 additions & 2 deletions
@@ -155,7 +155,7 @@ This article gives a comparison of scenario(s) in SDK v1 and SDK v2. In the foll
 cluster_name = "cpu-cluster"
 print(ml_client.compute.get(cluster_name))

-# Import components that are defined with python function
+# Import components that are defined with Python function
 with open("src/components.py") as fin:
     print(fin.read())

@@ -173,7 +173,7 @@ This article gives a comparison of scenario(s) in SDK v1 and SDK v2. In the foll
 # define a pipeline with component
 @pipeline(default_compute=cluster_name)
 def pipeline_with_python_function_components(input_data, test_data, learning_rate):
-    """E2E dummy train-score-eval pipeline with components defined via python function components"""
+    """E2E dummy train-score-eval pipeline with components defined via Python function components"""

     # Call component obj as function: apply given inputs & parameters to create a node in pipeline
     train_with_sample_data = train_model(

articles/private-link/tutorial-private-endpoint-storage-portal.md

Lines changed: 1 addition & 1 deletion
@@ -36,7 +36,7 @@ Create a virtual network, subnet, and bastion host. The virtual network and subn

 The bastion host will be used to connect securely to the virtual machine for testing the private endpoint.

-1. In the search box at the top of the portal, enter **Virtual network**. Select **Virtual networks** in the search results.
+1. In the search box at the top of the portal, enter **Virtual network**. Select **Virtual network** in the search results.

 2. Select **+ Create**.
