
Commit 1c406ef (merge of 2 parents: 62c4608 + 63386b0)

8 files changed: +21 -17 lines changed

articles/communication-services/concepts/privacy.md

Lines changed: 0 additions & 3 deletions
@@ -38,9 +38,6 @@ The list of geographies you can choose from includes:
 - United Kingdom
 - United States
 
-> [!NOTE]
-> For PSTN & SMS, call and message data records required for the operation and billing of the service, may be stored in the United States.
-
 ## Data collection
 
 Azure Communication Services only collects diagnostic data required to deliver the service.

articles/communication-services/concepts/telephony/direct-routing-infrastructure.md

Lines changed: 2 additions & 2 deletions
@@ -106,8 +106,8 @@ The SBC makes a DNS query to resolve sip.pstnhub.microsoft.com. Based on the SBC
 
 ## Media traffic: IP and Port ranges
 
-The media traffic flows to and from a separate service in the Microsoft Cloud called Media Processor. The IP address range for media traffic:
-- `20.202.0.0/16 (IP addresses from 20.202.0.1 to 20.202.255.254)`
+The media traffic flows to and from a separate service called Media Processor. At the moment of publishing, Media Processor for Communication Services can use any Azure IP address.
+Download [the full list of addresses](https://www.microsoft.com/download/details.aspx?id=56519).
 
 ### Port ranges
 The port ranges of the Media Processors are shown in the following table:
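For reference, the download link added above points to the "Azure IP Ranges and Service Tags – Public Cloud" JSON file. Below is a minimal sketch of extracting the address prefixes for one service tag with `jq`, assuming the standard Service Tags schema (a top-level `values` array whose entries carry `properties.addressPrefixes`); the file name is a placeholder for the dated file you actually download:

```bash
# List all address prefixes published for the AzureCloud service tag.
# ServiceTags_Public.json is a placeholder for the dated download.
jq -r '.values[] | select(.name == "AzureCloud") | .properties.addressPrefixes[]' ServiceTags_Public.json
```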

articles/confidential-ledger/overview.md

Lines changed: 1 addition & 1 deletion
@@ -64,4 +64,4 @@ The Functional APIs allow direct interaction with your instantiated confidential
 - [Microsoft Azure confidential ledger architecture](architecture.md)
 - [Quickstart: Azure portal](quickstart-portal.md)
 - [Quickstart: Python](quickstart-python.md)
-- [Quickstart: Azure Resource Manager (ARM) template](quickstart-portal.md)
+- [Quickstart: Azure Resource Manager (ARM) template](quickstart-template.md)

articles/defender-for-cloud/security-policy-concept.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ There are different types of policies in Azure Policy. Defender for Cloud mainly
 
 ## What is a security initiative?
 
-An Azure Policy initiative is a collection of Azure Policy definitions, or rules, that are grouped together towards a specific goal or purpose. Azure initiatives simplify management of your policies by grouping a set of policies together, logically, as a single item.
+A security initiative is a collection of Azure Policy definitions, or rules, that are grouped together towards a specific goal or purpose. Security initiatives simplify management of your policies by grouping a set of policies together, logically, as a single item.
 
 A security initiative defines the desired configuration of your workloads and helps ensure you're complying with the security requirements of your company or regulators.
 

articles/sentinel/sap/deploy-data-connector-agent-container.md

Lines changed: 12 additions & 5 deletions
@@ -121,23 +121,30 @@ If you're not using SNC, then your SAP configuration and authentication secrets
 
 1. **Create and configure a data disk** to be mounted at the Docker root directory.
 
-1. Run the following command to **download and run the deployment Kickstart script**:
-
+1. **Download and run the deployment Kickstart script**:
+For public cloud, the command is:
 ```bash
 wget -O sapcon-sentinel-kickstart.sh https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/sapcon-sentinel-kickstart.sh && bash ./sapcon-sentinel-kickstart.sh
 ```
-
+For Azure China 21Vianet, the command is:
+```bash
+wget -O sapcon-sentinel-kickstart.sh https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/sapcon-sentinel-kickstart.sh && bash ./sapcon-sentinel-kickstart.sh --cloud mooncake
+```
+For Azure Government - US, the command is:
+```bash
+wget -O sapcon-sentinel-kickstart.sh https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/sapcon-sentinel-kickstart.sh && bash ./sapcon-sentinel-kickstart.sh --cloud fairfax
+```
 The script updates the OS components, installs the Azure CLI and Docker software and other required utilities (jq, netcat, curl), and prompts you for configuration parameter values. You can supply additional parameters to the script to minimize the amount of prompts or to customize the container deployment. For more information on available command line options, see [Kickstart script reference](reference-kickstart.md).
 
-1. **Follow the on-screen instructions** to enter your SAP and key vault details and complete the deployment. When the deployment is complete, a confirmation message is displayed:
+2. **Follow the on-screen instructions** to enter your SAP and key vault details and complete the deployment. When the deployment is complete, a confirmation message is displayed:
 
 ```bash
 The process has been successfully completed, thank you!
 ```
 
 Note the Docker container name in the script output. You'll use it in the next step.
 
-1. Run the following command to **configure the Docker container to start automatically**.
+3. Run the following command to **configure the Docker container to start automatically**.
 
 ```bash
 docker update --restart unless-stopped <container-name>
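A quick way to confirm the restart policy took effect, using standard Docker tooling (`<container-name>` is the name noted from the script output):

```bash
# Should print "unless-stopped" once the update has been applied.
docker inspect --format '{{.HostConfig.RestartPolicy.Name}}' <container-name>
```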

articles/sentinel/sap/deployment-solution-configuration.md

Lines changed: 1 addition & 1 deletion
@@ -72,7 +72,6 @@ Now Microsoft Sentinel will be able to differentiate a logon from 192.168.10.15
 - SAP - Sensitive Tables
 - SAP - Sensitive ABAP Programs
 - SAP - Sensitive Transactions
-- SAP - Critical Authorizations
 
 All of these watchlists identify sensitive actions or data that can be carried out or accessed by users. Several well-known operations, tables and authorizations have been pre-configured in the watchlists, however we recommend you consult with the SAP BASIS team to identify which operations, transactions, authorizations and tables are considered to be sensitive in your SAP environment.
 
@@ -81,6 +80,7 @@ All of these watchlists identify sensitive actions or data that can be carried o
 - SAP - Sensitive Profiles
 - SAP - Sensitive Roles
 - SAP - Privileged Users
+- SAP - Critical Authorizations
 
 The Microsoft Sentinel Solution for SAP uses User Master data gathered from SAP systems to identify which users, profiles, and roles should be considered sensitive. Some sample data is included in the watchlists, though we recommend you consult with the SAP BASIS team to identify sensitive users, roles and profiles and populate the watchlists accordingly.
 

articles/sentinel/sap/preparing-sap.md

Lines changed: 3 additions & 3 deletions
@@ -64,19 +64,19 @@ To deploy the CRs, follow the steps outlined below:
 
 1. Transfer the CR files to the SAP system.
 Alternatively, you can download the files directly onto the SAP system from the SSH prompt. Use the following commands:
-- Download NLPK900202
+- Download NPLK900202
 ```bash
 wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/CR/K900202.NPL
 wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/CR/R900202.NPL
 ```
 
-- Download NLPK900201
+- Download NPLK900201
 ```bash
 wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/CR/K900201.NPL
 wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/CR/R900201.NPL
 ```
 
-- Download NLPK900271
+- Download NPLK900271
 ```bash
 wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/CR/K900271.NPL
 wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/Solutions/SAP/CR/R900271.NPL
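As background, each CR ships as a pair of transport files: a cofile (the `K*` file) and a data file (the `R*` file). Below is a minimal sketch of staging the downloaded files into the transport directory, assuming the default `/usr/sap/trans` layout on the target system; confirm the path with your SAP BASIS team before copying:

```bash
# Cofiles (K*) and data files (R*) belong in separate transport subdirectories.
cp K900202.NPL K900201.NPL K900271.NPL /usr/sap/trans/cofiles/
cp R900202.NPL R900201.NPL R900271.NPL /usr/sap/trans/data/
```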

articles/synapse-analytics/sql/resources-self-help-sql-on-demand.md

Lines changed: 1 addition & 1 deletion
@@ -854,7 +854,7 @@ There are some limitations and known issues that you might see in Delta Lake sup
 - Make sure that you're referencing the root Delta Lake folder in the [OPENROWSET](./develop-openrowset.md) function or external table location.
 - The root folder must have a subfolder named `_delta_log`. The query fails if there's no `_delta_log` folder. If you don't see that folder, you're referencing plain Parquet files that must be [converted to Delta Lake](../spark/apache-spark-delta-lake-overview.md?pivots=programming-language-python#convert-parquet-to-delta) by using Apache Spark pools.
 - Don't specify wildcards to describe the partition schema. The Delta Lake query automatically identifies the Delta Lake partitions.
-- Delta Lake tables created in the Apache Spark pools are automatically available in serverless SQL pool, but the schema is not updated. If you add the columns in hte Delta table using Spark pool, the changes will not be shown in serverless database.
+- Delta Lake tables that are created in the Apache Spark pools are automatically available in serverless SQL pool, but the schema is not updated (public preview limitation). If you add columns in the Delta table using a Spark pool, the changes will not be shown in serverless SQL pool database.
 - External tables don't support partitioning. Use [partitioned views](create-use-views.md#delta-lake-partitioned-views) on the Delta Lake folder to use the partition elimination. See known issues and workarounds later in the article.
 - Serverless SQL pools don't support time travel queries. Use Apache Spark pools in Synapse Analytics to [read historical data](../spark/apache-spark-delta-lake-overview.md?pivots=programming-language-python#read-older-versions-of-data-using-time-travel).
 - Serverless SQL pools don't support updating Delta Lake files. You can use serverless SQL pool to query the latest version of Delta Lake. Use Apache Spark pools in Synapse Analytics to [update Delta Lake](../spark/apache-spark-delta-lake-overview.md?pivots=programming-language-python#update-table-data).
