
Commit fd8a169

Merge branch 'capitals' of https://github.com/ElazarK/azure-docs-pr into capitals

2 parents: 0abf953 + a13fa3f

4 files changed: +6 -4 lines


articles/azure-app-configuration/TOC.yml

Lines changed: 2 additions & 0 deletions
```diff
@@ -193,6 +193,8 @@
     href: https://go.microsoft.com/fwlink/?linkid=2074663
   - name: Java Spring provider
     href: https://go.microsoft.com/fwlink/?linkid=2180917
+  - name: Python provider
+    href: https://pypi.org/project/azure-appconfiguration-provider/
   - name: Azure SDK for .NET
     href: https://go.microsoft.com/fwlink/?linkid=2092056
   - name: Azure SDK for Java
```

articles/defender-for-cloud/concept-cloud-security-posture-management.md

Lines changed: 1 addition & 1 deletion
```diff
@@ -21,7 +21,7 @@ Defender for Cloud continually assesses your resources, subscriptions, and organ
 
 ## Defender CSPM plan options
 
-The Defender CSPM plan comes with two options, foundational CSPM capabilities and Defender Cloud Security Posture Management (CSPM). When you deploy Defender for Cloud to your subscription and resources, you'll automatically gain the basic coverages offered by the CSPM plan. To gain access to the other capabilities provided by Defender CSPM, you'll need to [enable the Defender Cloud Security Posture Management (CSPM) plan](enable-enhanced-security.md) to your subscription and resources.
+The Defender CSPM plan comes with two options, foundational CSPM capabilities and Defender Cloud Security Posture Management (CSPM). When you deploy Defender for Cloud to your subscription and resources, you'll automatically gain the basic coverage offered by the CSPM plan. To gain access to the other capabilities provided by Defender CSPM, you'll need to [enable the Defender Cloud Security Posture Management (CSPM) plan](enable-enhanced-security.md) on your subscription and resources.
 
 The following table summarizes what's included in each plan and their cloud availability.
 
```

articles/defender-for-cloud/iac-vulnerabilities.md

Lines changed: 1 addition & 1 deletion
````diff
@@ -30,7 +30,7 @@ Once you have set up the Microsoft Security DevOps GitHub action or Azure DevOps
 
 ```yml
 with:
-    categories: 'Iac"
+    categories: 'IaC'
 ```
 
 > [!NOTE]
````
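For orientation, this `with:` block belongs to the Microsoft Security DevOps step named in the hunk header. A minimal GitHub Actions sketch, where the step name, `id`, and version ref are illustrative assumptions and only the `categories` input reflects the fix above:

```yml
# Hypothetical workflow excerpt; only the `categories` value comes from this commit's fix.
- name: Run Microsoft Security DevOps
  uses: microsoft/security-devops-action@latest   # assumed ref; keep whatever ref your workflow already pins
  id: msdo
  with:
    categories: 'IaC'   # note the casing and the matching single quotes
```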

articles/hdinsight/spark/apache-spark-manage-dependencies.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -6,7 +6,7 @@ ms.author: apsinhar
 ms.service: hdinsight
 ms.custom: hdinsightactive, ignite-2022
 ms.topic: how-to
-ms.date: 07/22/2022
+ms.date: 10/18/2022
 #Customer intent: As a developer for Apache Spark and Apache Spark in Azure HDInsight, I want to learn how to manage my Spark application dependencies and install packages on my HDInsight cluster.
 ---
 
```

```diff
@@ -39,7 +39,7 @@ You'll use the `%%configure` magic to configure the notebook to use an external
 
 After locating the package from Maven Repository, gather the values for **GroupId**, **ArtifactId**, and **Version**. Concatenate the three values, separated by a colon (**:**).
 
-:::image type="content" source="./media/apache-spark-manage-dependencies/spark-package-schema.png " alt-text="Concatenate package schema" border="true":::kage schema" border="true":::
+:::image type="content" source="./media/apache-spark-manage-dependencies/spark-package-schema.png " alt-text="Concatenate package schema" border="true":::
 
 Make sure the values you gather match your cluster. In this case, we're using Spark Azure Cosmos DB connector package for Scala 2.11 and Spark 2.3 for HDInsight 3.6 Spark cluster. If you are not sure, run `scala.util.Properties.versionString` in code cell on Spark kernel to get cluster Scala version. Run `sc.version` to get cluster Spark version.
 
```
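As a worked example of the GroupId:ArtifactId:Version concatenation described in this hunk, a `%%configure` cell would look roughly like the following. The Cosmos DB connector coordinate is illustrative only (it matches the Scala 2.11 / Spark 2.3 combination the article mentions, but the exact artifact and version are assumptions), so substitute the values you gathered from Maven Repository:

```
%%configure -f
{ "conf": { "spark.jars.packages": "com.microsoft.azure:azure-cosmosdb-spark_2.3.0_2.11:1.3.3" } }
```

The `-f` flag restarts the Livy session so the newly declared package is picked up when the session is re-created.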