Commit 6c09358

Merge pull request #98752 from tamram/tamram-1211
account creation and keys - rename, redirect, update links
2 parents: 1ab94e8 + 6c60c5c

45 files changed: +151 −182 lines


.openpublishing.redirection.json

Lines changed: 10 additions & 0 deletions
@@ -23069,6 +23069,16 @@
     "redirect_url": "/azure/storage/common/storage-quickstart-create-account",
     "redirect_document_id": true
   },
+  {
+    "source_path": "articles/storage/common/storage-quickstart-create-account.md",
+    "redirect_url": "/azure/storage/common/storage-account-create",
+    "redirect_document_id": true
+  },
+  {
+    "source_path": "articles/storage/common/storage-account-manage.md",
+    "redirect_url": "/azure/storage/common/storage-account-keys-manage",
+    "redirect_document_id": true
+  },
   {
     "source_path": "articles/storage/common/storage-account-options.md",
     "redirect_url": "/azure/storage/common/storage-account-overview",
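The added entries follow the redirection file's existing shape: each maps a retired `source_path` to its new `redirect_url`. As a quick sanity check, a small lookup sketch (a hypothetical helper, not part of this commit) shows how a renamed article's old path resolves:

```python
# Sketch: resolve an article's old source path to its new URL using
# redirect entries shaped like the ones added in this commit.
REDIRECTS = [
    {
        "source_path": "articles/storage/common/storage-quickstart-create-account.md",
        "redirect_url": "/azure/storage/common/storage-account-create",
        "redirect_document_id": True,
    },
    {
        "source_path": "articles/storage/common/storage-account-manage.md",
        "redirect_url": "/azure/storage/common/storage-account-keys-manage",
        "redirect_document_id": True,
    },
]

def resolve(source_path):
    """Return the redirect target for a renamed article, or None if unmapped."""
    for entry in REDIRECTS:
        if entry["source_path"] == source_path:
            return entry["redirect_url"]
    return None

print(resolve("articles/storage/common/storage-account-manage.md"))
# → /azure/storage/common/storage-account-keys-manage
```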

articles/active-directory-domain-services/security-audit-events.md

Lines changed: 1 addition & 1 deletion
@@ -66,7 +66,7 @@ The following table outlines scenarios for each destination resource type.
 
 | Target Resource | Scenario |
 |:---|:---|
-|Azure Storage| This target should be used when your primary need is to store security audit events for archival purposes. Other targets can be used for archival purposes, however those targets provide capabilities beyond the primary need of archiving. Before you enable Azure AD DS security audit events, [Create an Azure storage account](../storage/common/storage-quickstart-create-account.md?tabs=azure-portal#create-a-storage-account-1).|
+|Azure Storage| This target should be used when your primary need is to store security audit events for archival purposes. Other targets can be used for archival purposes, however those targets provide capabilities beyond the primary need of archiving. Before you enable Azure AD DS security audit events, first [Create an Azure Storage account](../storage/common/storage-account-create.md).|
 |Azure Event Hubs| This target should be used when your primary need is to share security audit events with additional software such as data analysis software or security information & event management (SIEM) software. Before you enable Azure AD DS security audit events, [Create an event hub using Azure portal](https://docs.microsoft.com/azure/event-hubs/event-hubs-create)|
 |Azure Log Analytics Workspace| This target should be used when your primary need is to analyze and review secure audits from the Azure portal directly. Before you enable Azure AD DS security audit events, [Create a Log Analytics workspace in the Azure portal.](https://docs.microsoft.com/azure/azure-monitor/learn/quick-create-workspace)|
 

articles/azure-databricks/quickstart-create-databricks-workspace-resource-manager-template.md

Lines changed: 1 addition & 1 deletion
@@ -118,7 +118,7 @@ Perform the following tasks to create a notebook in Databricks, configure the no
 
 spark.conf.set("fs.azure.account.key.{YOUR STORAGE ACCOUNT NAME}.blob.core.windows.net", "{YOUR STORAGE ACCOUNT ACCESS KEY}")
 
-For instructions on how to retrieve the storage account key, see [Manage your storage access keys](../storage/common/storage-account-manage.md#access-keys).
+For information about how to retrieve the storage account access keys, see [Manage storage account access keys](../storage/common/storage-account-keys-manage.md).
 
 > [!NOTE]
 > You can also use Azure Data Lake Store with a Spark cluster on Azure Databricks. For instructions, see [Use Data Lake Store with Azure Databricks](/azure/databricks/data/data-sources/azure/azure-datalake-gen2).
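The `spark.conf.set` call in the hunk above always uses the same fixed key pattern for Blob storage accounts. A small helper (hypothetical, for illustration only) makes that pattern explicit:

```python
def blob_account_key_conf(account_name):
    """Build the Spark conf key under which Databricks expects a
    Blob storage account's access key."""
    return f"fs.azure.account.key.{account_name}.blob.core.windows.net"

# In a Databricks notebook you would then set the key (requires a live
# Spark session, so it is shown here as a comment):
# spark.conf.set(blob_account_key_conf("mystorageacct"), access_key)
print(blob_account_key_conf("mystorageacct"))
# → fs.azure.account.key.mystorageacct.blob.core.windows.net
```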

articles/azure-government/documentation-government-services-storage.md

Lines changed: 4 additions & 3 deletions
@@ -21,7 +21,7 @@ ms.author: zakramer
 # Azure Government storage
 
 ## Azure Storage
-Azure Storage is generally available in Azure Government. For a Quickstart that will help you get started with Storage in Azure Government, [click here](documentation-government-get-started-connect-to-storage.md). For general details on Azure Storage, see [Azure Storage public documentation](../storage/index.yml).
+Azure Storage is generally available in Azure Government. For a Quickstart that will help you get started with Storage in Azure Government, see [Develop with Storage API on Azure Government](documentation-government-get-started-connect-to-storage.md). For general details on Azure Storage, see [Azure Storage public documentation](../storage/index.yml).
 
 ### Storage pairing in Azure Government
 The following map shows the primary and secondary region pairings used for geo-redundant storage and read-access geo-redundant storage accounts in Azure Government.
@@ -99,14 +99,15 @@ With Import/Export jobs for USGov Arizona or USGov Texas, the mailing address is
 
 For DoD L5 data, use a DoD region storage account to ensure that data is loaded directly into the DoD regions.
 
-For all jobs, we recommend that you rotate your storage account keys after the job is complete to remove any access granted during the process. For more information, see [Managing storage accounts](../storage/common/storage-account-manage.md#access-keys).
+For all jobs, we recommend that you rotate your storage account keys after the job is complete to remove any access granted during the process. For more information, see [Manage storage account access keys](../storage/common/storage-account-keys-manage.md).
 
 | Regulated/controlled data permitted | Regulated/controlled data not permitted |
 | --- | --- |
 | Data copied to the media for transport and the keys used to encrypt that data. | Azure Import/Export metadata cannot contain controlled data. This metadata includes all configuration data that's entered when you're creating your Import/Export job and shipping information that's used to transport your media. Do not enter regulated/controlled data in the following fields: **Job name**, **Carrier name**, **Tracking number**, **Description**, **Return information (Name, Address, Phone, E-Mail)**, **Export Blob URI**, **Drive list**, **Package list**, **Storage account name**, **Container name**. |
 
 ## Azure Backup Service
-For detailed documentation on using the Azure Backup Service in Azure Government, [click here](documentation-government-services-backup.md).
+For detailed documentation on using the Azure Backup Service in Azure Government, see [Azure Government Backup](documentation-government-services-backup.md).
+
 ## Next steps
 For supplemental information and updates, subscribe to the
 <a href="https://blogs.msdn.microsoft.com/azuregov/">Microsoft Azure Government blog. </a>

articles/backup/backup-azure-configure-reports.md

Lines changed: 1 addition & 1 deletion
@@ -82,7 +82,7 @@ If you wish to customize and share the report, create a workspace and do the fol
 4. Enter the name of the storage account that was configured in the previous step 5, and select **Next**.
 
 ![Enter storage account name](./media/backup-azure-configure-reports/content-pack-storage-account-name.png)
-5. Using Authentication method "Key", enter the storage account key for this storage account. To [view and copy storage access keys](../storage/common/storage-account-manage.md#access-keys), go to your storage account in the Azure portal.
+5. Using Authentication method "Key", enter the storage account key for this storage account. You can find your storage account access keys in the Azure portal. For more information, see [Manage storage account access keys](../storage/common/storage-account-keys-manage.md).
 
 ![Enter storage account](./media/backup-azure-configure-reports/content-pack-storage-account-key.png) <br/>
 
articles/batch/virtual-file-mount.md

Lines changed: 1 addition & 1 deletion
@@ -81,7 +81,7 @@ new PoolAddParameter
 
 ### Azure Blob file system
 
-Another option is to use Azure Blob storage via [blobfuse](../storage/blobs/storage-how-to-mount-container-linux.md). Mounting a blob file system requires an `AccountKey` or `SasKey` for your storage account. For information on getting these keys, see [View account keys](../storage/common/storage-account-manage.md#view-account-keys-and-connection-string), or [Using shared access signatures (SAS)](../storage/common/storage-dotnet-shared-access-signature-part-1.md). For more information on using blobfuse, see the blobfuse [Troubleshoot FAQ](https://github.com/Azure/azure-storage-fuse/wiki/3.-Troubleshoot-FAQ). To get default access to the blobfuse mounted directory, run the task as an **Administrator**. Blobfuse mounts the directory at the user space, and at pool creation it is mounted as root. In Linux all **Administrator** tasks are root. All options for the FUSE module is described in the [FUSE reference page](http://manpages.ubuntu.com/manpages/xenial/man8/mount.fuse.8.html).
+Another option is to use Azure Blob storage via [blobfuse](../storage/blobs/storage-how-to-mount-container-linux.md). Mounting a blob file system requires an `AccountKey` or `SasKey` for your storage account. For information on getting these keys, see [Manage storage account access keys](../storage/common/storage-account-keys-manage.md), or [Using shared access signatures (SAS)](../storage/common/storage-dotnet-shared-access-signature-part-1.md). For more information on using blobfuse, see the blobfuse [Troubleshoot FAQ](https://github.com/Azure/azure-storage-fuse/wiki/3.-Troubleshoot-FAQ). To get default access to the blobfuse mounted directory, run the task as an **Administrator**. Blobfuse mounts the directory at the user space, and at pool creation it is mounted as root. In Linux all **Administrator** tasks are root. All options for the FUSE module is described in the [FUSE reference page](http://manpages.ubuntu.com/manpages/xenial/man8/mount.fuse.8.html).
 
 In addition to the troubleshooting guide, GitHub issues in the blobfuse repository are a helpful way to check on current blobfuse issues and resolutions. For more information, see [blobfuse issues](https://github.com/Azure/azure-storage-fuse/issues).
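blobfuse reads the storage account name and `AccountKey` (or SAS) from a connection configuration file passed at mount time. A minimal sketch of rendering such a config, assuming key-based auth and the `accountName`/`accountKey`/`containerName` keys blobfuse uses (file and account names here are hypothetical):

```python
def blobfuse_config(account, key, container):
    """Render a minimal blobfuse connection config for key-based auth.
    The key names assume blobfuse's documented config-file format."""
    return (
        f"accountName {account}\n"
        f"accountKey {key}\n"
        f"containerName {container}\n"
    )

print(blobfuse_config("mystorageacct", "<ACCOUNT KEY>", "data"))
# The rendered file is then passed to blobfuse on the mount command line,
# for example: blobfuse /mnt/blob --tmp-path=/mnt/blobfusetmp --config-file=connection.cfg
```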

articles/data-factory/v1/data-factory-build-your-first-pipeline-using-editor.md

Lines changed: 1 addition & 1 deletion
@@ -105,7 +105,7 @@ In this step, you link your storage account to your data factory. In this tutori
 
 ![Storage linked service](./media/data-factory-build-your-first-pipeline-using-editor/azure-storage-linked-service.png)
 
-1. Replace **account name** with the name of your storage account. Replace **account key** with the access key of the storage account. To learn how to get your storage access key, see how to view, copy, and regenerate storage access keys in [Manage your storage account](../../storage/common/storage-account-manage.md#access-keys).
+1. Replace **account name** with the name of your storage account. Replace **account key** with the access key of the storage account. To learn how to get your storage access key, see [Manage storage account access keys](../../storage/common/storage-account-keys-manage.md).
 
 1. Select **Deploy** on the command bar to deploy the linked service.
 
articles/data-factory/v1/data-factory-build-your-first-pipeline-using-powershell.md

Lines changed: 1 addition & 1 deletion
@@ -112,7 +112,7 @@ In this step, you link your Azure Storage account to your data factory. You use
     }
 }
 ```
-Replace **account name** with the name of your Azure storage account and **account key** with the access key of the Azure storage account. To learn how to get your storage access key, see the information about how to view, copy, and regenerate storage access keys in [Manage your storage account](../../storage/common/storage-account-manage.md#access-keys).
+Replace **account name** with the name of your Azure storage account and **account key** with the access key of the Azure storage account. To learn how to get your storage access key, see [Manage storage account access keys](../../storage/common/storage-account-keys-manage.md).
 2. In Azure PowerShell, switch to the ADFGetStarted folder.
 3. You can use the **New-AzDataFactoryLinkedService** cmdlet that creates a linked service. This cmdlet and other Data Factory cmdlets you use in this tutorial requires you to pass values for the *ResourceGroupName* and *DataFactoryName* parameters. Alternatively, you can use **Get-AzDataFactory** to get a **DataFactory** object and pass the object without typing *ResourceGroupName* and *DataFactoryName* each time you run a cmdlet. Run the following command to assign the output of the **Get-AzDataFactory** cmdlet to a **$df** variable.
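The JSON being edited in the hunk above is the Data Factory v1 Azure Storage linked service definition. In general it takes the shape sketched below; this is an illustration with `<accountname>`/`<accountkey>` placeholders, not the tutorial's exact file:

```python
import json

def storage_linked_service(account_name, account_key):
    """Build a Data Factory v1 Azure Storage linked service definition
    (sketch; placeholders stand in for real credentials)."""
    payload = {
        "name": "AzureStorageLinkedService",
        "properties": {
            "type": "AzureStorage",
            "typeProperties": {
                "connectionString": (
                    "DefaultEndpointsProtocol=https;"
                    f"AccountName={account_name};AccountKey={account_key}"
                )
            },
        },
    }
    return json.dumps(payload, indent=2)

print(storage_linked_service("<accountname>", "<accountkey>"))
```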

articles/data-factory/v1/data-factory-build-your-first-pipeline-using-rest-api.md

Lines changed: 1 addition & 1 deletion
@@ -79,7 +79,7 @@ Create following JSON files in the folder where curl.exe is located.
 
 ### azurestoragelinkedservice.json
 > [!IMPORTANT]
-> Replace **accountname** and **accountkey** with name and key of your Azure storage account. To learn how to get your storage access key, see the information about how to view, copy, and regenerate storage access keys in [Manage your storage account](../../storage/common/storage-account-manage.md#access-keys).
+> Replace **accountname** and **accountkey** with name and key of your Azure storage account. To learn how to get your storage access key, see [Manage storage account access keys](../../storage/common/storage-account-keys-manage.md).
 >
 >
 
articles/data-factory/v1/data-factory-build-your-first-pipeline-using-vs.md

Lines changed: 1 addition & 1 deletion
@@ -87,7 +87,7 @@ With on-demand HDInsight linked service, The HDInsight cluster is automatically
 1. Right-click **Linked Services** in the solution explorer, point to **Add**, and click **New Item**.
 2. In the **Add New Item** dialog box, select **Azure Storage Linked Service** from the list, and click **Add**.
 ![Azure Storage Linked Service](./media/data-factory-build-your-first-pipeline-using-vs/new-azure-storage-linked-service.png)
-3. Replace `<accountname>` and `<accountkey>` with the name of your Azure storage account and its key. To learn how to get your storage access key, see the information about how to view, copy, and regenerate storage access keys in [Manage your storage account](../../storage/common/storage-account-manage.md#access-keys).
+3. Replace `<accountname>` and `<accountkey>` with the name of your Azure storage account and its key. To learn how to get your storage access key, see [Manage storage account access keys](../../storage/common/storage-account-keys-manage.md).
 ![Azure Storage Linked Service](./media/data-factory-build-your-first-pipeline-using-vs/azure-storage-linked-service.png)
 4. Save the **AzureStorageLinkedService1.json** file.
