articles/api-management/api-management-configuration-repository-git.md (1 addition & 1 deletion)
@@ -218,7 +218,7 @@ The final setting, `$ref-policy`, maps to the global policy statements file for
 The `apis` folder contains a folder for each API in the service instance, which contains the following items.
 
 * `apis\<api name>\configuration.json` - this is the configuration for the API and contains information about the backend service URL and the operations. This is the same information that would be returned if you were to call [Get a specific API](https://docs.microsoft.com/rest/api/apimanagement/2019-01-01/apis/get) with `export=true` in `application/json` format.
-* `apis\<api name>\api.description.html` - this is the description of the API and corresponds to the `description` property of the [API entity](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.table._entity_property).
+* `apis\<api name>\api.description.html` - this is the description of the API and corresponds to the `description` property of the [API entity](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.table.entityproperty).
 * `apis\<api name>\operations\` - this folder contains `<operation name>.description.html` files that map to the operations in the API. Each file contains the description of a single operation in the API, which maps to the `description` property of the [operation entity](https://docs.microsoft.com/rest/api/visualstudio/operations/list#operationproperties) in the REST API.
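
For reference, a hedged Python sketch (not part of this diff) that walks the repository layout described above; the local clone path is hypothetical:

```python
from pathlib import Path

# Hypothetical path to a local clone of the API Management configuration repository.
repo = Path("apim-config-repo")

for api_dir in sorted((repo / "apis").iterdir()):
    config = api_dir / "configuration.json"          # backend service URL and operations
    description = api_dir / "api.description.html"   # maps to the API entity's `description`
    operations = sorted((api_dir / "operations").glob("*.description.html"))
    print(f"{api_dir.name}: {config.name}, {description.name}, "
          f"{len(operations)} operation descriptions")
```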

articles/azure-databricks/databricks-connect-to-data-sources.md (1 addition & 1 deletion)
@@ -42,7 +42,7 @@ The following list provides the data sources in Azure that you can use with Azur
 This link provides instructions on how to use the [Azure Event Hubs Spark connector](https://github.com/Azure/azure-event-hubs-spark) from Azure Databricks to access data in Azure Event Hubs.
 
-- [Azure SQL Data Warehouse](/azure/databricks/data/data-sources/azure/sql-data-warehouse)
+- [Azure SQL Data Warehouse](/azure/synapse-analytics/sql-data-warehouse/)
 
 This link provides instructions on how to use the Azure SQL Data Warehouse connector to connect from Azure Databricks.
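
For context, a minimal PySpark sketch of the Azure SQL Data Warehouse connector referenced in this hunk; the server, database, storage, and table names are placeholders:

```python
# Runs in a Databricks notebook, where `spark` is the provided SparkSession.
# All connection values below are placeholders, not values from this diff.
df = (spark.read
      .format("com.databricks.spark.sqldw")
      .option("url", "jdbc:sqlserver://myserver.database.windows.net;database=mydw")
      .option("tempDir", "wasbs://tempdata@mystorageaccount.blob.core.windows.net/tmp")
      .option("forwardSparkAzureStorageCredentials", "true")
      .option("dbTable", "dbo.MyTable")
      .load())

df.show(5)
```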

articles/azure-government/documentation-government-get-started-connect-to-storage.md (59 additions & 59 deletions)
@@ -18,7 +18,7 @@ ms.author: femila
 
 # Develop with Storage API on Azure Government
 
-Azure Government uses the same underlying technologies as commercial Azure, enabling you to use the development tools you’re already familiar with.
+Azure Government uses the same underlying technologies as commercial Azure, enabling you to use the development tools you're already familiar with.
 To use these services in Azure Government, you must define different endpoint mappings, as shown below for the Storage service.
 
 If you don't have an Azure Government subscription, create a [free account](https://azure.microsoft.com/global-infrastructure/government/request/) before you begin.
@@ -35,7 +35,7 @@ If you don't have an Azure Government subscription, create a [free account](http
 ### Getting Started with Storage Explorer
 1. Open the Azure Storage Explorer desktop application.
 
-2. You'll be prompted to add an Azure account; in the dropdown choose the “Azure US Government” option:
+2. You'll be prompted to add an Azure account; in the dropdown choose the "Azure US Government" option:
 3. Sign in to your Azure Government account and you can see all of your resources. The Storage Explorer should look similar to the screenshot below. Click on your Storage Account to see the blob containers, file shares, Queues, and Tables.
@@ -52,10 +52,10 @@ If you don't have an Azure Government subscription, create a [free account](http
 * Download Visual Studio 2019
 
 ### Getting Started with Storage API
-One important difference to note when connecting with the Storage API is that the URL for storage is different than the URL for storage in commercial Azure – specifically, the domain ends with “core.usgovcloudapi.net”, rather than “core.windows.net”.
+One important difference to note when connecting with the Storage API is that the URL for storage is different than the URL for storage in commercial Azure – specifically, the domain ends with "core.usgovcloudapi.net", rather than "core.windows.net".
 
 These endpoint differences must be taken into account when you connect to storage in Azure Government with C#.
-1. Go to the [Azure Government portal](https://portal.azure.us) and select your storage account and then click the “Access Keys” tab:
+1. Go to the [Azure Government portal](https://portal.azure.us) and select your storage account and then click the "Access Keys" tab:
 2. Copy/paste the storage account connection string.
@@ -64,13 +64,13 @@ These endpoint differences must be taken into account when you connect to storag
 1. Open up Visual Studio and create a new project. Add a reference to the [WindowsAzure.Storage NuGet package](https://www.nuget.org/packages/WindowsAzure.Storage/). This NuGet package contains classes we will need to connect to your storage account.
 
 2. Add these two lines of C# code to connect:
 
    ```cs
    var credentials = new StorageCredentials(storageAccountName, storageAccountKey);
    var account = new CloudStorageAccount(credentials, "core.usgovcloudapi.net", true);
    ```
 
-   Notice on the second line we had to use a [particular constructor for the CloudStorageAccount](https://docs.microsoft.com/java/api/com.microsoft.azure.storage._cloud_storage_account.cloudstorageaccount) – enabling us to explicitly pass in the endpoint suffix of “core.usgovcloudapi.net”. This constructor is the **only difference** your code requires to connect to storage in Azure Government as compared with commercial Azure.
+   Notice on the second line we had to use a [particular constructor for the CloudStorageAccount](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.cloudstorageaccount.cloudstorageaccount) – enabling us to explicitly pass in the endpoint suffix of "core.usgovcloudapi.net". This constructor is the **only difference** your code requires to connect to storage in Azure Government as compared with commercial Azure.
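
For comparison, a hedged Python sketch of the same connection (assumes the `azure-storage-blob` package; the account name and key are placeholders). As in the C# code, the only Azure Government-specific detail is the endpoint suffix:

```python
from azure.storage.blob import BlobServiceClient

account_name = "mystorageaccount"      # placeholder
account_key = "<storage-account-key>"  # placeholder

# Only the "core.usgovcloudapi.net" suffix differs from commercial Azure
# ("core.windows.net").
service = BlobServiceClient(
    account_url=f"https://{account_name}.blob.core.usgovcloudapi.net",
    credential=account_key,
)

for container in service.list_containers():
    print(container.name)
```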

articles/azure-government/documentation-government-services-storage.md (1 addition & 1 deletion)
@@ -68,7 +68,7 @@ These are the URLs for storage accounts in Azure Government:
 >
 >
 
-For more information on APIs, see the [Cloud Storage Account Constructor](https://docs.microsoft.com/java/api/com.microsoft.azure.storage._cloud_storage_account.cloudstorageaccount).
+For more information on APIs, see the [Cloud Storage Account Constructor](https://docs.microsoft.com/java/api/com.microsoft.azure.storage.cloudstorageaccount.cloudstorageaccount).
 
 The endpoint suffix to use in these overloads is *core.usgovcloudapi.net*.

articles/machine-learning/how-to-create-your-first-pipeline.md (4 additions & 4 deletions)
@@ -27,7 +27,7 @@ The ML pipelines you create are visible to the members of your Azure Machine Lea
 
 ML pipelines use remote compute targets for computation and the storage of the intermediate and final data associated with that pipeline. They can read and write data to and from supported [Azure Storage](https://docs.microsoft.com/azure/storage/) locations.
 
-If you don’t have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree).
+If you don't have an Azure subscription, create a free account before you begin. Try the [free or paid version of Azure Machine Learning](https://aka.ms/AMLFree).
 
 ## Prerequisites
@@ -82,7 +82,7 @@ def_blob_store.upload_files(
     overwrite=True)
 ```
 
-A pipeline consists of one or more steps. A step is a unit run on a compute target. Steps might consume data sources and produce “intermediate” data. A step can create data such as a model, a directory with model and dependent files, or temporary data. This data is then available for other steps later in the pipeline.
+A pipeline consists of one or more steps. A step is a unit run on a compute target. Steps might consume data sources and produce "intermediate" data. A step can create data such as a model, a directory with model and dependent files, or temporary data. This data is then available for other steps later in the pipeline.
 
 To learn more about connecting your pipeline to your data, see the articles [How to Access Data](how-to-access-data.md) and [How to Register Datasets](how-to-create-register-datasets.md).
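
To make the step concept in this hunk concrete, a minimal hedged sketch using the azureml SDK; the script name, source directory, and compute target are placeholders, and `ws` is an existing `Workspace`:

```python
from azureml.pipeline.core import Pipeline
from azureml.pipeline.steps import PythonScriptStep

# Hypothetical step: runs train.py from ./scripts on an existing compute target.
train_step = PythonScriptStep(
    name="train step",
    script_name="train.py",
    source_directory="./scripts",
    compute_target=compute_target,  # an existing ComputeTarget in the workspace
)

# A pipeline is just an ordered collection of such steps.
pipeline = Pipeline(workspace=ws, steps=[train_step])
```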
@@ -114,7 +114,7 @@ output_data1 = PipelineData(
 If you have tabular data stored in a file or set of files, a [TabularDataset](https://docs.microsoft.com/python/api/azureml-core/azureml.data.tabulardataset?view=azure-ml-py) is an efficient alternative to a `DataReference`. `TabularDataset` objects support versioning, diffs, and summary statistics. `TabularDataset`s are lazily evaluated (like Python generators) and it's efficient to subset them by splitting or filtering. The `FileDataset` class provides similar lazily-evaluated data representing one or more files.
 
-You create a `TabularDataset` using methods like [from_delimited_files](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-delimited-files-path--validate-true--include-path-false--infer-column-types-true--set-column-types-none--separator------header-true--partition-format-none-).
+You create a `TabularDataset` using methods like [from_delimited_files](https://docs.microsoft.com/python/api/azureml-core/azureml.data.dataset_factory.tabulardatasetfactory?view=azure-ml-py#from-delimited-files-path--validate-true--include-path-false--infer-column-types-true--set-column-types-none--separator------header-true--partition-format-none--support-multi-line-false-).
 
 ```python
 from azureml.data import TabularDataset
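
A hedged sketch of the `from_delimited_files` call this hunk re-links; the datastore and CSV path are placeholders:

```python
from azureml.core import Dataset

# Hypothetical: def_blob_store is the workspace's default datastore, and the
# CSV path is a placeholder.
dataset = Dataset.Tabular.from_delimited_files(
    path=[(def_blob_store, "data/train.csv")],
    infer_column_types=True,
)

# Lazy evaluation: only the sampled rows are materialized here.
df = dataset.take(5).to_pandas_dataframe()
```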
@@ -385,7 +385,7 @@ When you first run a pipeline, Azure Machine Learning:
 * Downloads the Docker image for each step to the compute target from the container registry.
 * Mounts the datastore if a `DataReference` object is specified in a step. If mount is not supported, the data is instead copied to the compute target.
 * Runs the step in the compute target specified in the step definition.
-* Creates artifacts, such as logs, stdout and stderr, metrics, and output specified by the step. These artifacts are then uploaded and kept in the user’s default datastore.
+* Creates artifacts, such as logs, stdout and stderr, metrics, and output specified by the step. These artifacts are then uploaded and kept in the user's default datastore.
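
For context on the first-run behavior listed above, a minimal sketch of submitting a pipeline run (the experiment name is a placeholder; `ws` and `pipeline` are assumed from the earlier sketches):

```python
from azureml.core import Experiment

# Hypothetical experiment name; submitting triggers the image download,
# datastore mount/copy, step execution, and artifact upload described above.
run = Experiment(ws, "my-pipeline-experiment").submit(pipeline)
run.wait_for_completion(show_output=True)  # streams step logs while the run executes
```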

articles/site-recovery/azure-to-azure-common-questions.md (1 addition & 1 deletion)
@@ -88,7 +88,7 @@ By using Site Recovery, you can replicate and recover VMs between any two region
 
 ### Does Site Recovery require internet connectivity?
 
-No, Site Recovery doesn't require internet connectivity. But it does require access to Site Recovery URLs and IP ranges, as mentioned in [networking in Azure VM disaster recovery](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-ip-address-ranges).
+No, Site Recovery doesn't require internet connectivity. But it does require access to Site Recovery URLs and IP ranges, as mentioned in [networking in Azure VM disaster recovery](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-urls).
 
 ### Can I replicate an application that has a separate resource group for separate tiers?

articles/site-recovery/azure-to-azure-troubleshoot-replication.md (1 addition & 1 deletion)
@@ -76,7 +76,7 @@ We recommend creating a network service endpoint in your virtual network for "St
 
 ### Network connectivity
 
-For Site Recovery replication to work, it needs the VM to provide outbound connectivity to specific URLs or IP ranges. You might have your VM behind a firewall or use network security group (NSG) rules to control outbound connectivity. If so, you might experience issues. To make sure all the URLs are connected, see [Outbound connectivity for Site Recovery URLs](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-ip-address-ranges).
+For Site Recovery replication to work, it needs the VM to provide outbound connectivity to specific URLs or IP ranges. You might have your VM behind a firewall or use network security group (NSG) rules to control outbound connectivity. If so, you might experience issues. To make sure all the URLs are connected, see [Outbound connectivity for Site Recovery URLs](https://docs.microsoft.com/azure/site-recovery/azure-to-azure-about-networking#outbound-connectivity-for-urls).
 
 ## Error ID 153006 - No app-consistent recovery point available for the VM in the past "X" minutes
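
As a quick, hypothetical way to spot-check the outbound connectivity this hunk describes (the host list is illustrative; the linked networking article is the authoritative source of required URLs):

```python
import socket

# Illustrative hosts only; replace with the full Site Recovery URL list
# from the networking article linked in the diff above.
endpoints = ["login.microsoftonline.com", "management.azure.com"]

for host in endpoints:
    try:
        socket.create_connection((host, 443), timeout=5).close()
        print(f"{host}: reachable on 443")
    except OSError as err:
        print(f"{host}: blocked ({err})")
```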