
Commit 4c70163

Merge branch 'master' of https://github.com/MicrosoftDocs/azure-docs-pr into heidist-traffic

2 parents 779c0c4 + eaa89c1

14 files changed, +163 -67 lines

articles/cosmos-db/sql-query-keywords.md

Lines changed: 8 additions & 7 deletions
@@ -4,24 +4,25 @@ description: Learn about SQL keywords for Azure Cosmos DB.
author: markjbrown
ms.service: cosmos-db
ms.topic: conceptual
- ms.date: 06/20/2019
+ ms.date: 03/17/2020
ms.author: mjbrown
---

# Keywords in Azure Cosmos DB

This article details keywords that may be used in Azure Cosmos DB SQL queries.
## BETWEEN

- As in ANSI SQL, you can use the BETWEEN keyword to express queries against ranges of string or numerical values. For example, the following query returns all items in which the first child's grade is 1-5, inclusive.
+ You can use the `BETWEEN` keyword to express queries against ranges of string or numerical values. For example, the following query returns all items in which the first child's grade is 1-5, inclusive.

```sql
SELECT *
FROM Families.children[0] c
WHERE c.grade BETWEEN 1 AND 5
```

- Unlike in ANSI SQL, you can also use the BETWEEN clause in the FROM clause, as in the following example.
+ You can also use the `BETWEEN` keyword in the `SELECT` clause, as in the following example.

```sql
SELECT (c.grade BETWEEN 0 AND 10)
```
@@ -31,11 +32,11 @@ Unlike in ANSI SQL, you can also use the BETWEEN clause in the FROM clause, as i
In SQL API, unlike ANSI SQL, you can express range queries against properties of mixed types. For example, `grade` might be a number like `5` in some items and a string like `grade4` in others. In these cases, as in JavaScript, the comparison between the two different types results in `Undefined`, so the item is skipped.

> [!TIP]
- > For faster query execution times, create an indexing policy that uses a range index type against any numeric properties or paths that the BETWEEN clause filters.
+ > For faster query execution times, create an indexing policy that uses a range index type against any numeric properties or paths that the `BETWEEN` clause filters.
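The skip-on-`Undefined` behavior described above can be sketched in Python. This is an illustrative model of the documented semantics, not Cosmos DB code; `between` and `UNDEFINED` are hypothetical names:

```python
# Hypothetical model of Cosmos DB's BETWEEN over mixed-type properties.
# Comparing a number with a string yields Undefined rather than False,
# and the filter keeps an item only when the predicate is exactly True.
UNDEFINED = object()

def between(value, low, high):
    # Mixed-type comparison (e.g. 5 vs "grade4") is Undefined in this model.
    if not isinstance(value, (int, float)) or isinstance(value, bool):
        return UNDEFINED
    return low <= value <= high

items = [
    {"grade": 5},         # number in range: kept
    {"grade": "grade4"},  # string: comparison is Undefined, item skipped
    {"grade": 9},         # number out of range: dropped
]

matches = [i for i in items if between(i["grade"], 1, 5) is True]
print(matches)  # [{'grade': 5}]
```

Note that the string item is not treated as `False`; it simply never satisfies the filter, which is why mixed-type items silently disappear from range-query results.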
## DISTINCT

- The DISTINCT keyword eliminates duplicates in the query's projection.
+ The `DISTINCT` keyword eliminates duplicates in the query's projection.

In this example, the query projects values for each last name:
@@ -97,7 +98,7 @@ The results are:
]
```

- Queries with an aggregate system function and a subquery with DISTINCT are not supported. For example, the following query is not supported:
+ Queries with an aggregate system function and a subquery with `DISTINCT` are not supported. For example, the following query is not supported:

```sql
SELECT COUNT(1) FROM (SELECT DISTINCT f.lastName FROM f)
```
@@ -127,7 +128,7 @@ If you include your partition key in the `IN` filter, your query will automatica

## TOP

- The TOP keyword returns the first `N` number of query results in an undefined order. As a best practice, use TOP with the ORDER BY clause to limit results to the first `N` number of ordered values. Combining these two clauses is the only way to predictably indicate which rows TOP affects.
+ The `TOP` keyword returns the first `N` number of query results in an undefined order. As a best practice, use `TOP` with the `ORDER BY` clause to limit results to the first `N` number of ordered values. Combining these two clauses is the only way to predictably indicate which rows `TOP` affects.

You can use TOP with a constant value, as in the following example, or with a variable value using parameterized queries.
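The determinism point behind this best practice can be illustrated with a small Python model. This is illustrative only; the list slice and sort stand in for the query engine, and the sample rows are hypothetical:

```python
# Illustrative model of TOP N with and without ORDER BY.
rows = [
    {"name": "Riley", "grade": 8},
    {"name": "Ava", "grade": 3},
    {"name": "Noah", "grade": 5},
]

# TOP 2 alone: just the first two rows in whatever order the
# engine happens to return them, so the result is not guaranteed.
top_unordered = rows[:2]

# TOP 2 with ORDER BY grade: deterministic regardless of input order.
top_ordered = sorted(rows, key=lambda r: r["grade"])[:2]
print([r["name"] for r in top_ordered])  # ['Ava', 'Noah']
```

Shuffling `rows` changes `top_unordered` but never `top_ordered`, which is exactly why the docs recommend pairing the two clauses.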

articles/databox/data-box-deploy-copy-data-from-vhds.md

Lines changed: 4 additions & 2 deletions
@@ -18,11 +18,11 @@ This tutorial describes how to use the Azure Data Box to migrate your on-premises
In this tutorial, you learn how to:

> [!div class="checklist"]
+ >
> * Review prerequisites
> * Connect to Data Box
> * Copy data to Data Box
-
## Prerequisites

Before you begin, make sure that:
@@ -35,6 +35,8 @@ Before you begin, make sure that:
- Supported [managed disk sizes in Azure object size limits](data-box-limits.md#azure-object-size-limits).
- [Introduction to Azure managed disks](/azure/virtual-machines/windows/managed-disks-overview).

+ 5. You've maintained a copy of the source data until you've confirmed that the Data Box transferred your data into Azure Storage.
+
## Connect to Data Box

Based on the resource groups specified, Data Box creates one share for each associated resource group. For example, if `mydbmdrg1` and `mydbmdrg2` were created when placing the order, the following shares are created:
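The one-share-per-resource-group pattern can be sketched as follows. This is a hypothetical helper, and the `_MDisk` suffix is an assumption inferred from the `net use` example later in this file's diff:

```python
# Hypothetical sketch: one managed-disk share per resource group,
# using the "<resourcegroup>_MDisk" naming pattern assumed from the
# net use example (\\<ip>\mydbmdrgl_MDisk).
def share_names(resource_groups, suffix="_MDisk"):
    """Return the share name Data Box would create for each resource group."""
    return [rg + suffix for rg in resource_groups]

print(share_names(["mydbmdrg1", "mydbmdrg2"]))
# ['mydbmdrg1_MDisk', 'mydbmdrg2_MDisk']
```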
@@ -88,7 +90,7 @@ If using a Windows Server host computer, follow these steps to connect to the Da

```
C:\>net use \\169.254.250.200\mydbmdrgl_MDisk /u:mdisk
- Enter the password for mdisk to connect to '169.254.250.200':
+ Enter the password for 'mdisk' to connect to '169.254.250.200':
The command completed successfully.
C:\>
```

articles/databox/data-box-deploy-copy-data-via-copy-service.md

Lines changed: 8 additions & 3 deletions
@@ -24,6 +24,7 @@ Use the data copy service:
In this tutorial, you learn how to:

> [!div class="checklist"]
+ >
> * Copy data to Data Box

## Prerequisites
@@ -39,9 +40,13 @@ Before you begin, make sure that:

After you're connected to the NAS device, the next step is to copy your data. Before you begin the data copy, review the following considerations:

- - While copying data, make sure that the data size conforms to the size limits described in the article [Azure storage and Data Box limits](data-box-limits.md).
- - If data uploaded by Data Box is concurrently uploaded by other applications outside Data Box, upload-job failures and data corruption might result.
- - If the data is being modified as the data copy service is reading it, you might see failures or corruption of data.
+ * While copying data, make sure that the data size conforms to the size limits described in the article [Azure storage and Data Box limits](data-box-limits.md).
+ * If data uploaded by Data Box is concurrently uploaded by other applications outside Data Box, upload-job failures and data corruption might result.
+ * If the data is being modified as the data copy service is reading it, you might see failures or corruption of data.
+ * Make sure that you maintain a copy of the source data until you can confirm that the Data Box has transferred your data into Azure Storage.

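The size-limit consideration above can be expressed as a simple pre-copy check. This is an illustrative sketch; the 4.75-TiB figure is an assumption standing in for the actual limits documented in data-box-limits.md:

```python
# Illustrative pre-copy check against an ASSUMED per-object size limit.
# The authoritative limits live in the linked data-box-limits article.
ASSUMED_BLOCK_BLOB_LIMIT = int(4.75 * 1024**4)  # ~4.75 TiB, assumption only

def files_over_limit(files, limit=ASSUMED_BLOCK_BLOB_LIMIT):
    """Return names of files whose size in bytes exceeds the limit."""
    return [name for name, size in files.items() if size > limit]

files = {
    "ok.vhd": 10 * 1024**3,      # 10 GiB, within the assumed limit
    "too-big.bin": 5 * 1024**4,  # 5 TiB, over the assumed limit
}
print(files_over_limit(files))  # ['too-big.bin']
```

Running a check like this before starting the copy job avoids discovering an oversized object only after the upload fails.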
To copy data by using the data copy service, you need to create a job:

articles/databox/data-box-deploy-copy-data-via-nfs.md

Lines changed: 11 additions & 10 deletions
@@ -18,6 +18,7 @@ This tutorial describes how to connect to and copy data from your host computer
In this tutorial, you learn how to:

> [!div class="checklist"]
+ >
> * Prerequisites
> * Connect to Data Box
> * Copy data to Data Box
@@ -79,17 +80,17 @@ If you are using a Linux host computer, perform the following steps to configure

Once you are connected to the Data Box shares, the next step is to copy data. Before you begin the data copy, review the following considerations:

- - Ensure that you copy the data to shares that correspond to the appropriate data format. For instance, copy the block blob data to the share for block blobs. Copy VHDs to page blobs. If the data format does not match the appropriate share type, then at a later step, the data upload to Azure will fail.
- - While copying data, ensure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
- - If data, which is being uploaded by Data Box, is concurrently uploaded by other applications outside of Data Box, then this could result in upload job failures and data corruption.
- - We recommend that you do not use both SMB and NFS concurrently or copy same data to same end destination on Azure. In such cases, the final outcome cannot be determined.
- - **Always create a folder for the files that you intend to copy under the share and then copy the files to that folder**. The folder created under block blob and page blob shares represents a container to which data is uploaded as blobs. You cannot copy files directly to *root* folder in the storage account.
- - If ingesting case-sensitive directory and file names from an NFS share to NFS on Data Box:
-   - The case is preserved in the name.
-   - The files are case-insensitive.
-
-   For example, if copying `SampleFile.txt` and `Samplefile.Txt`, the case will be preserved in the name when copied to Data Box but the second file will overwrite the first one as these are considered the same file.
+ * Ensure that you copy the data to shares that correspond to the appropriate data format. For instance, copy the block blob data to the share for block blobs. Copy VHDs to page blobs. If the data format does not match the appropriate share type, then at a later step, the data upload to Azure will fail.
+ * While copying data, ensure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
+ * If data, which is being uploaded by Data Box, is concurrently uploaded by other applications outside of Data Box, then this could result in upload job failures and data corruption.
+ * We recommend that you do not use both SMB and NFS concurrently or copy same data to same end destination on Azure. In such cases, the final outcome cannot be determined.
+ * **Always create a folder for the files that you intend to copy under the share and then copy the files to that folder**. The folder created under block blob and page blob shares represents a container to which data is uploaded as blobs. You cannot copy files directly to *root* folder in the storage account.
+ * If ingesting case-sensitive directory and file names from an NFS share to NFS on Data Box:
+   * The case is preserved in the name.
+   * The files are case-insensitive.
+
+   For example, if copying `SampleFile.txt` and `Samplefile.Txt`, the case will be preserved in the name when copied to Data Box but the second file will overwrite the first one as these are considered the same file.
+ * Make sure that you maintain a copy of the source data until you can confirm that the Data Box has transferred your data into Azure Storage.
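The `SampleFile.txt`/`Samplefile.Txt` collision described above can be modeled in Python. This is an illustrative sketch of case-preserving, case-insensitive storage, not Data Box code:

```python
# Illustrative sketch: case-preserving but case-insensitive storage, as
# described for NFS ingestion on Data Box. Names differing only by case
# collide, and the file copied last overwrites the earlier one.
store = {}

def copy_to_databox(name, content):
    # Lookup is case-insensitive, but the stored name keeps its case.
    store[name.lower()] = (name, content)

copy_to_databox("SampleFile.txt", "first")
copy_to_databox("Samplefile.Txt", "second")

print(len(store))  # 1 -- both names resolve to the same file
print(store["samplefile.txt"])  # ('Samplefile.Txt', 'second')
```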

If you're using a Linux host computer, use a copy utility similar to Robocopy. Some of the alternatives available in Linux are [rsync](https://rsync.samba.org/), [FreeFileSync](https://www.freefilesync.org/), [Unison](https://www.cis.upenn.edu/~bcpierce/unison/), or [Ultracopier](https://ultracopier.first-world.info/).


articles/databox/data-box-deploy-copy-data-via-rest.md

Lines changed: 9 additions & 7 deletions
@@ -20,6 +20,7 @@ This tutorial describes procedures to connect to Azure Data Box Blob storage via
In this tutorial, you learn how to:

> [!div class="checklist"]
+ >
> * Prerequisites
> * Connect to Data Box Blob storage via *http* or *https*
> * Copy data to Data Box
@@ -88,7 +89,7 @@ Use the Azure portal to download the certificate.

### Import certificate

- Accessing Data Box Blob storage over HTTPS requires an SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after it is imported into the systems certificate store, while other applications do not make use of that mechanism.
+ Accessing Data Box Blob storage over HTTPS requires an SSL certificate for the device. The way in which this certificate is made available to the client application varies from application to application and across operating systems and distributions. Some applications can access the certificate after it is imported into the system's certificate store, while other applications do not make use of that mechanism.

Specific information for some applications is mentioned in this section. For more information on other applications, consult the documentation for the application and the operating system used.
@@ -105,16 +106,16 @@ Follow these steps to import the `.cer` file into the root store of a Windows or

#### Use Windows Server UI

1. Right-click the `.cer` file and select **Install certificate**. This action starts the Certificate Import Wizard.
2. For **Store location**, select **Local Machine**, and then click **Next**.

   ![Import certificate using PowerShell](media/data-box-deploy-copy-data-via-rest/import-cert-ws-1.png)

3. Select **Place all certificates in the following store**, and then click **Browse**. Navigate to the root store of your remote host, and then click **Next**.

   ![Import certificate using PowerShell](media/data-box-deploy-copy-data-via-rest/import-cert-ws-2.png)

4. Click **Finish**. A message that tells you that the import was successful appears.

   ![Import certificate using PowerShell](media/data-box-deploy-copy-data-via-rest/import-cert-ws-3.png)
@@ -146,8 +147,9 @@ Follow the steps to [Configure partner software that you used while connecting o

Once you are connected to the Data Box Blob storage, the next step is to copy data. Prior to data copy, review the following considerations:

- - While copying data, ensure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
- - If data, which is being uploaded by Data Box, is concurrently uploaded by other applications outside of Data Box, this may result in upload job failures and data corruption.
+ * While copying data, ensure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
+ * If data, which is being uploaded by Data Box, is concurrently uploaded by other applications outside of Data Box, this may result in upload job failures and data corruption.
+ * Make sure that you maintain a copy of the source data until you can confirm that the Data Box has transferred your data into Azure Storage.

In this tutorial, AzCopy is used to copy data to Data Box Blob storage. You can also use Azure Storage Explorer (if you prefer a GUI-based tool) or partner software to copy the data.

articles/databox/data-box-deploy-copy-data.md

Lines changed: 9 additions & 10 deletions
@@ -105,22 +105,21 @@ If using a Windows Server host computer, follow these steps to connect to the Da
If using a Linux client, use the following command to mount the SMB share. The "vers" parameter below is the version of SMB that your Linux host supports. Plug in the appropriate version in the command below. For versions of SMB that the Data Box supports, see [Supported file systems for Linux clients](https://docs.microsoft.com/azure/databox/data-box-system-requirements#supported-file-systems-for-linux-clients).

`sudo mount -t cifs -o vers=2.1 //10.126.76.172/devicemanagertest1_BlockBlob /home/databoxubuntuhost/databox`
## Copy data to Data Box

Once you're connected to the Data Box shares, the next step is to copy data. Before you begin the data copy, review the following considerations:

- - Make sure that you copy the data to shares that correspond to the appropriate data format. For instance, copy the block blob data to the share for block blobs. Copy the VHDs to page blob. If the data format doesn't match the appropriate share type, then at a later step, the data upload to Azure will fail.
- - While copying data, make sure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
- - If data, which is being uploaded by Data Box, is concurrently uploaded by other applications outside of Data Box, then this could result in upload job failures and data corruption.
- - We recommend that:
-   - You don't use both SMB and NFS at the same time.
-   - Copy the same data to same end destination on Azure.
+ * Make sure that you copy the data to shares that correspond to the appropriate data format. For instance, copy the block blob data to the share for block blobs. Copy the VHDs to page blobs. If the data format doesn't match the appropriate share type, then at a later step, the data upload to Azure will fail.
+ * While copying data, make sure that the data size conforms to the size limits described in the [Azure storage and Data Box limits](data-box-limits.md).
+ * If data, which is being uploaded by Data Box, is concurrently uploaded by other applications outside of Data Box, then this could result in upload job failures and data corruption.
+ * We recommend that:
+   * You don't use both SMB and NFS at the same time.
+   * You don't copy the same data to the same end destination on Azure.

  In these cases, the final outcome can't be determined.
- - Always create a folder for the files that you intend to copy under the share and then copy the files to that folder. The folder created under block blob and page blob shares represents a container to which the data is uploaded as blobs. You cannot copy files directly to *root* folder in the storage account.
+ * Always create a folder for the files that you intend to copy under the share and then copy the files to that folder. The folder created under block blob and page blob shares represents a container to which the data is uploaded as blobs. You cannot copy files directly to *root* folder in the storage account.
+ * Make sure that you maintain a copy of the source data until you can confirm that the Data Box has transferred your data into Azure Storage.

After you've connected to the SMB share, begin data copy. You can use any SMB-compatible file copy tool such as Robocopy to copy your data. Multiple copy jobs can be initiated using Robocopy. Use the following command:

articles/virtual-wan/TOC.yml

Lines changed: 14 additions & 12 deletions
@@ -52,26 +52,28 @@
href: virtual-wan-site-to-site-portal.md
- name: User VPN (point-to-site)
  items:
- - name: Create User VPN (point-to-site) connections
+ - name: Create a User VPN connection
href: virtual-wan-point-to-site-portal.md
- - name: Configure certificates for User VPN
+ - name: Configure certificates
href: certificates-point-to-site.md
- - name: Configure Azure AD tenant for User VPN
+ - name: Configure Azure AD tenant
href: openvpn-azure-ad-tenant.md
- - name: Configure Always On VPN user tunnel
-   href: ../vpn-gateway/vpn-gateway-howto-always-on-user-tunnel.md?toc=%2fazure%2fvirtual-wan%2ftoc.json
- - name: Configure Always On VPN device tunnel
-   href: ../vpn-gateway/vpn-gateway-howto-always-on-device-tunnel.md?toc=%2fazure%2fvirtual-wan%2ftoc.json
+ - name: Download a VPN profile
+   href: ../vpn-gateway/about-vpn-profile-download.md?toc=%2fazure%2fvirtual-wan%2ftoc.json
+ - name: Download global and hub-based profiles
+   href: global-hub-profile.md
- name: Configure OpenVPN clients
href: ../vpn-gateway/vpn-gateway-howto-openvpn-clients.md?toc=%2fazure%2fvirtual-wan%2ftoc.json
- - name: Enable Multi-Factor Authentication(MFA) for User VPN
+ - name: Configure Always On VPN user tunnel
+   href: howto-always-on-user-tunnel.md
+ - name: Configure Always On VPN device tunnel
+   href: howto-always-on-device-tunnel.md
+ - name: Enable Multi-Factor Authentication (MFA)
href: openvpn-azure-ad-mfa.md
- - name: Configure Azure AD authentication for User VPN
+ - name: Configure Azure AD authentication
href: virtual-wan-point-to-site-azure-ad.md
- - name: Configure Multi-application Azure AD authentication for User VPN
+ - name: Configure Multi-application Azure AD authentication
href: openvpn-azure-ad-tenant-multi-app.md
- - name: Download global and hub-based User VPN profiles
-   href: global-hub-profile.md
- name: Configure routing
  items:
- name: Route traffic from a virtual hub to an NVA
