Commit 1299fe7
Updating titles for migrate docs
1 parent 35ea113 commit 1299fe7

4 files changed: +48 −35 lines changed


articles/cosmos-db/cassandra-import-data.md

Lines changed: 11 additions & 9 deletions
@@ -13,9 +13,9 @@ ms.date: 11/15/2017
 ms.author: govindk
 ms.custom: mvc
 ---
-# Azure Cosmos DB: Import Cassandra data
+# Migrate your data to Azure Cosmos DB Cassandra API account
 
-This tutorial provides instructions on importing Cassandra data into Azure Cosmos DB using the Cassandra Query Language (CQL) COPY command.
+This tutorial provides instructions on importing Cassandra data into Azure Cosmos DB by using the Cassandra Query Language (CQL) COPY command.
 
 This tutorial covers the following tasks:
 
@@ -26,11 +26,13 @@ This tutorial covers the following tasks:
 
 # Prerequisites
 
-* Install [Apache Cassandra](http://cassandra.apache.org/download/) and specifically ensure *cqlsh* is present.
-* Increase throughput: The duration of your data migration depends on the amount of throughput you provisioned for your Tables. Be sure to increase the throughput for larger data migrations. After you've completed the migration, decrease the throughput to save costs. For more information about increasing throughput in the [Azure portal](https://portal.azure.com), see [Set throughput for Azure Cosmos DB containers](set-throughput.md).
+* Install [Apache Cassandra](http://cassandra.apache.org/download/) and specifically ensure *cqlsh* is present.
+
+* Increase throughput: The duration of your data migration depends on the amount of throughput you provisioned for your Tables. Be sure to increase the throughput for larger data migrations. After you've completed the migration, decrease the throughput to save costs. For more information about increasing throughput in the [Azure portal](https://portal.azure.com), see [Set throughput for Azure Cosmos DB containers](set-throughput.md).
+
 * Enable SSL: Azure Cosmos DB has strict security requirements and standards. Be sure to enable SSL when you interact with your account. When you use CQL with SSH, you have an option to provide SSL information.
 
-## Find your connection string
+## Get your connection string
 
 1. In the [Azure portal](https://portal.azure.com), on the far left, click **Azure Cosmos DB**.
 
@@ -40,14 +42,14 @@ This tutorial covers the following tasks:
 
 ![Connection string page](./media/cassandra-import-data/keys.png)
 
-## Use cqlsh COPY
+## Migrate data by using cqlsh COPY
 
 To import Cassandra data into Azure Cosmos DB for use with the Cassandra API, use the following guidance:
 
 1. Log in to cqlsh using the connection information from the portal.
 2. Use the [CQL COPY command](http://cassandra.apache.org/doc/latest/tools/cqlsh.html#cqlsh) to copy local data to the Apache Cassandra API endpoint. Ensure the source and target are in the same datacenter to minimize latency issues.
 
-### Guide for moving data with cqlsh
+### Steps to move data with cqlsh
 
 1. Pre-create and scale your table:
    * By default, Azure Cosmos DB provisions a new Cassandra API table with 1,000 request units per second (RU/s) (CQL-based creation is provisioned with 400 RU/s). Before you start the migration by using cqlsh, pre-create all your tables from the [Azure portal](https://portal.azure.com) or from cqlsh.
@@ -75,11 +77,11 @@ To import Cassandra data into Azure Cosmos DB for use with the Cassandra API, us
 
 5. Run the final migration command. Running this command assumes you have started cqlsh using the connection string information.
 
-   ```
+   ```bash
    COPY exampleks.tablename FROM filefolderx/*.csv
    ```
 
-## Use Spark to import data
+## Migrate data by using Spark
 
 For data residing in an existing cluster in Azure virtual machines, importing data using Spark is also a feasible option. This requires Spark to be set up as an intermediary for one-time or regular ingestion.
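As a sketch of how the cqlsh login in step 1 of the hunk above might be assembled (the hostname, username, password, and port are illustrative placeholders taken from the usual portal connection-string page, not values from this commit):

```shell
#!/bin/sh
# Hypothetical account values -- substitute the ones shown on your
# account's connection string page in the Azure portal.
HOST="<account-name>.cassandra.cosmosdb.azure.com"
PORT=10350   # assumed Cassandra API port; confirm it in the portal
USER="<account-name>"
PASS="<primary-password>"

# Build the login command; --ssl is required because Cosmos DB only
# accepts SSL connections.
CQLSH_CMD="cqlsh $HOST $PORT -u $USER -p $PASS --ssl"
echo "$CQLSH_CMD"

# Once inside cqlsh, the tutorial's COPY can be throttled with standard
# cqlsh COPY options to stay under the table's provisioned RU/s, e.g.:
#   COPY exampleks.tablename FROM 'filefolderx/*.csv'
#     WITH NUMPROCESSES = 4 AND MAXBATCHSIZE = 20;
```

The tuning values are only a starting point; lower them if the import hits rate limiting.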

articles/cosmos-db/import-data.md

Lines changed: 2 additions & 1 deletion
@@ -14,11 +14,12 @@ ms.date: 03/30/2018
 ms.author: dech
 ms.custom: mvc
 ---
-# Azure Cosmos DB: Data migration tool
+# Use Data migration tool to migrate your data to Azure Cosmos DB
 
 This tutorial provides instructions on using the Azure Cosmos DB Data Migration tool, which can import data from various sources into Azure Cosmos DB collections and tables. You can import from JSON files, CSV files, SQL, MongoDB, Azure Table storage, Amazon DynamoDB, and even Azure Cosmos DB SQL API collections, and you migrate that data to collections and tables for use with Azure Cosmos DB. The Data Migration tool can also be used when migrating from a single partition collection to a multi-partition collection for the SQL API.
 
 Which API are you going to use with Azure Cosmos DB?
+
 * **[SQL API](documentdb-introduction.md)** - You can use any of the source options provided in the Data Migration tool to import data.
 * **[Table API](table-introduction.md)** - You can use the Data Migration tool or AzCopy to import data. See [Import data for use with the Azure Cosmos DB Table API](table-import.md) for more information.
 * **[MongoDB API](mongodb-introduction.md)** - The Data Migration tool does not currently support Azure Cosmos DB MongoDB API either as a source or as a target. If you want to migrate the data in or out of MongoDB API collections in Azure Cosmos DB, refer to [Azure Cosmos DB: How to migrate data for the MongoDB API](mongodb-migrate.md) for instructions. You can still use the Data Migration tool to export data from MongoDB to Azure Cosmos DB SQL API collections for use with the SQL API.

articles/cosmos-db/mongodb-migrate.md

Lines changed: 29 additions & 19 deletions
@@ -15,7 +15,7 @@ ms.author: sclyon
 ms.custom: mvc
 ---
 
-# Azure Cosmos DB: Import MongoDB data
+# Migrate your data to Azure Cosmos DB MongoDB API account
 
 To migrate data from MongoDB to an Azure Cosmos DB account for use with the API for MongoDB, you must:

@@ -38,7 +38,7 @@ This tutorial covers the following tasks:
 
 * Enable SSL: Azure Cosmos DB has strict security requirements and standards. Be sure to enable SSL when you interact with your account. The procedures in the rest of the article include how to enable SSL for mongoimport and mongorestore.
 
-## Find your connection string information (host, port, username, and password)
+## Get your connection string
 
 1. In the [Azure portal](https://portal.azure.com), in the left pane, click the **Azure Cosmos DB** entry.
 1. In the **Subscriptions** pane, select your account name.
@@ -48,43 +48,51 @@ This tutorial covers the following tasks:
 
 ![Connection String blade](./media/mongodb-migrate/ConnectionStringBlade.png)
 
-## Import data to the API for MongoDB by using mongoimport
+## Migrate data by using mongoimport
 
 To import data to your Azure Cosmos DB account, use the following template. Fill in *host*, *username*, and *password* with the values that are specific to your account.
 
 Template:
 
-    mongoimport.exe --host <your_hostname>:10255 -u <your_username> -p <your_password> --db <your_database> --collection <your_collection> --ssl --sslAllowInvalidCertificates --type json --file C:\sample.json
+```bash
+mongoimport.exe --host <your_hostname>:10255 -u <your_username> -p <your_password> --db <your_database> --collection <your_collection> --ssl --sslAllowInvalidCertificates --type json --file "C:\sample.json"
+```
 
 Example:
 
-    mongoimport.exe --host cosmosdb-mongodb-account.documents.azure.com:10255 -u cosmosdb-mongodb-account -p tkvaVkp4Nnaoirnouenrgisuner2435qwefBH0z256Na24frio34LNQasfaefarfernoimczciqisAXw== --ssl --sslAllowInvalidCertificates --db sampleDB --collection sampleColl --type json --file C:\Users\admin\Desktop\*.json
+```bash
+mongoimport.exe --host cosmosdb-mongodb-account.documents.azure.com:10255 -u cosmosdb-mongodb-account -p tkvaVkp4Nnaoirnouenrgisuner2435qwefBH0z256Na24frio34LNQasfaefarfernoimczciqisAXw== --ssl --sslAllowInvalidCertificates --db sampleDB --collection sampleColl --type json --file "C:\Users\admin\Desktop\*.json"
+```
 
-## Import data to the API for MongoDB by using mongorestore
+## Migrate data by using mongorestore
 
 To restore data to your API for MongoDB account, use the following template to execute the import. Fill in *host*, *username*, and *password* with the values that are specific to your account.
 
 Template:
 
+```bash
 mongorestore.exe --host <your_hostname>:10255 -u <your_username> -p <your_password> --db <your_database> --collection <your_collection> --ssl --sslAllowInvalidCertificates <path_to_backup>
+```
 
 Example:
 
+```bash
 mongorestore.exe --host cosmosdb-mongodb-account.documents.azure.com:10255 -u cosmosdb-mongodb-account -p tkvaVkp4Nnaoirnouenrgisuner2435qwefBH0z256Na24frio34LNQasfaefarfernoimczciqisAXw== --ssl --sslAllowInvalidCertificates ./dumps/dump-2016-12-07
+```
 
-## Guide for a successful migration
+## Steps for a successful migration
 
 1. Pre-create and scale your collections:
 
-   * By default, Azure Cosmos DB provisions a new MongoDB collection with 1,000 request units per second (RU/sec). Before you start the migration by using mongoimport, mongorestore, or mongomirror, pre-create all your collections from the [Azure portal](https://portal.azure.com) or from MongoDB drivers and tools. If your collection is greater than 10 GB, make sure to create a [sharded/partitioned collection](partition-data.md) with an appropriate shard key.
+   * By default, Azure Cosmos DB provisions a new MongoDB collection with 1,000 request units per second (RU/sec). Before you start the migration by using mongoimport or mongorestore, pre-create all your collections from the [Azure portal](https://portal.azure.com) or from MongoDB drivers and tools. If the data size is greater than 10 GB, make sure to create a [sharded/partitioned collection](partition-data.md) with an appropriate shard key.
 
-   * From the [Azure portal](https://portal.azure.com), increase your collections' throughput from 1,000 RUs/sec for a single partition collection and 2,500 RUs/sec for a sharded collection just for the migration. With the higher throughput, you can avoid rate limiting and migrate in less time. With hourly billing in Azure Cosmos DB, you can reduce the throughput immediately after the migration to save costs.
+   * From the [Azure portal](https://portal.azure.com), increase your collections' throughput from 1,000 RUs/sec for a single partition collection and 2,500 RUs/sec for a sharded collection just for the migration. With the higher throughput, you can avoid rate limiting and migrate in less time. You can reduce the throughput immediately after the migration to save costs.
 
   * In addition to provisioning RUs/sec at the collection level, you may also provision RU/sec for a set of collections at the parent database level. This requires pre-creating the database and collections, as well as defining a shard key for each collection.
 
  * You can create sharded collections through your favorite tool, driver, or SDK. In this example, we use the Mongo Shell to create a sharded collection:
 
-   ```
+   ```bash
    db.runCommand( { shardCollection: "admin.people", key: { region: "hashed" } } )
    ```
@@ -100,15 +108,17 @@ Example:
 
 1. Calculate the approximate RU charge for a single document write:
 
-   a. Connect to your Azure Cosmos DB MongoDB database from the MongoDB Shell. You can find instructions in [Connect a MongoDB application to Azure Cosmos DB](connect-mongodb-account.md).
-
-   b. Run a sample insert command by using one of your sample documents from the MongoDB Shell:
+   a. Connect to your Azure Cosmos DB MongoDB API account from the MongoDB Shell. You can find instructions in [Connect a MongoDB application to Azure Cosmos DB](connect-mongodb-account.md).
 
-   ```db.coll.insert({ "playerId": "a067ff", "hashedid": "bb0091", "countryCode": "hk" })```
+   b. Run a sample insert command by using one of your sample documents from the MongoDB Shell:
+
+   ```bash
+   db.coll.insert({ "playerId": "a067ff", "hashedid": "bb0091", "countryCode": "hk" })
+   ```
 
-   c. Run ```db.runCommand({getLastRequestStatistics: 1})``` and you'll receive a response like the following:
+   c. Run ```db.runCommand({getLastRequestStatistics: 1})``` and you'll receive a response like the following:
 
-   ```
+   ```bash
    globaldb:PRIMARY> db.runCommand({getLastRequestStatistics: 1})
    {
    "_t": "GetRequestStatisticsResponse",

@@ -117,7 +127,7 @@ Example:
    "RequestCharge": 10,
    "RequestDurationInMilliSeconds": NumberLong(50)
    }
-   ```
+   ```
 
 d. Take note of the request charge.
@@ -156,12 +166,12 @@ Example:
 
 1. Run the final migration command:
 
-   ```
+   ```bash
    mongoimport.exe --host cosmosdb-mongodb-account.documents.azure.com:10255 -u cosmosdb-mongodb-account -p wzRJCyjtLPNuhm53yTwaefawuiefhbauwebhfuabweifbiauweb2YVdl2ZFNZNv8IU89LqFVm5U0bw== --ssl --sslAllowInvalidCertificates --jsonArray --db dabasename --collection collectionName --file "C:\sample.json" --numInsertionWorkers 4 --batchSize 24
    ```
 Or with mongorestore (make sure all collections have the throughput set at or above the amount of RUs used in previous calculations):
 
-   ```
+   ```bash
    mongorestore.exe --host cosmosdb-mongodb-account.documents.azure.com:10255 -u cosmosdb-mongodb-account -p wzRJCyjtLPNuhm53yTwaefawuiefhbauwebhfuabweifbiauweb2YVdl2ZFNZNv8IU89LqFVm5U0bw== --ssl --sslAllowInvalidCertificates ./dumps/dump-2016-12-07 --numInsertionWorkersPerCollection 4 --batchSize 24
    ```
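The worker and batch-size flags in the final command above interact with provisioned throughput. A rough sanity check, under the assumption that required RU/s ≈ target inserts/sec × the RequestCharge observed via getLastRequestStatistics (10 in the sample response; the target rate below is hypothetical):

```shell
#!/bin/sh
# CHARGE_PER_DOC comes from the sample getLastRequestStatistics response;
# TARGET_DOCS_PER_SEC is a hypothetical migration rate, for illustration.
CHARGE_PER_DOC=10
TARGET_DOCS_PER_SEC=250

# Throughput to provision on the collection so the import is not rate limited.
REQUIRED_RUS=$((CHARGE_PER_DOC * TARGET_DOCS_PER_SEC))
echo "required RU/s: $REQUIRED_RUS"   # required RU/s: 2500
```

Provision at least this much on each target collection before running mongoimport or mongorestore, then scale back down after the migration to save costs.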

articles/cosmos-db/table-import.md

Lines changed: 6 additions & 6 deletions
@@ -14,9 +14,9 @@ ms.author: sngun
 
 ---
 
-# Import data for use with the Azure Cosmos DB Table API
+# Migrate your data to Azure Cosmos DB Table API account
 
-This tutorial provides instructions on importing data for use with the Azure Cosmos DB [Table API](table-introduction.md). If you have data stored in Azure Table storage, you can use either the Data Migration Tool or AzCopy to import your data. If you have data stored in an Azure Cosmos DB Table API (preview) account, you must use the Data Migration tool to migrate your data. Once your data is imported, you'll be able to take advantage of the premium capabilities Azure Cosmos DB offers, such as turnkey global distribution, dedicated throughput, single-digit millisecond latencies at the 99th percentile, guaranteed high availability, and automatic secondary indexing.
+This tutorial provides instructions on importing data for use with the Azure Cosmos DB [Table API](table-introduction.md). If you have data stored in Azure Table storage, you can use either the Data Migration Tool or AzCopy to import your data to Azure Cosmos DB Table API. If you have data stored in an Azure Cosmos DB Table API (preview) account, you must use the Data Migration tool to migrate your data.
 
 This tutorial covers the following tasks:
 
@@ -36,11 +36,11 @@ The command-line Azure Cosmos DB Data Migration tool (dt.exe) can be used to imp
 To perform a migration of table data, complete the following tasks:
 
 1. Download the migration tool from [GitHub](https://github.com/azure/azure-documentdb-datamigrationtool).
-2. Run `dt.exe` using the command-line arguments for your scenario.
-
-   dt.exe takes a command in the following format:
+2. Run `dt.exe` using the command-line arguments for your scenario. `dt.exe` takes a command in the following format:
 
+   ```bash
    dt.exe [/<option>:<value>] /s:<source-name> [/s.<source-option>:<value>] /t:<target-name> [/t.<target-option>:<value>]
+   ```
 
 Options for the command are:

@@ -102,7 +102,7 @@ Here is a command-line sample to import from Table API preview to Table API GA:
 dt /s:AzureTable /s.ConnectionString:DefaultEndpointsProtocol=https;AccountName=<Table API preview account name>;AccountKey=<Table API preview account key>;TableEndpoint=https://<Account Name>.documents.azure.com; /s.Table:<Table name> /t:TableAPIBulk /t.ConnectionString:DefaultEndpointsProtocol=https;AccountName=<Azure Cosmos DB account name>;AccountKey=<Azure Cosmos DB account key>;TableEndpoint=https://<Account name>.table.cosmosdb.azure.com:443 /t.TableName:<Table name> /t.Overwrite
 ```
 
-## AzCopy command
+## Migrate data by using AzCopy
 
 Using the AzCopy command-line utility is the other option for migrating data from Azure Table storage to the Azure Cosmos DB Table API. To use AzCopy, you first export your data as described in [Export data from Table storage](../storage/common/storage-use-azcopy.md#export-data-from-table-storage), then import the data to Azure Cosmos DB as described in [Azure Cosmos DB Table API](../storage/common/storage-use-azcopy.md#import-data-into-table-storage).
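To illustrate how the dt.exe options shown in this file's sample compose for the Azure Table storage to Table API path, here is a sketch that assembles the command from its parts (account names, keys, and the table name `people` are placeholders, not values from this commit; only flags that appear in the sample above are used):

```shell
#!/bin/sh
# Placeholder connection strings -- these are not real accounts.
SRC_CONN="DefaultEndpointsProtocol=https;AccountName=<storage-account>;AccountKey=<storage-key>"
DST_CONN="DefaultEndpointsProtocol=https;AccountName=<cosmos-account>;AccountKey=<cosmos-key>;TableEndpoint=https://<cosmos-account>.table.cosmosdb.azure.com:443"

# Compose source options (/s:...) and target options (/t:...), mirroring
# the /s:AzureTable ... /t:TableAPIBulk sample command above.
DT_CMD="dt /s:AzureTable /s.ConnectionString:$SRC_CONN /s.Table:people /t:TableAPIBulk /t.ConnectionString:$DST_CONN /t.TableName:people /t.Overwrite"
echo "$DT_CMD"
```

Splitting the connection strings into variables this way keeps the long one-line command readable and makes it easy to swap accounts between runs.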

0 commit comments
