articles/cosmos-db/cassandra-import-data.md (+11 −9 lines)

@@ -13,9 +13,9 @@ ms.date: 11/15/2017
 ms.author: govindk
 ms.custom: mvc
 ---
-# Azure Cosmos DB: Import Cassandra data
+# Migrate your data to Azure Cosmos DB Cassandra API account

-This tutorial provides instructions on importing Cassandra data into Azure Cosmos DB using the Cassandra Query Language (CQL) COPY command.
+This tutorial provides instructions on importing Cassandra data into Azure Cosmos DB by using the Cassandra Query Language (CQL) COPY command.

 This tutorial covers the following tasks:
@@ -26,11 +26,13 @@ This tutorial covers the following tasks:

 ## Prerequisites

-* Install [Apache Cassandra](http://cassandra.apache.org/download/) and specifically ensure *cqlsh* is present.
-* Increase throughput: The duration of your data migration depends on the amount of throughput you provisioned for your tables. Be sure to increase the throughput for larger data migrations. After you've completed the migration, decrease the throughput to save costs. For more information about increasing throughput in the [Azure portal](https://portal.azure.com), see [Set throughput for Azure Cosmos DB containers](set-throughput.md).
+* Install [Apache Cassandra](http://cassandra.apache.org/download/) and specifically ensure *cqlsh* is present.
+
+* Increase throughput: The duration of your data migration depends on the amount of throughput you provisioned for your tables. Be sure to increase the throughput for larger data migrations. After you've completed the migration, decrease the throughput to save costs. For more information about increasing throughput in the [Azure portal](https://portal.azure.com), see [Set throughput for Azure Cosmos DB containers](set-throughput.md).
+
 * Enable SSL: Azure Cosmos DB has strict security requirements and standards. Be sure to enable SSL when you interact with your account. When you use cqlsh, you have the option to provide SSL information.

-## Find your connection string
+## Get your connection string

 1. In the [Azure portal](https://portal.azure.com), on the far left, click **Azure Cosmos DB**.
@@ -40,14 +42,14 @@ This tutorial covers the following tasks:

 To import Cassandra data into Azure Cosmos DB for use with the Cassandra API, use the following guidance:

 1. Log in to cqlsh using the connection information from the portal.
 2. Use the [CQL COPY command](http://cassandra.apache.org/doc/latest/tools/cqlsh.html#cqlsh) to copy local data to the Apache Cassandra API endpoint. Ensure the source and target are in the same datacenter to minimize latency issues.

-### Guide for moving data with cqlsh
+### Steps to move data with cqlsh

 1. Pre-create and scale your table:

    * By default, Azure Cosmos DB provisions a new Cassandra API table with 1,000 request units per second (RU/s) (CQL-based creation is provisioned with 400 RU/s). Before you start the migration by using cqlsh, pre-create all your tables from the [Azure portal](https://portal.azure.com) or from cqlsh.

@@ -75,11 +77,11 @@ To import Cassandra data into Azure Cosmos DB for use with the Cassandra API, us

 For data residing in an existing cluster in Azure virtual machines, importing data by using Spark is also a feasible option. This requires Spark to be set up as an intermediary for one-time or regular ingestion.
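As an illustration of the cqlsh steps above, a run might look like the following sketch. The keyspace, table, column names, and CSV file name are hypothetical; the host, port, username, and password come from the connection string in the portal:

```sql
-- Connect to the Cassandra API endpoint over SSL (placeholder values):
--   cqlsh <account name>.cassandra.cosmosdb.azure.com 10350 \
--     -u <account name> -p <account key> --ssl

-- Copy rows from a local CSV file into a pre-created table:
COPY mykeyspace.mytable (id, name, value)
FROM 'mytable.csv'
WITH HEADER = TRUE AND DELIMITER = ',';
```

Running COPY against a table you pre-created (and scaled up) in the portal avoids both schema-inference surprises and rate limiting during the bulk load.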
articles/cosmos-db/import-data.md (+2 −1 lines)

@@ -14,11 +14,12 @@ ms.date: 03/30/2018
 ms.author: dech
 ms.custom: mvc
 ---
-# Azure Cosmos DB: Data migration tool
+# Use the Data Migration tool to migrate your data to Azure Cosmos DB

 This tutorial provides instructions on using the Azure Cosmos DB Data Migration tool, which can import data from various sources into Azure Cosmos DB collections and tables. You can import from JSON files, CSV files, SQL, MongoDB, Azure Table storage, Amazon DynamoDB, and even Azure Cosmos DB SQL API collections, and you migrate that data to collections and tables for use with Azure Cosmos DB. The Data Migration tool can also be used when migrating from a single partition collection to a multi-partition collection for the SQL API.

 Which API are you going to use with Azure Cosmos DB?

 * **[SQL API](documentdb-introduction.md)** - You can use any of the source options provided in the Data Migration tool to import data.
 * **[Table API](table-introduction.md)** - You can use the Data Migration tool or AzCopy to import data. See [Import data for use with the Azure Cosmos DB Table API](table-import.md) for more information.
 * **[MongoDB API](mongodb-introduction.md)** - The Data Migration tool does not currently support the Azure Cosmos DB MongoDB API either as a source or as a target. If you want to migrate data in or out of MongoDB API collections in Azure Cosmos DB, refer to [Azure Cosmos DB: How to migrate data for the MongoDB API](mongodb-migrate.md) for instructions. You can still use the Data Migration tool to export data from MongoDB to Azure Cosmos DB SQL API collections for use with the SQL API.
articles/cosmos-db/mongodb-migrate.md (+29 −19 lines)

@@ -15,7 +15,7 @@ ms.author: sclyon
 ms.custom: mvc
 ---

-# Azure Cosmos DB: Import MongoDB data
+# Migrate your data to Azure Cosmos DB MongoDB API account

 To migrate data from MongoDB to an Azure Cosmos DB account for use with the API for MongoDB, you must:

@@ -38,7 +38,7 @@ This tutorial covers the following tasks:

 * Enable SSL: Azure Cosmos DB has strict security requirements and standards. Be sure to enable SSL when you interact with your account. The procedures in the rest of the article include how to enable SSL for mongoimport and mongorestore.

-## Find your connection string information (host, port, username, and password)
+## Get your connection string

 1. In the [Azure portal](https://portal.azure.com), in the left pane, click the **Azure Cosmos DB** entry.
 1. In the **Subscriptions** pane, select your account name.

@@ -48,43 +48,51 @@ This tutorial covers the following tasks:

-## Import data to the API for MongoDB by using mongoimport
+## Migrate data by using mongoimport

 To import data to your Azure Cosmos DB account, use the following template. Fill in *host*, *username*, and *password* with the values that are specific to your account.
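The template itself is elided in this diff. Based on the options the article names (host, SSL, credentials), a mongoimport invocation might look like this sketch, where every bracketed value is a placeholder for your account's settings and the database, collection, and file names are hypothetical:

```shell
mongoimport --host <account name>.documents.azure.com:10255 \
  -u <account name> -p <account key> \
  --ssl --sslAllowInvalidCertificates \
  --db <database name> --collection <collection name> \
  --type json --file sample.json
```

The `--ssl` and `--sslAllowInvalidCertificates` flags satisfy the SSL requirement called out in the prerequisites.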
-## Import data to the API for MongoDB by using mongorestore
+## Migrate data by using mongorestore

 To restore data to your API for MongoDB account, use the following template to execute the import. Fill in *host*, *username*, and *password* with the values that are specific to your account.
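This template is also elided in the diff. A mongorestore sketch against a BSON dump directory might look like the following, with all bracketed values standing in for your account's settings:

```shell
mongorestore --host <account name>.documents.azure.com:10255 \
  -u <account name> -p <account key> \
  --ssl --sslAllowInvalidCertificates \
  --db <database name> <path to dump directory>
```

mongorestore reads the BSON files produced by a prior mongodump run, so the dump directory layout must match what mongodump emitted.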
-* By default, Azure Cosmos DB provisions a new MongoDB collection with 1,000 request units per second (RU/sec). Before you start the migration by using mongoimport, mongorestore, or mongomirror, pre-create all your collections from the [Azure portal](https://portal.azure.com) or from MongoDB drivers and tools. If your collection is greater than 10 GB, make sure to create a [sharded/partitioned collection](partition-data.md) with an appropriate shard key.
+* By default, Azure Cosmos DB provisions a new MongoDB collection with 1,000 request units per second (RU/sec). Before you start the migration by using mongoimport or mongorestore, pre-create all your collections from the [Azure portal](https://portal.azure.com) or from MongoDB drivers and tools. If the data size is greater than 10 GB, make sure to create a [sharded/partitioned collection](partition-data.md) with an appropriate shard key.

-* From the [Azure portal](https://portal.azure.com), increase your collections' throughput from 1,000 RU/sec for a single partition collection and 2,500 RU/sec for a sharded collection just for the migration. With the higher throughput, you can avoid rate limiting and migrate in less time. With hourly billing in Azure Cosmos DB, you can reduce the throughput immediately after the migration to save costs.
+* From the [Azure portal](https://portal.azure.com), increase your collections' throughput from 1,000 RU/sec for a single partition collection and 2,500 RU/sec for a sharded collection just for the migration. With the higher throughput, you can avoid rate limiting and migrate in less time. You can reduce the throughput immediately after the migration to save costs.

 * In addition to provisioning RU/sec at the collection level, you may also provision RU/sec for a set of collections at the parent database level. This requires pre-creating the database and collections, as well as defining a shard key for each collection.

 * You can create sharded collections through your favorite tool, driver, or SDK. In this example, we use the Mongo Shell to create a sharded collection:
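The Mongo Shell snippet is elided in this diff. A sketch of creating a sharded collection looks like the following, where the database name, collection name, and shard key (`region`) are hypothetical:

```javascript
// Create a sharded (partitioned) collection with "region" as the shard key.
// All names here are placeholders; run this from the Mongo Shell while
// connected to your Cosmos DB MongoDB API account.
db.runCommand({ shardCollection: "mydb.mycollection", key: { region: "hashed" } })
```

Pick a shard key with many distinct, evenly distributed values; once the collection is created, the shard key cannot be changed.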
1. Calculate the approximate RU charge for a single document write:

-   a. Connect to your Azure Cosmos DB MongoDB database from the MongoDB Shell. You can find instructions in [Connect a MongoDB application to Azure Cosmos DB](connect-mongodb-account.md).
+   a. Connect to your Azure Cosmos DB MongoDB API account from the MongoDB Shell. You can find instructions in [Connect a MongoDB application to Azure Cosmos DB](connect-mongodb-account.md).

    b. Run a sample insert command by using one of your sample documents from the MongoDB Shell:
articles/cosmos-db/table-import.md (+6 −6 lines)

@@ -14,9 +14,9 @@ ms.author: sngun

 ---

-# Import data for use with the Azure Cosmos DB Table API
+# Migrate your data to Azure Cosmos DB Table API account

-This tutorial provides instructions on importing data for use with the Azure Cosmos DB [Table API](table-introduction.md). If you have data stored in Azure Table storage, you can use either the Data Migration tool or AzCopy to import your data. If you have data stored in an Azure Cosmos DB Table API (preview) account, you must use the Data Migration tool to migrate your data. Once your data is imported, you'll be able to take advantage of the premium capabilities Azure Cosmos DB offers, such as turnkey global distribution, dedicated throughput, single-digit millisecond latencies at the 99th percentile, guaranteed high availability, and automatic secondary indexing.
+This tutorial provides instructions on importing data for use with the Azure Cosmos DB [Table API](table-introduction.md). If you have data stored in Azure Table storage, you can use either the Data Migration tool or AzCopy to import your data to the Azure Cosmos DB Table API. If you have data stored in an Azure Cosmos DB Table API (preview) account, you must use the Data Migration tool to migrate your data.

 This tutorial covers the following tasks:
@@ -36,11 +36,11 @@ The command-line Azure Cosmos DB Data Migration tool (dt.exe) can be used to imp

 To perform a migration of table data, complete the following tasks:

 1. Download the migration tool from [GitHub](https://github.com/azure/azure-documentdb-datamigrationtool).
-2. Run `dt.exe` using the command-line arguments for your scenario.
-
-   dt.exe takes a command in the following format:
+2. Run `dt.exe` using the command-line arguments for your scenario. `dt.exe` takes a command in the following format:

@@ -102,7 +102,7 @@ Here is a command-line sample to import from Table API preview to Table API GA:

 ```
 dt /s:AzureTable /s.ConnectionString:DefaultEndpointsProtocol=https;AccountName=<Table API preview account name>;AccountKey=<Table API preview account key>;TableEndpoint=https://<Account Name>.documents.azure.com; /s.Table:<Table name> /t:TableAPIBulk /t.ConnectionString:DefaultEndpointsProtocol=https;AccountName=<Azure Cosmos DB account name>;AccountKey=<Azure Cosmos DB account key>;TableEndpoint=https://<Account name>.table.cosmosdb.azure.com:443 /t.TableName:<Table name> /t.Overwrite
 ```
-## AzCopy command
+## Migrate data by using AzCopy

 Using the AzCopy command-line utility is the other option for migrating data from Azure Table storage to the Azure Cosmos DB Table API. To use AzCopy, you first export your data as described in [Export data from Table storage](../storage/common/storage-use-azcopy.md#export-data-from-table-storage), and then import the data to Azure Cosmos DB as described in [Azure Cosmos DB Table API](../storage/common/storage-use-azcopy.md#import-data-into-table-storage).
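As a sketch of the import step, assuming the table was first exported to a local folder together with a manifest file, an AzCopy (v7, Windows) command might look like the following; the folder, account, table, and manifest names are all placeholders:

```shell
AzCopy /Source:C:\TableExport\ ^
  /Dest:https://<account name>.table.cosmosdb.azure.com:443/<table name>/ ^
  /DestKey:<account key> ^
  /Manifest:<manifest file name> ^
  /EntityOperation:InsertOrReplace
```

`InsertOrReplace` overwrites entities that already exist with the same partition and row key, which makes the import safe to re-run if it is interrupted.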