Commit 4edfcde

Author: github-actions (committed)
Message: update MD by dispatch event pingcap/docs release-cloud
1 parent: 8b87be3

10 files changed: +52 -22 lines

markdown-pages/en/tidbcloud/master/develop/dev-guide-bookshop-schema-design.md

Lines changed: 1 addition & 3 deletions

@@ -99,9 +99,7 @@ You can delete the original table structure through the `--drop-tables` paramete
 2. Click the name of your target cluster to go to its overview page, and then click **Import** in the left navigation pane.
 
-2. Select **Import data from S3**.
-
-    If this is your first time using TiDB Cloud Import, select **Import From Amazon S3**.
+2. Select **Import data from Cloud Storage**, and then click **Amazon S3**.
 
 3. On the **Import Data from Amazon S3** page, configure the following source data information:

markdown-pages/en/tidbcloud/master/develop/dev-guide-insert-data.md

Lines changed: 1 addition & 1 deletion

@@ -244,7 +244,7 @@ The following are the recommended tools for bulk-insert:
 
 <CustomContent platform="tidb-cloud">
 
-- Data import: [Create Import](/tidb-cloud/import-sample-data-serverless.md) page in the [TiDB Cloud console](https://console.tidb.io/signup?provider_source=alicloud). You can import **Dumpling** exported data, import a local **CSV** file, or [Import CSV Files from Amazon S3, GCS, or Azure Blob Storage into TiDB Cloud Starter](/tidb-cloud/import-csv-files-serverless.md). It also supports reading data from a local disk, Amazon S3 cloud disk, or GCS cloud disk.
+- Data import: [Create Import](/tidb-cloud/import-sample-data-serverless.md) page in the [TiDB Cloud console](https://console.tidb.io/signup?provider_source=alicloud). You can upload local CSV files, and import **Dumpling** logical dumps (schema and data), **CSV**, or **Parquet** files stored in cloud storage. For more information, see [Import CSV Files from Cloud Storage into TiDB Cloud Starter](/tidb-cloud/import-csv-files-serverless.md) and [Import Apache Parquet Files from Cloud Storage into TiDB Cloud Starter](/tidb-cloud/import-parquet-files-serverless.md).
 - Data replication: [TiDB Data Migration](https://docs.pingcap.com/tidb/stable/dm-overview). You can replicate MySQL, MariaDB, and Amazon Aurora databases to TiDB. It also supports merging and migrating the sharded instances and tables from the source databases.
 
 </CustomContent>
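The data-import bullet in this hunk recommends purpose-built tools because batched, multi-row inserts are far cheaper than issuing one statement per row. A minimal Python sketch of that batching idea (the `ratings` table, its columns, and the batch size are all hypothetical, chosen for illustration only):

```python
# Sketch: grouping rows into multi-value INSERT statements for bulk loading.
# The "ratings" table, its columns, and batch_size=2 are hypothetical.
from typing import Iterable, List, Tuple


def build_bulk_inserts(rows: Iterable[Tuple[int, int, int]],
                       batch_size: int = 2) -> List[str]:
    """Group rows into multi-value INSERT statements."""
    rows = list(rows)
    stmts = []
    for i in range(0, len(rows), batch_size):
        values = ", ".join(f"({a}, {b}, {c})" for a, b, c in rows[i:i + batch_size])
        stmts.append(f"INSERT INTO ratings (book_id, user_id, score) VALUES {values};")
    return stmts


stmts = build_bulk_inserts([(1, 1, 5), (2, 1, 4), (3, 2, 3)])
print(stmts[0])  # INSERT INTO ratings (book_id, user_id, score) VALUES (1, 1, 5), (2, 1, 4);
```

Real bulk loaders additionally stream files, parallelize, and checkpoint, which is why the document points to dedicated import tools rather than hand-rolled statements.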

markdown-pages/en/tidbcloud/master/external-storage-uri.md

Lines changed: 36 additions & 0 deletions

@@ -15,6 +15,8 @@ The basic format of the URI is as follows:
 
 ## Amazon S3 URI format
 
+<CustomContent platform="tidb">
+
 - `scheme`: `s3`
 - `host`: `bucket name`
 - `parameters`:
@@ -48,12 +50,42 @@ tiup cdc:v7.5.0 cli changefeed create \
   --config=cdc_csv.toml
 ```
 
+</CustomContent>
+
+<CustomContent platform="tidb-cloud">
+
+- `scheme`: `s3`
+- `host`: `bucket name`
+- `parameters`:
+
+    - `access-key`: Specifies the access key.
+    - `secret-access-key`: Specifies the secret access key.
+    - `session-token`: Specifies the temporary session token.
+    - `use-accelerate-endpoint`: Specifies whether to use the accelerate endpoint on Amazon S3 (defaults to `false`).
+    - `endpoint`: Specifies the URL of a custom endpoint for S3-compatible services (for example, `https://s3.example.com/`).
+    - `force-path-style`: Uses path-style access rather than virtual-hosted-style access (defaults to `true`).
+    - `storage-class`: Specifies the storage class of the uploaded objects (for example, `STANDARD` or `STANDARD_IA`).
+    - `sse`: Specifies the server-side encryption algorithm used to encrypt the uploaded objects (value options: empty, `AES256`, or `aws:kms`).
+    - `sse-kms-key-id`: Specifies the KMS ID if `sse` is set to `aws:kms`.
+    - `acl`: Specifies the canned ACL of the uploaded objects (for example, `private` or `authenticated-read`).
+    - `role-arn`: To allow TiDB Cloud to access Amazon S3 data using a specific [IAM role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html), provide the role's [Amazon Resource Name (ARN)](https://docs.aws.amazon.com/general/latest/gr/aws-arns-and-namespaces.html) in the `role-arn` URL query parameter. For example: `arn:aws:iam::888888888888:role/my-role`.
+
+        > **Note:**
+        >
+        > - To automatically create an IAM role, navigate to the **Import Data from Amazon S3** page of your cluster in the [TiDB Cloud console](https://console.tidb.io/signup?provider_source=alicloud), fill in the **Folder URI** field, click **Click here to create new one with AWS CloudFormation** under the **Role ARN** field, and then follow the on-screen instructions in the **Add New Role ARN** dialog.
+        > - If you have any trouble creating the IAM role using AWS CloudFormation, click **Having trouble? Create Role ARN manually** in the **Add New Role ARN** dialog to get the TiDB Cloud Account ID and TiDB Cloud External ID, and then follow the steps in [Configure Amazon S3 access using a Role ARN](/tidb-cloud/serverless-external-storage.md#configure-amazon-s3-access-using-a-role-arn) to create the role manually. When configuring the IAM role, make sure to enter the TiDB Cloud account ID in the **Account ID** field and select **Require external ID** to protect against [confused deputy attacks](https://docs.aws.amazon.com/IAM/latest/UserGuide/confused-deputy.html).
+        > - To enhance security, you can reduce the valid duration of the IAM role by configuring a shorter **Max session duration**. For more information, see [Update the maximum session duration for a role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_update-role-settings.html#id_roles_update-session-duration) in the AWS documentation.
+
+    - `external-id`: Specifies the TiDB Cloud External ID, which is required for TiDB Cloud to access Amazon S3 data. You can obtain this ID from the **Add New Role ARN** dialog in the [TiDB Cloud console](https://console.tidb.io/signup?provider_source=alicloud). For more information, see [Configure Amazon S3 access using a Role ARN](/tidb-cloud/serverless-external-storage.md#configure-amazon-s3-access-using-a-role-arn).
+
 The following is an example of an Amazon S3 URI for [`IMPORT INTO`](/sql-statements/sql-statement-import-into.md). In this example, you need to specify a specific filename `test.csv`.
 
 ```shell
 s3://external/test.csv?access-key=${access-key}&secret-access-key=${secret-access-key}
 ```
 
+</CustomContent>
+
 ## GCS URI format
 
 - `scheme`: `gcs` or `gs`
@@ -64,12 +96,16 @@ s3://external/test.csv?access-key=${access-key}&secret-access-key=${secret-acces
 - `storage-class`: Specifies the storage class of the uploaded objects (for example, `STANDARD` or `COLDLINE`)
 - `predefined-acl`: Specifies the predefined ACL of the uploaded objects (for example, `private` or `project-private`)
 
+<CustomContent platform="tidb">
+
 The following is an example of a GCS URI for TiDB Lightning and BR. In this example, you need to specify a specific file path `testfolder`.
 
 ```shell
 gcs://external/testfolder?credentials-file=${credentials-file-path}
 ```
 
+</CustomContent>
+
 The following is an example of a GCS URI for [`IMPORT INTO`](/sql-statements/sql-statement-import-into.md). In this example, you need to specify a specific filename `test.csv`.
 
 ```shell
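The Amazon S3 parameters added in this hunk are plain URL query parameters, so a URI like the `IMPORT INTO` example above can be assembled with standard URL encoding. A minimal Python sketch of that assembly; the bucket name, object path, and credential values are placeholders, not real credentials:

```python
# Sketch: building an Amazon S3 URI with the query parameters documented above.
# Bucket name, object path, and credential values are placeholders.
from urllib.parse import urlencode


def build_s3_uri(bucket: str, path: str, **params: str) -> str:
    # Parameter names such as "access-key" contain hyphens, so callers
    # pass them via a dict expanded with ** rather than as keyword literals.
    query = urlencode(params)
    return f"s3://{bucket}/{path}" + (f"?{query}" if query else "")


uri = build_s3_uri(
    "external",
    "test.csv",
    **{"access-key": "AKIAEXAMPLE", "secret-access-key": "SECRETEXAMPLE"},
)
print(uri)  # s3://external/test.csv?access-key=AKIAEXAMPLE&secret-access-key=SECRETEXAMPLE
```

`urlencode` percent-escapes characters that are unsafe in a query string, which matters for secret keys and session tokens that may contain `/` or `+`.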

markdown-pages/en/tidbcloud/master/tidb-cloud/csv-config-for-import-data.md

Lines changed: 1 addition & 1 deletion

@@ -7,7 +7,7 @@ summary: Learn how to use CSV configurations for the Import Data service on TiDB
 
 This document introduces CSV configurations for the Import Data service on TiDB Cloud.
 
-The following is the CSV Configuration window when you use the Import Data service on TiDB Cloud to import CSV files. For more information, see [Import CSV Files from Amazon S3, GCS, or Azure Blob Storage into TiDB Cloud Starter](/tidb-cloud/import-csv-files-serverless.md).
+The following is the CSV Configuration window when you use the Import Data service on TiDB Cloud to import CSV files. For more information, see [Import CSV Files from Cloud Storage into TiDB Cloud Starter](/tidb-cloud/import-csv-files-serverless.md).
 
 ![CSV Configurations](https://docs-download.pingcap.com/media/images/docs/tidb-cloud/import-data-csv-config.png)
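The CSV Configuration window described in this file controls how separators, quote characters, and similar options are interpreted during import. As a loose analogy only (not TiDB Cloud's implementation), the same knobs exist in Python's `csv` module; the sample data, the `|` separator, and the `"` quote character below are invented for illustration:

```python
# Sketch: separator and quote-character options similar to those in a CSV
# import configuration, expressed with Python's csv module. The sample
# data, '|' separator, and '"' quote character are illustrative only.
import csv
import io

sample = 'id|"name"\n1|"TiDB"\n2|"Cloud"\n'
reader = csv.reader(io.StringIO(sample), delimiter="|", quotechar='"')
header, *rows = list(reader)
print(header)  # ['id', 'name']
print(rows)    # [['1', 'TiDB'], ['2', 'Cloud']]
```

Getting these options wrong is the most common cause of mis-split columns during import, which is why the window exposes them explicitly.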

markdown-pages/en/tidbcloud/master/tidb-cloud/import-csv-files-serverless.md

Lines changed: 2 additions & 2 deletions

@@ -1,9 +1,9 @@
 ---
-title: Import CSV Files from Amazon S3, GCS, Azure Blob Storage, or Alibaba Cloud OSS into TiDB Cloud Starter
+title: Import CSV Files from Cloud Storage into TiDB Cloud Starter
 summary: Learn how to import CSV files from Amazon S3, GCS, Azure Blob Storage, or Alibaba Cloud Object Storage Service (OSS) into TiDB Cloud Starter.
 ---
 
-# Import CSV Files from Amazon S3, GCS, Azure Blob Storage, or Alibaba Cloud OSS into TiDB Cloud Starter
+# Import CSV Files from Cloud Storage into TiDB Cloud Starter
 
 This document describes how to import CSV files from Amazon Simple Storage Service (Amazon S3), Google Cloud Storage (GCS), Azure Blob Storage, or Alibaba Cloud Object Storage Service (OSS) into TiDB Cloud Starter.

markdown-pages/en/tidbcloud/master/tidb-cloud/import-parquet-files-serverless.md

Lines changed: 2 additions & 2 deletions

@@ -1,9 +1,9 @@
 ---
-title: Import Apache Parquet Files from Amazon S3, GCS, Azure Blob Storage, or Alibaba Cloud OSS into TiDB Cloud Starter
+title: Import Apache Parquet Files from Cloud Storage into TiDB Cloud Starter
 summary: Learn how to import Apache Parquet files from Amazon S3, GCS, Azure Blob Storage, or Alibaba Cloud Object Storage Service (OSS) into TiDB Cloud Starter.
 ---
 
-# Import Apache Parquet Files from Amazon S3, GCS, Azure Blob Storage, or Alibaba Cloud OSS into TiDB Cloud Starter
+# Import Apache Parquet Files from Cloud Storage into TiDB Cloud Starter
 
 You can import both uncompressed and Snappy compressed [Apache Parquet](https://parquet.apache.org/) format data files to TiDB Cloud Starter. This document describes how to import Parquet files from Amazon Simple Storage Service (Amazon S3), Google Cloud Storage (GCS), Azure Blob Storage, or Alibaba Cloud Object Storage Service (OSS) into TiDB Cloud Starter.

markdown-pages/en/tidbcloud/master/tidb-cloud/import-sample-data-serverless.md

Lines changed: 1 addition & 3 deletions

@@ -21,9 +21,7 @@ This document describes how to import the sample data into TiDB Cloud Starter vi
 2. Click the name of your target cluster to go to its overview page, and then click **Data** > **Import** in the left navigation pane.
 
-2. Select **Import data from S3**.
-
-    If this is your first time importing data into this cluster, select **Import From Amazon S3**.
+2. Select **Import data from Cloud Storage**, and then click **Amazon S3**.
 
 3. On the **Import Data from Amazon S3** page, configure the following source data information:

markdown-pages/en/tidbcloud/master/tidb-cloud/import-snapshot-files.md

Lines changed: 1 addition & 1 deletion

@@ -7,4 +7,4 @@ summary: Learn how to import Amazon Aurora or RDS for MySQL snapshot files into
 
 You can import Amazon Aurora or RDS for MySQL snapshot files into TiDB Cloud. Note that all source data files with the `.parquet` suffix in the `{db_name}.{table_name}/` folder must conform to the [naming convention](/tidb-cloud/naming-conventions-for-data-import.md).
 
-The process of importing snapshot files is similar to that of importing Parquet files. For more information, see [Import Apache Parquet Files from Amazon S3, GCS, or Azure Blob Storage into TiDB Cloud Starter](/tidb-cloud/import-parquet-files-serverless.md).
+The process of importing snapshot files is similar to that of importing Parquet files. For more information, see [Import Apache Parquet Files from Cloud Storage into TiDB Cloud Starter](/tidb-cloud/import-parquet-files-serverless.md).

markdown-pages/en/tidbcloud/master/tidb-cloud/migrate-sql-shards.md

Lines changed: 1 addition & 3 deletions

@@ -185,9 +185,7 @@ After configuring the Amazon S3 access, you can perform the data import task in
 2. Click the name of your target cluster to go to its overview page, and then click **Data** > **Import** in the left navigation pane.
 
-2. Select **Import data from S3**.
-
-    If this is your first time importing data into this cluster, select **Import From Amazon S3**.
+2. Select **Import data from Cloud Storage**, and then click **Amazon S3**.
 
 3. On the **Import Data from Amazon S3** page, fill in the following information:

markdown-pages/en/tidbcloud/master/tidb-cloud/tidb-cloud-migration-overview.md

Lines changed: 6 additions & 6 deletions

@@ -33,19 +33,19 @@ If you have data files in SQL, CSV, Parquet, or Aurora Snapshot formats, you can
 
 You can import sample data (SQL file) to TiDB Cloud to quickly get familiar with the TiDB Cloud interface and the import process. For more information, see [Import Sample Data to TiDB Cloud](/tidb-cloud/import-sample-data-serverless.md).
 
-- Import CSV files from Amazon S3 or GCS into TiDB Cloud
+- Import CSV files from cloud storage into TiDB Cloud
 
-    You can import CSV files from Amazon S3 or GCS into TiDB Cloud. For more information, see [Import CSV Files from Amazon S3, GCS, or Azure Blob Storage into TiDB Cloud Starter](/tidb-cloud/import-csv-files-serverless.md).
+    You can import CSV files from cloud storage such as Amazon S3 into TiDB Cloud. For more information, see [Import CSV Files from Cloud Storage into TiDB Cloud Starter](/tidb-cloud/import-csv-files-serverless.md).
 
-- Import Apache Parquet files from Amazon S3 or GCS into TiDB Cloud
+- Import Apache Parquet files from cloud storage into TiDB Cloud
 
-    You can import Parquet files from Amazon S3 or GCS into TiDB Cloud. For more information, see [Import Apache Parquet Files from Amazon S3 or GCS into TiDB Cloud](/tidb-cloud/import-parquet-files-serverless.md).
+    You can import Parquet files from cloud storage such as Amazon S3 into TiDB Cloud. For more information, see [Import Apache Parquet Files from Cloud Storage into TiDB Cloud Starter](/tidb-cloud/import-parquet-files-serverless.md).
 
 ## Reference
 
-### Configure Amazon S3 access and GCS access
+### Configure cloud storage access
 
-If your source data is stored in Amazon S3 or Google Cloud Storage (GCS) buckets, before importing or migrating the data to TiDB Cloud, you need to configure access to the buckets. For more information, see [Configure Amazon S3 access and GCS access](/tidb-cloud/serverless-external-storage.md).
+If your source data is stored in cloud storage, before importing or migrating the data to TiDB Cloud, you need to configure access to the storage. For more information, see [Configure cloud storage access](/tidb-cloud/serverless-external-storage.md).
 
 ### Naming conventions for data import
