
Commit 09e420c: Merge branch 'main' of https://github.com/crate/cloud-docs
2 parents: db9122d + 2e7605e

2 files changed: +30 −29 lines


docs/cluster/import.md (29 additions, 26 deletions)
```diff
@@ -2,19 +2,21 @@
 # Import
 
 The first thing you see in the "Import" tab is the history of your
-imports. You can see whether you imported from a URL or from a file,
-file name, table into which you imported, date, and status. By clicking
-"Show details" you can display details of a particular import.
+import jobs. You can see whether you imported from a URL or from a file,
+the source file name and the target table name, and other metadata
+like date and status.
+By navigating to "Show details", you can display details of a particular
+import job.
 
-Clicking the "Import new data" button will bring up the Import page,
+Clicking the "Import new data" button will bring up the page
 where you can select the source of your data.
 
 If you don't have a dataset prepared, we also provide an example in the
 URL import section. It's the New York City taxi trip dataset for July
 of 2019 (about 6.3M records).
 
 (cluster-import-url)=
-## Import from URL
+## URL
 
 To import data, fill out the URL, name of the table which will be
 created and populated with your data, data format, and whether it is
```
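The URL import accepts a handful of file formats, optionally gzip-compressed. As a rough sketch of that mapping, here is a hypothetical helper (not part of the Console or its API) that infers the format from a source URL's extension:

```python
from pathlib import PurePosixPath
from urllib.parse import urlparse

# Hypothetical helper, for illustration only: map a source URL's file
# extension to the import format, detecting a trailing .gz suffix.
SUPPORTED = {".csv": "csv", ".json": "json", ".parquet": "parquet"}

def infer_format(url: str) -> tuple[str, bool]:
    """Return (format, is_gzipped); raise ValueError if unsupported."""
    suffixes = PurePosixPath(urlparse(url).path).suffixes
    gzipped = bool(suffixes) and suffixes[-1] == ".gz"
    if gzipped and len(suffixes) > 1:
        ext = suffixes[-2]
    else:
        ext = suffixes[-1] if suffixes else ""
    if ext not in SUPPORTED:
        raise ValueError(f"unsupported file type: {url}")
    return SUPPORTED[ext], gzipped
```

For example, a gzipped CSV such as the taxi dataset would come back as `("csv", True)`.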
```diff
@@ -34,7 +36,7 @@ Gzip compressed files are also supported.
 ![Cloud Console cluster upload from URL](../_assets/img/cluster-import-tab-url.png)
 
 (cluster-import-s3)=
-## Import from private S3 bucket
+## S3 bucket
 
 CrateDB Cloud allows convenient imports directly from S3-compatible
 storage. To import a file from a bucket, provide the name of your bucket,
```
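Since gzip-compressed files are supported, you can shrink a CSV locally before placing it in the bucket. A minimal stdlib sketch:

```python
import gzip

# Sketch: gzip-compress a CSV payload before uploading it to the
# bucket; the import accepts gzip-compressed files directly.
csv_payload = b"id,name\n1,alice\n2,bob\n"
compressed = gzip.compress(csv_payload)

# Round-trip check: decompressing restores the original bytes.
assert gzip.decompress(compressed) == csv_payload
```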
````diff
@@ -48,45 +50,46 @@ imports. The usual file formats are supported - CSV (all variants), JSON
 ![Cloud Console cluster upload from S3](../_assets/img/cluster-import-tab-s3.png)
 
 :::{note}
-It's important to make sure that you have the right permissions to
+It is important to make sure that you have the right permissions to
 access objects in the specified bucket. For AWS S3, your user should
 have a policy that allows GetObject access, for example:
 
-:::{code}
-{
-  "Version": "2012-10-17",
-  "Statement": [
-    {
-      "Sid": "AllowGetObject",
-      "Effect": "Allow",
-      "Principal": {
-        "AWS": "*"
-      },
-      "Action": "s3:GetObject",
-      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
-    }]
-}
-:::
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Sid": "AllowGetObject",
+      "Effect": "Allow",
+      "Principal": {
+        "AWS": "*"
+      },
+      "Action": "s3:GetObject",
+      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
+    }
+  ]
+}
+```
 :::
 
 (cluster-import-azure)=
-## Import from Azure Blob Storage Container
+## Azure Blob Storage
 
 Importing data from private Azure Blob Storage containers is possible
 using a stored secret, which includes a secret name and either an Azure
 Storage Connection string or an Azure SAS Token URL. An admin user at
 the organization level can add this secret.
 
 You can specify a secret, a container, a table and a path in the form
-[/folder/my_file.parquet]
+`/folder/my_file.parquet`.
 
 As with other imports Parquet, CSV, and JSON files are supported. File
 size limitation for imports is 10 GiB per file.
 
 ![Cloud Console cluster upload from Azure Storage Container](../_assets/img/cluster-import-tab-azure.png)
 
 (cluster-import-globbing)=
-## Importing multiple files
+## Globbing
 
 Importing multiple files, also known as import globbing, is supported in
 any S3-compatible blob storage. The steps are the same as if importing
````
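As a rough illustration of what a glob pattern selects (shell-style matching; the Console's exact pattern semantics may differ), Python's `fnmatch` shows which object keys a pattern picks up:

```python
from fnmatch import fnmatch

# Illustration only: shell-style glob matching over object keys.
# The import dialog's exact pattern semantics may differ.
keys = [
    "data/2023/january.parquet",
    "data/2023/february.parquet",
    "data/2023/readme.txt",
]
pattern = "data/2023/*.parquet"
matched = [key for key in keys if fnmatch(key, pattern)]
# matched contains only the two .parquet keys
```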
```diff
@@ -112,7 +115,7 @@ As with other imports, the supported file types are CSV, JSON, and
 Parquet.
 
 (cluster-import-file)=
-## Import from file
+## File
 
 Uploading directly from your computer offers more control over your
 data. From the security point of view, you don't have to share the data
```
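Assuming the same 10 GiB per-file limit stated for the other import sources also applies to direct uploads, a hypothetical pre-flight check before uploading a local file might look like this:

```python
import os

GIB = 1024 ** 3
MAX_IMPORT_BYTES = 10 * GIB  # 10 GiB per-file limit, as noted above

def within_import_limit(path: str) -> bool:
    """Hypothetical pre-flight check before uploading a local file."""
    return os.path.getsize(path) <= MAX_IMPORT_BYTES
```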

docs/reference/glossary.md (1 addition, 3 deletions)
```diff
@@ -72,9 +72,7 @@ and queries per second. Only actual cluster usage is billed.
 A cluster has a name, a unique ID, as well as a storage and processing
 capacity and a number of nodes. Note that clusters are also versioned.
-For information on how to deploy a cluster, please see the [tutorial for
-deploying a CrateDB Cloud cluster from
-scratch](https://cratedb.com/docs/cloud/en/latest/tutorials/quick-start.html).
+You can deploy a free cluster in [CrateDB Cloud](https://console.cratedb.cloud).
 
 (gloss-console)=
 ## Console
```
