
Commit 9f984ec

Revert "Merge branch 'main' of https://github.com/crate/cloud-docs"
This reverts commit 09e420c, reversing changes made to db9122d.
1 parent 09e420c commit 9f984ec

File tree

2 files changed (+29, −30 lines)

docs/cluster/import.md

Lines changed: 26 additions & 29 deletions
@@ -2,21 +2,19 @@
 # Import
 
 The first thing you see in the "Import" tab is the history of your
-import jobs. You can see whether you imported from a URL or from a file,
-the source file name and the target table name, and other metadata
-like date and status.
-By navigating to "Show details", you can display details of a particular
-import job.
+imports. You can see whether you imported from a URL or from a file,
+file name, table into which you imported, date, and status. By clicking
+"Show details" you can display details of a particular import.
 
-Clicking the "Import new data" button will bring up the page
+Clicking the "Import new data" button will bring up the Import page,
 where you can select the source of your data.
 
 If you don't have a dataset prepared, we also provide an example in the
 URL import section. It's the New York City taxi trip dataset for July
 of 2019 (about 6.3M records).
 
 (cluster-import-url)=
-## URL
+## Import from URL
 
 To import data, fill out the URL, name of the table which will be
 created and populated with your data, data format, and whether it is
@@ -36,7 +34,7 @@ Gzip compressed files are also supported.
 ![Cloud Console cluster upload from URL](../_assets/img/cluster-import-tab-url.png)
 
 (cluster-import-s3)=
-## S3 bucket
+## Import from private S3 bucket
 
 CrateDB Cloud allows convenient imports directly from S3-compatible
 storage. To import a file form bucket, provide the name of your bucket,
@@ -50,46 +48,45 @@ imports. The usual file formats are supported - CSV (all variants), JSON
 ![Cloud Console cluster upload from S3](../_assets/img/cluster-import-tab-s3.png)
 
 :::{note}
-It is important to make sure that you have the right permissions to
+It's important to make sure that you have the right permissions to
 access objects in the specified bucket. For AWS S3, your user should
 have a policy that allows GetObject access, for example:
 
-```json
-{
-  "Version": "2012-10-17",
-  "Statement": [
-    {
-      "Sid": "AllowGetObject",
-      "Effect": "Allow",
-      "Principal": {
-        "AWS": "*"
-      },
-      "Action": "s3:GetObject",
-      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
-    }
-  ]
-}
-```
+:::{code}
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Sid": "AllowGetObject",
+      "Effect": "Allow",
+      "Principal": {
+        "AWS": "*"
+      },
+      "Action": "s3:GetObject",
+      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
+    }]
+}
+:::
 :::
 
 (cluster-import-azure)=
-## Azure Blob Storage
+## Import from Azure Blob Storage Container
 
 Importing data from private Azure Blob Storage containers is possible
 using a stored secret, which includes a secret name and either an Azure
 Storage Connection string or an Azure SAS Token URL. An admin user at
 the organization level can add this secret.
 
 You can specify a secret, a container, a table and a path in the form
-`/folder/my_file.parquet`.
+[/folder/my_file.parquet]
 
 As with other imports Parquet, CSV, and JSON files are supported. File
 size limitation for imports is 10 GiB per file.
 
 ![Cloud Console cluster upload from Azure Storage Container](../_assets/img/cluster-import-tab-azure.png)
 
 (cluster-import-globbing)=
-## Globbing
+## Importing multiple files
 
 Importing multiple files, also known as import globbing is supported in
 any s3-compatible blob storage. The steps are the same as if importing
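The GetObject policy changed in the hunk above can be sanity-checked programmatically before configuring an import. A minimal sketch in Python; the `allows_get_object` helper is illustrative and not part of the docs or of any AWS SDK:

```python
import json

# The GetObject policy from the note above, as you might paste it
# from the AWS console (the bucket name is a placeholder).
policy_text = """
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetObject",
      "Effect": "Allow",
      "Principal": {"AWS": "*"},
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
    }
  ]
}
"""

def allows_get_object(policy: dict) -> bool:
    """Return True if any statement in the policy allows s3:GetObject."""
    for stmt in policy.get("Statement", []):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]  # Action may be a string or a list
        if stmt.get("Effect") == "Allow" and "s3:GetObject" in actions:
            return True
    return False

policy = json.loads(policy_text)
print(allows_get_object(policy))  # → True
```

Parsing the policy with `json.loads` also catches the kind of syntax slip (a stray bracket, a missing comma) that would make AWS reject the policy outright.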
@@ -115,7 +112,7 @@ As with other imports, the supported file types are CSV, JSON, and
 Parquet.
 
 (cluster-import-file)=
-## File
+## Import from file
 
 Uploading directly from your computer offers more control over your
 data. From the security point of view, you don't have to share the data
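The "Importing multiple files" section touched in the hunks above relies on glob patterns to select several objects at once. The sketch below illustrates how such a pattern selects object keys, using Python's `fnmatch` semantics; the keys are hypothetical, and the exact matching rules CrateDB Cloud applies are an assumption here, not taken from the docs:

```python
from fnmatch import fnmatch

# Hypothetical object keys in an S3-compatible bucket.
keys = [
    "data/2019/07/trips-001.parquet",
    "data/2019/07/trips-002.parquet",
    "data/2019/07/readme.txt",
    "data/2019/08/trips-001.parquet",
]

# A glob pattern selecting all Parquet files for one month.
pattern = "data/2019/07/*.parquet"

matched = [k for k in keys if fnmatch(k, pattern)]
print(matched)
# → ['data/2019/07/trips-001.parquet', 'data/2019/07/trips-002.parquet']
```

Only the two Parquet files under the `07` prefix match; the `.txt` file and the `08` objects are skipped.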

docs/reference/glossary.md

Lines changed: 3 additions & 1 deletion
@@ -72,7 +72,9 @@ and queries per second. Only actual cluster usage is billed.
 
 A cluster has a name, a unique ID, as well as a storage and processing
 capacity and a number of nodes. Note that clusters are also versioned.
-You can deploy a free cluster in [CrateDB Cloud](https://console.cratedb.cloud).
+For information on how to deploy a cluster, please see the [tutorial for
+deploying a CrateDB Cloud cluster from
+scratch](https://cratedb.com/docs/cloud/en/latest/tutorials/quick-start.html).
 
 (gloss-console)=
 ## Console
