# Import

The first thing you see in the "Import" tab is the history of your
import jobs. You can see whether you imported from a URL or from a file,
the source file name and the target table name, and other metadata
like date and status.
By navigating to "Show details", you can display details of a particular
import job.

Clicking the "Import new data" button will bring up the page
where you can select the source of your data.

If you don't have a dataset prepared, we also provide an example in the
URL import section. It's the New York City taxi trip dataset for July
of 2019 (about 6.3M records).
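
If you would like to inspect a dataset before importing it, a short
script is enough. The sketch below is only illustrative: the URL is a
placeholder rather than the actual link shown in the Console, and it
assumes the file is a gzip-compressed CSV.

```python
# Preview the first rows of a remote CSV before importing it.
# The URL is a placeholder - substitute your own dataset or the example
# dataset link from the Console.
import pandas as pd

EXAMPLE_URL = "https://example.com/yellow_tripdata_2019-07.csv.gz"

# pandas reads gzip-compressed CSVs straight from a URL; nrows keeps the
# preview small instead of loading all ~6.3M records.
preview = pd.read_csv(EXAMPLE_URL, nrows=100)
print(preview.dtypes)
print(preview.head())
```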

(cluster-import-url)=
## URL

To import data, fill out the URL, the name of the table which will be
created and populated with your data, the data format, and whether it is
compressed. Gzip compressed files are also supported.

![Cloud Console cluster upload from URL](../_assets/img/cluster-import-tab-url.png)
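
Before submitting a URL import, it can save time to confirm that the
file is actually reachable and to check its reported size and content
type. A minimal sketch using only the Python standard library; the URL
is a placeholder:

```python
# Check that a source URL is publicly reachable before creating an
# import job. The URL is a placeholder for your own dataset.
import urllib.request

url = "https://example.com/data/my_data.csv.gz"

request = urllib.request.Request(url, method="HEAD")
with urllib.request.urlopen(request, timeout=30) as response:
    print("Status:", response.status)
    print("Content-Type:", response.headers.get("Content-Type"))
    print("Size (bytes):", response.headers.get("Content-Length"))
```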

(cluster-import-s3)=
## S3 bucket

CrateDB Cloud allows convenient imports directly from S3-compatible
storage. To import a file from a bucket, provide the name of your bucket
and the path to the file you want to import. The usual file formats are
supported - CSV (all variants), JSON, and Parquet.

![Cloud Console cluster upload from S3](../_assets/img/cluster-import-tab-s3.png)

:::{note}
It is important to make sure that you have the right permissions to
access objects in the specified bucket. For AWS S3, your user should
have a policy that allows GetObject access, for example:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetObject",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
    }
  ]
}
```
:::
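
If an import from S3 fails with an access error, it is usually this
permission that is missing. You can test the same access path outside
CrateDB Cloud with boto3; bucket name, object key, and credentials
below are placeholders:

```python
# Verify that the provided credentials can read the object that
# CrateDB Cloud will import. All names and keys are placeholders.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    aws_access_key_id="EXAMPLE-ACCESS-KEY",
    aws_secret_access_key="EXAMPLE-SECRET-KEY",
)

try:
    # head_object requires the same s3:GetObject permission and also
    # returns the object size.
    meta = s3.head_object(Bucket="EXAMPLE-BUCKET-NAME", Key="folder/my_file.parquet")
    print("Object is readable, size in bytes:", meta["ContentLength"])
except ClientError as err:
    print("Access check failed:", err.response["Error"]["Code"])
```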

(cluster-import-azure)=
## Azure Blob Storage

Importing data from private Azure Blob Storage containers is possible
using a stored secret, which includes a secret name and either an Azure
Storage Connection string or an Azure SAS Token URL. An admin user at
the organization level can add this secret.

You can specify a secret, a container, a table, and a path in the form
`/folder/my_file.parquet`.
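
Before starting the import, you may want to check that the stored
connection string (or SAS token URL) actually reaches the container and
that the path points at an existing blob. A small sketch using the
`azure-storage-blob` package; all names below are placeholders:

```python
# Check that a connection string can reach the container and that the
# blob referenced by the import path exists. All values are placeholders.
from azure.storage.blob import BlobServiceClient

connection_string = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
container = "example-container"
blob_path = "folder/my_file.parquet"  # the path you would enter in the Console

service = BlobServiceClient.from_connection_string(connection_string)
blob = service.get_blob_client(container=container, blob=blob_path)

if blob.exists():
    props = blob.get_blob_properties()
    print("Blob found, size in bytes:", props.size)
else:
    print("Blob not found - check the container name and path")
```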

As with other imports, Parquet, CSV, and JSON files are supported. The
file size limit for imports is 10 GiB per file.

![Cloud Console cluster upload from Azure Storage Container](../_assets/img/cluster-import-tab-azure.png)

(cluster-import-globbing)=
## Globbing

Importing multiple files, also known as import globbing, is supported in
any S3-compatible blob storage. The steps are the same as when importing
a single file.

As with other imports, the supported file types are CSV, JSON, and
Parquet.
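
To sanity-check a wildcard path before submitting a globbing import,
you can list the matching objects yourself. The sketch below uses boto3
and `fnmatch` with a placeholder bucket and pattern; it only previews
which keys a `*` pattern would match and is not the import mechanism
itself:

```python
# Preview which objects in a bucket match a wildcard path such as
# "folder/*.parquet" before using it in a globbing import.
# Bucket name and pattern are placeholders.
import fnmatch

import boto3

bucket = "EXAMPLE-BUCKET-NAME"
pattern = "folder/*.parquet"

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

# Restrict the listing to the fixed prefix before the wildcard.
prefix = pattern.split("*", 1)[0]

matches = []
for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
    for obj in page.get("Contents", []):
        if fnmatch.fnmatch(obj["Key"], pattern):
            matches.append(obj["Key"])

print(f"{len(matches)} objects match {pattern!r}")
for key in matches:
    print(" ", key)
```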

(cluster-import-file)=
## File

Uploading directly from your computer offers more control over your
data. From the security point of view, you don't have to share the data