# Import

The first thing you see in the "Import" tab is the history of your
imports. You can see whether you imported from a URL or from a file,
the file name, the table into which you imported, the date, and the
status. By clicking "Show details" you can display the details of a
particular import.

Clicking the "Import new data" button will bring up the Import page,
where you can select the source of your data.

If you don't have a dataset prepared, we also provide an example in the
URL import section. It's the New York City taxi trip dataset for July
of 2019 (about 6.3M records).

(cluster-import-url)=
## Import from URL

To import data, fill out the URL, the name of the table which will be
created and populated with your data, the data format, and whether the
file is gzip compressed.
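
The check below is not part of the Console workflow, just a small
convenience: if you are unsure whether your source file is gzip
compressed, you can inspect its first two bytes, since gzip files
always start with the magic bytes `0x1f 0x8b`. A minimal Python sketch,
using a placeholder file path:

```python
# Check whether a local file is gzip compressed by inspecting its magic bytes.
# "my_data.csv.gz" is a placeholder path, not a file referenced by this guide.
def is_gzip(path: str) -> bool:
    with open(path, "rb") as f:
        return f.read(2) == b"\x1f\x8b"

print(is_gzip("my_data.csv.gz"))  # True for gzip-compressed files
```
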
![Cloud Console cluster upload from URL](../_assets/img/cluster-import-tab-url.png)

(cluster-import-s3)=
## Import from private S3 bucket

CrateDB Cloud allows convenient imports directly from S3-compatible
storage. To import a file from a bucket, provide the name of your
bucket, the path to the file, and your access credentials. The usual
file formats are supported - CSV (all variants), JSON, and Parquet.
![Cloud Console cluster upload from S3](../_assets/img/cluster-import-tab-s3.png)

:::{note}
It's important to make sure that you have the right permissions to
access objects in the specified bucket. For AWS S3, your user should
have a policy that allows GetObject access, for example:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGetObject",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::EXAMPLE-BUCKET-NAME/*"
    }
  ]
}
```
:::
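
As a quick sanity check (not something the Console requires), you can
verify up front that your credentials can actually read the object you
want to import, for example with the `boto3` client. The bucket name,
object key, and credentials below are placeholders:

```python
# Verify that the given AWS credentials can read an object from the bucket.
# HeadObject requires the same s3:GetObject permission used for reading data.
# All values below are placeholders - replace them with your own.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR-ACCESS-KEY-ID",
    aws_secret_access_key="YOUR-SECRET-ACCESS-KEY",
)

try:
    s3.head_object(Bucket="EXAMPLE-BUCKET-NAME", Key="folder/my_file.parquet")
    print("GetObject access looks fine.")
except ClientError as err:
    print(f"Access check failed: {err}")
```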

(cluster-import-azure)=
## Import from Azure Blob Storage Container

Importing data from private Azure Blob Storage containers is possible
using a stored secret, which includes a secret name and either an Azure
Storage Connection string or an Azure SAS Token URL. An admin user at
the organization level can add this secret.

You can specify a secret, a container, a table, and a path in the form
`/folder/my_file.parquet`.
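
Before adding the stored secret, it can be worth confirming that the
connection string (or SAS token URL) and container name are valid. This
is not part of the original workflow, just a sketch using the
`azure-storage-blob` package; the account, key, container, and folder
below are placeholders:

```python
# Check that an Azure Storage connection string and container are usable
# before saving them as a stored secret. All values are placeholders.
from azure.storage.blob import BlobServiceClient

conn_str = (
    "DefaultEndpointsProtocol=https;"
    "AccountName=YOUR-ACCOUNT;"
    "AccountKey=YOUR-ACCOUNT-KEY;"
    "EndpointSuffix=core.windows.net"
)

service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("my-container")

# List the blobs under the folder you plan to import from.
for blob in container.list_blobs(name_starts_with="folder/"):
    print(blob.name)
```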

As with other imports, Parquet, CSV, and JSON files are supported. The
file size limit for imports is 10 GiB per file.

![Cloud Console cluster upload from Azure Storage Container](../_assets/img/cluster-import-tab-azure.png)

(cluster-import-globbing)=
## Importing multiple files

Importing multiple files, also known as import globbing, is supported
in any S3-compatible blob storage. The steps are the same as when
importing a single file; the difference is that the file path contains
a wildcard matching multiple files. As with other imports, the
supported file types are CSV, JSON, and Parquet.
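
To get an idea of which objects a wildcard pattern would match before
starting the import, you can list the bucket and filter the keys
yourself. This is only a convenience sketch with placeholder names, and
the matching rules of the import service may differ in detail:

```python
# Preview which objects in an S3 bucket match a glob-style pattern such as
# "folder/*.parquet". Bucket name, prefix, and pattern are placeholders.
import fnmatch
import boto3

s3 = boto3.client("s3")
pattern = "folder/*.parquet"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="EXAMPLE-BUCKET-NAME", Prefix="folder/"):
    for obj in page.get("Contents", []):
        # Note: fnmatch's "*" also crosses "/" boundaries.
        if fnmatch.fnmatch(obj["Key"], pattern):
            print(obj["Key"])
```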

(cluster-import-file)=
## Import from file

Uploading directly from your computer offers more control over your
data. From the security point of view, you don't have to share the data