
Commit 63375ca

move data-management.md content to bulk-operations.md

1 parent c2d70eb · commit 63375ca

File tree

4 files changed: +99 −100 lines

docs/SUMMARY.md

Lines changed: 0 additions & 1 deletion

@@ -32,7 +32,6 @@
 * [Jobs](developers/operations-api/jobs.md)
 * [Logs](developers/operations-api/logs.md)
 * [System Operations](developers/operations-api/system-operations.md)
-* [Data Management](developers/operations-api/data-management.md)
 * [Configuration](developers/operations-api/configuration.md)
 * [Certificate Management](developers/operations-api/certificate-management.md)
 * [Token Authentication](developers/operations-api/token-authentication.md)

docs/developers/operations-api/README.md

Lines changed: 0 additions & 1 deletion

@@ -27,7 +27,6 @@ The operations API reference is available below and categorized by topic:
 * [Jobs](jobs.md)
 * [Logs](logs.md)
 * [System Operations](system-operations.md)
-* [Data Management](data-management.md)
 * [Configuration](configuration.md)
 * [Certificate Management](certificate-management.md)
 * [Token Authentication](token-authentication.md)

docs/developers/operations-api/bulk-operations.md

Lines changed: 99 additions & 0 deletions

@@ -1,5 +1,36 @@
 # Bulk Operations
 
+## Export Local
+Exports the result of the given search operation to a local file in JSON or CSV format.
+
+* operation _(required)_ - must always be `export_local`
+* format _(required)_ - the format in which to export the data; options are `json` and `csv`
+* path _(required)_ - server-local path where the exported data will be written
+* search_operation _(required)_ - the search operation to export: `search_by_hash`, `search_by_value`, `search_by_conditions` or `sql`
+* filename _(optional)_ - the name of the file your export will be written to (do not include an extension). If one is not provided, it will be autogenerated from the epoch.
+
+### Body
+```json
+{
+  "operation": "export_local",
+  "format": "json",
+  "path": "/data/",
+  "search_operation": {
+    "operation": "sql",
+    "sql": "SELECT * FROM dev.breed"
+  }
+}
+```
+
+### Response: 200
+```json
+{
+  "message": "Starting job with id 6fc18eaa-3504-4374-815c-44840a12e7e5"
+}
+```
+
+---
+
 ## CSV Data Load
 Ingests CSV data, provided directly in the operation, as an `insert`, `update` or `upsert` into the specified database table.
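The `export_local` body is plain JSON, so it can be assembled and validated client-side before being submitted. A minimal Python sketch; the `build_export_local` helper and its validation are illustrative, not part of the operations API:

```python
import json

VALID_FORMATS = {"json", "csv"}

def build_export_local(fmt, path, search_operation, filename=None):
    """Build an export_local request body from the documented parameters.

    filename is optional; when omitted the server autogenerates one
    from the epoch, so we simply leave the key out.
    """
    if fmt not in VALID_FORMATS:
        raise ValueError(f"format must be one of {sorted(VALID_FORMATS)}")
    body = {
        "operation": "export_local",
        "format": fmt,
        "path": path,
        "search_operation": search_operation,
    }
    if filename is not None:
        body["filename"] = filename  # no extension; the server adds it
    return body

body = build_export_local(
    "json", "/data/", {"operation": "sql", "sql": "SELECT * FROM dev.breed"}
)
print(json.dumps(body, indent=2))
```

The resulting dict would then be POSTed to the instance's operations endpoint with your credentials, like any other operation.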

@@ -92,6 +123,43 @@ Ingests CSV data, provided via URL, as an `insert`, `update` or `upsert` into the specified database table.
 
 ---
 
+## Export To S3
+Exports the result of the given search operation from a table to AWS S3 in JSON or CSV format.
+
+* operation _(required)_ - must always be `export_to_s3`
+* format _(required)_ - the format in which to export the data; options are `json` and `csv`
+* s3 _(required)_ - object detailing the access keys, bucket, bucket region and object key for saving the data to S3
+* search_operation _(required)_ - the search operation to export: `search_by_hash`, `search_by_value`, `search_by_conditions` or `sql`
+
+### Body
+```json
+{
+  "operation": "export_to_s3",
+  "format": "json",
+  "s3": {
+    "aws_access_key_id": "YOUR_KEY",
+    "aws_secret_access_key": "YOUR_SECRET_KEY",
+    "bucket": "BUCKET_NAME",
+    "key": "OBJECT_NAME",
+    "region": "BUCKET_REGION"
+  },
+  "search_operation": {
+    "operation": "sql",
+    "sql": "SELECT * FROM dev.dog"
+  }
+}
+```
+
+### Response: 200
+```json
+{
+  "message": "Starting job with id 9fa85968-4cb1-4008-976e-506c4b13fc4a",
+  "job_id": "9fa85968-4cb1-4008-976e-506c4b13fc4a"
+}
+```
+
+---
+
 ## Import from S3
 This operation allows users to import CSV or JSON files from an AWS S3 bucket as an `insert`, `update` or `upsert`.
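Because the `s3` object must carry all five fields, it is worth checking them client-side before submitting `export_to_s3` and waiting on the job. A hypothetical Python sketch; the helper and its key list simply mirror the parameters documented above, and are not an official client:

```python
import json

REQUIRED_S3_KEYS = {
    "aws_access_key_id", "aws_secret_access_key", "bucket", "key", "region"
}

def build_export_to_s3(fmt, s3, search_operation):
    """Build an export_to_s3 request body, failing fast on a bad s3 object."""
    if fmt not in {"json", "csv"}:
        raise ValueError("format must be 'json' or 'csv'")
    missing = REQUIRED_S3_KEYS - s3.keys()
    if missing:
        raise ValueError(f"s3 object is missing: {sorted(missing)}")
    return {
        "operation": "export_to_s3",
        "format": fmt,
        "s3": s3,
        "search_operation": search_operation,
    }

body = build_export_to_s3(
    "json",
    {
        "aws_access_key_id": "YOUR_KEY",
        "aws_secret_access_key": "YOUR_SECRET_KEY",
        "bucket": "BUCKET_NAME",
        "key": "OBJECT_NAME",
        "region": "BUCKET_REGION",
    },
    {"operation": "sql", "sql": "SELECT * FROM dev.dog"},
)
print(json.dumps(body, indent=2))
```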

@@ -129,4 +197,35 @@ This operation allows users to import CSV or JSON files from an AWS S3 bucket as an `insert`, `update` or `upsert`.
   "message": "Starting job with id 062a1892-6a0a-4282-9791-0f4c93b12e16",
   "job_id": "062a1892-6a0a-4282-9791-0f4c93b12e16"
 }
+```
+
+---
+
+## Delete Records Before
+
+Deletes data older than the specified timestamp from the specified database table, exclusively on the node where the operation is executed. Any clustered nodes with replicated data will retain that data.
+
+_Operation is restricted to super_user roles only_
+
+* operation _(required)_ - must always be `delete_records_before`
+* date _(required)_ - records older than this date will be deleted. Supported format looks like: `YYYY-MM-DDThh:mm:ss.sZ`
+* schema _(required)_ - name of the schema from which data will be deleted
+* table _(required)_ - name of the table from which data will be deleted
+
+### Body
+```json
+{
+  "operation": "delete_records_before",
+  "date": "2021-01-25T23:05:27.464",
+  "schema": "dev",
+  "table": "breed"
+}
+```
+
+### Response: 200
+```json
+{
+  "message": "Starting job with id d3aed926-e9fe-4ec1-aea7-0fb4451bd373",
+  "job_id": "d3aed926-e9fe-4ec1-aea7-0fb4451bd373"
+}
 ```
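The `date` parameter of `delete_records_before` is the only subtle part: the example body serializes it with millisecond precision (`2021-01-25T23:05:27.464`). A Python sketch of that serialization, using a stdlib `datetime`; the helper name is illustrative:

```python
from datetime import datetime

def build_delete_records_before(cutoff: datetime, schema: str, table: str) -> dict:
    """Build a delete_records_before body; records older than `cutoff`
    in schema.table will be deleted on the node that runs it."""
    # Truncate microseconds to milliseconds to match the documented shape
    date = cutoff.strftime("%Y-%m-%dT%H:%M:%S.") + f"{cutoff.microsecond // 1000:03d}"
    return {
        "operation": "delete_records_before",
        "date": date,
        "schema": schema,
        "table": table,
    }

body = build_delete_records_before(
    datetime(2021, 1, 25, 23, 5, 27, 464000), "dev", "breed"
)
print(body["date"])  # 2021-01-25T23:05:27.464
```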

docs/developers/operations-api/data-management.md

Lines changed: 0 additions & 98 deletions
This file was deleted.
