Commit 4c369e0

Merge pull request #17 from glideapps/api-update-glide-pr-29431
OpenAPI spec update from glideapps/glide#29431
2 parents 4174fd7 + 05b6569 commit 4c369e0

File tree

6 files changed, +88 −10 lines changed

api-reference/v2/resources/changelog.mdx

Lines changed: 4 additions & 0 deletions

@@ -3,6 +3,10 @@ title: Glide API Changelog
 sidebarTitle: Changelog
 ---
 
+### September 18, 2024
+
+- Endpoints that receive tabular data can now accept CSV and TSV request bodies.
+
 ### September 13, 2024
 
 - Introduced a new "Limits" document that outlines rate and operational limits for the API.
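The CSV shape these endpoints now accept (per this commit's spec: the first line is column IDs, each subsequent line is a row of data) can be sketched with a small serializer. This is illustrative only; the column IDs and values are made up, and `rows_to_csv` is not part of the Glide API.

```python
import csv
import io

def rows_to_csv(column_ids, rows):
    """Serialize row dicts into a CSV string suitable for a text/csv body."""
    buf = io.StringIO()
    writer = csv.writer(buf, lineterminator="\n")
    writer.writerow(column_ids)  # first line: column IDs
    for row in rows:
        writer.writerow([row.get(col, "") for col in column_ids])
    return buf.getvalue()

body = rows_to_csv(
    ["Name", "Age", "Birthday"],
    [{"Name": "Alice", "Age": 25, "Birthday": "2024-08-29T09:46:16.722Z"}],
)
print(body)  # "Name,Age,Birthday" then "Alice,25,2024-08-29T09:46:16.722Z"
```

Using `csv.writer` rather than string joins keeps values with embedded commas correctly quoted.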

api-reference/v2/stashing/put-stashes-serial.mdx

Lines changed: 2 additions & 0 deletions

@@ -5,6 +5,8 @@ openapi: put /stashes/{stashID}/{serial}
 
 When using large datasets with the Glide API, it may be necessary to break them into smaller chunks for performance and reliability. We call this process "stashing."
 
+Tabular data may be stashed in JSON, CSV, or TSV format.
+
 <Tip>
 To learn more about stashing and how to use it to work with large datasets, please see our [introduction to stashing](/api-reference/v2/stashing/introduction).
 </Tip>
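A minimal sketch of PUTting one CSV chunk into a stash. The `PUT /stashes/{stashID}/{serial}` path and the `text/csv` content type come from this commit; the base URL, bearer-token auth scheme, and helper name are placeholders, not taken from the spec.

```python
from urllib.request import Request

BASE = "https://api.example.com"  # placeholder base URL, not the real endpoint

def stash_chunk_request(stash_id, serial, csv_chunk, token):
    """Build (but do not send) a PUT request for one CSV chunk of a stash."""
    return Request(
        url=f"{BASE}/stashes/{stash_id}/{serial}",
        data=csv_chunk.encode("utf-8"),
        method="PUT",
        headers={
            "Content-Type": "text/csv",
            "Authorization": f"Bearer {token}",  # placeholder auth scheme
        },
    )

req = stash_chunk_request("stash-123", 1, "Name,Age\nAlice,25\n", "TOKEN")
```

Each chunk gets its own serial; the stash is assembled server-side in serial order.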

api-reference/v2/tables/post-table-rows.mdx

Lines changed: 2 additions & 0 deletions

@@ -5,6 +5,8 @@ openapi: post /tables/{tableID}/rows
 
 Add row data to an existing Big Table.
 
+Row data may be passed in JSON, CSV, or TSV format.
+
 ## Examples
 
 <AccordionGroup>
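The TSV variant (`text/tab-separated-values`) follows the same layout as CSV: first line column IDs, then one row per line. A sketch of building such a body; `rows_to_tsv` is a hypothetical helper, and this naive join assumes values contain no tabs or newlines.

```python
def rows_to_tsv(column_ids, rows):
    """Serialize row dicts into a TSV string (no escaping of tabs/newlines)."""
    lines = ["\t".join(column_ids)]  # first line: column IDs
    for row in rows:
        lines.append("\t".join(str(row[c]) for c in column_ids))
    return "\n".join(lines)

body = rows_to_tsv(
    ["Name", "Age"],
    [{"Name": "Bob", "Age": 30}],
)
# POST this body to /tables/{tableID}/rows with
# Content-Type: text/tab-separated-values
```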

api-reference/v2/tables/post-tables.mdx

Lines changed: 2 additions & 0 deletions

@@ -5,6 +5,8 @@ openapi: post /tables
 
 Create a new Big Table, define its structure, and (optionally) populate it with data.
 
+When using a CSV or TSV request body, the name of the table must be passed as a query parameter and the schema of the table is inferred from the content. Alternatively, the CSV/TSV content may be [stashed](/api-reference/v2/stashing/introduction), and then the schema and name may be passed in the regular JSON payload.
+
 ## Examples
 
 <AccordionGroup>
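Because a CSV/TSV body cannot carry the table name, this commit adds a `name` query parameter on `POST /tables`. A sketch of building that URL; the base URL and helper name are placeholders.

```python
from urllib.parse import urlencode

BASE = "https://api.example.com"  # placeholder base URL

def create_table_url(name):
    """URL for creating a table from a CSV/TSV body; the schema is
    inferred from the content, so only the name is passed here."""
    return f"{BASE}/tables?{urlencode({'name': name})}"

print(create_table_url("Invoices"))
# https://api.example.com/tables?name=Invoices
```

Per the spec text, passing a name both here and in a JSON body is an error, so use the query parameter only with CSV/TSV bodies.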

api-reference/v2/tables/put-tables.mdx

Lines changed: 2 additions & 0 deletions

@@ -5,6 +5,8 @@ openapi: put /tables/{tableID}
 
 Overwrite an existing Big Table by clearing all rows and adding new data. You can also update the table schema.
 
+When using a CSV or TSV request body, you cannot pass a schema. The current schema will be used. If you need to update the schema, use the `onSchemaError=updateSchema` query parameter, or [stash](/api-reference/v2/stashing/introduction) the CSV/TSV data and pass a JSON request body.
+
 <Warning>
 This is a destructive operation that cannot be undone.
 </Warning>
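A sketch of the two overwrite modes described above: with the table's current schema, or with `onSchemaError=updateSchema` appended to allow schema changes from the CSV/TSV content. The base URL and helper name are placeholders; only the path shape and query parameter come from this commit.

```python
from urllib.parse import urlencode

BASE = "https://api.example.com"  # placeholder base URL

def overwrite_table_url(table_id, update_schema=False):
    """URL for PUT /tables/{tableID}; optionally allow schema updates
    when the body is CSV/TSV and columns differ from the current schema."""
    url = f"{BASE}/tables/{table_id}"
    if update_schema:
        url += "?" + urlencode({"onSchemaError": "updateSchema"})
    return url

print(overwrite_table_url("t-123", update_schema=True))
# https://api.example.com/tables/t-123?onSchemaError=updateSchema
```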

openapi/swagger.json

Lines changed: 76 additions & 10 deletions

@@ -215,8 +215,17 @@
         }
       }
     },
-    "description": "Creates a new Big Table",
     "parameters": [
+      {
+        "name": "name",
+        "in": "query",
+        "schema": {
+          "type": "string",
+          "description": "Name of the table. Required when the name is not passed in the request body. It is an error to pass a name in both this query parameter and the request body.",
+          "example": "Invoices"
+        },
+        "required": false
+      },
       {
         "name": "onSchemaError",
         "in": "query",
@@ -386,9 +395,24 @@
           ],
           "additionalProperties": false
         }
+      },
+      "text/csv": {
+        "schema": {
+          "type": "string",
+          "description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data. The schema will be inferred from the data. The name of the table must be passed in the query parameter `name`.",
+          "example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
+        }
+      },
+      "text/tab-separated-values": {
+        "schema": {
+          "type": "string",
+          "description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data. The schema will be inferred from the data. The name of the table must be passed in the query parameter `name`.",
+          "example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
+        }
       }
     }
-  }
+  },
+  "description": "Creates a new Big Table"
 }
 },
 "/tables/{tableID}": {
@@ -537,7 +561,6 @@
         }
       }
     },
-    "description": "Overwrites a Big Table with new schema and/or data",
     "parameters": [
       {
         "name": "tableID",
@@ -712,9 +735,24 @@
           ],
           "additionalProperties": false
         }
+      },
+      "text/csv": {
+        "schema": {
+          "type": "string",
+          "description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data.",
+          "example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
+        }
+      },
+      "text/tab-separated-values": {
+        "schema": {
+          "type": "string",
+          "description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data.",
+          "example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
+        }
       }
     }
-  }
+  },
+  "description": "Replaces the schema and/or data of a Big Table"
 }
 },
 "/tables/{tableID}/rows": {
@@ -862,7 +900,6 @@
         }
       }
     },
-    "description": "Adds rows to a Big Table",
     "parameters": [
       {
         "name": "tableID",
@@ -935,9 +972,24 @@
           }
         ]
       }
+      },
+      "text/csv": {
+        "schema": {
+          "type": "string",
+          "description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data.",
+          "example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
+        }
+      },
+      "text/tab-separated-values": {
+        "schema": {
+          "type": "string",
+          "description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data.",
+          "example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
+        }
       }
     }
-  }
+  },
+  "description": "Adds rows to a Big Table"
 }
 },
 "/stashes/{stashID}/{serial}": {
@@ -989,7 +1041,6 @@
         }
       }
     },
-    "description": "Sets the content of a chunk of data inside a stash",
     "parameters": [
       {
         "name": "stashID",
@@ -1039,9 +1090,24 @@
           }
         ]
       }
+      },
+      "text/csv": {
+        "schema": {
+          "type": "string",
+          "description": "A CSV string. The first line is column IDs, and each subsequent line is a row of data.",
+          "example": "Name,Age,Birthday\nAlice,25,2024-08-29T09:46:16.722Z\nBob,30,2020-01-15T09:00:16.722Z"
+        }
+      },
+      "text/tab-separated-values": {
+        "schema": {
+          "type": "string",
+          "description": "A TSV string. The first line is column IDs, and each subsequent line is a row of data.",
+          "example": "Name\tAge\tBirthday\nAlice\t25\t2024-08-29T09:46:16.722Z\nBob\t30\t2020-01-15T09:00:16.722Z"
+        }
       }
     }
-  }
+  },
+  "description": "Sets the content of a chunk of data inside a stash"
 }
 },
 "/stashes/{stashID}": {
@@ -1093,7 +1159,6 @@
         }
       }
     },
-    "description": "Deletes a stash and all its data",
     "parameters": [
       {
         "name": "stashID",
@@ -1106,7 +1171,8 @@
       },
       "required": true
     }
-  ]
+  ],
+  "description": "Deletes a stash and all its data"
 }
 }
 },
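The spec's own TSV example string decodes straightforwardly; parsing it back shows the expected shape (first line column IDs, then data rows). This uses the exact `example` value from the swagger diff above.

```python
# The "text/tab-separated-values" example string from the updated spec.
example = (
    "Name\tAge\tBirthday\n"
    "Alice\t25\t2024-08-29T09:46:16.722Z\n"
    "Bob\t30\t2020-01-15T09:00:16.722Z"
)

# Split into lines, then split each line on tabs:
# the first line is column IDs, the rest are data rows.
header, *rows = [line.split("\t") for line in example.split("\n")]
print(header)  # ['Name', 'Age', 'Birthday']
print(rows[0])  # ['Alice', '25', '2024-08-29T09:46:16.722Z']
```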
