
Commit 76e1396

docs: add example of readable & writable COPY stream usage (#348)

* docs: add example of readable & writable COPY stream usage
* style: remove semicolon
* add readable stream async iterator example, use latest nodejs docs links

1 parent 1e2e298 commit 76e1396

1 file changed: README.md (+48, −0 lines)
@@ -426,6 +426,54 @@

```js
const result = await sql.file('query.sql', ['Murray', 68])
```

### Rows as Streams

Postgres.js supports [`copy ...`](https://www.postgresql.org/docs/14/sql-copy.html) queries, which are exposed as [Node.js streams](https://nodejs.org/api/stream.html).

> **NOTE** This is a low-level API which does not provide any type safety. To make this work, you must match your [`copy query` parameters](https://www.postgresql.org/docs/14/sql-copy.html) correctly to your [Node.js stream read or write](https://nodejs.org/api/stream.html) code. Ensure [Node.js stream backpressure](https://nodejs.org/en/docs/guides/backpressuring-in-streams/) is handled correctly to avoid memory exhaustion.
#### ```await sql`copy ... from stdin` -> Writable```

```js
const { pipeline } = require('stream/promises')
const { Readable } = require('stream')

// Stream of users with the default tab-delimited cells and newline-delimited rows
const userStream = Readable.from([
  'Murray\t68\n',
  'Walter\t80\n'
])

const query = await sql`copy users (name, age) from stdin`.writable()
await pipeline(userStream, query)
```
#### ```await sql`copy ... to stdout` -> Readable```

##### stream pipeline
```js
const { pipeline } = require('stream/promises')
const { createWriteStream } = require('fs')

const readableStream = await sql`copy users (name, age) to stdout`.readable()
await pipeline(readableStream, createWriteStream('output.tsv'))
// output.tsv content: `Murray\t68\nWalter\t80\n`
```
##### for await...of
```js
const readableStream = await sql`
  copy (
    select name, age
    from users
    where age = 68
  ) to stdout
`.readable()
for await (const chunk of readableStream) {
  // chunk.toString() === `Murray\t68\n`
}
```
### Canceling Queries in Progress
Postgres.js supports [canceling queries in progress](https://www.postgresql.org/docs/7.1/protocol-protocol.html#AEN39000). It works by opening a new connection with a protocol-level startup message to cancel the current query running on a specific connection. That means there is no guarantee that the query will be canceled, and due to the possible race conditions it might even result in canceling another query. This is fine for long-running queries, but in the case of high load and fast queries it might be better to simply ignore results instead of canceling.
