Commit 8e3ac4a

Merge pull request #30 from NYPL/NOREF-node20

Updates for v2.0 Node 20

2 parents: 45a1169 + a5e6f70

14 files changed: +8797 -7352 lines

.nvmrc (1 addition & 1 deletion)

```diff
@@ -1 +1 @@
-14
+20
```

.travis.yml (3 additions & 2 deletions)

```diff
@@ -1,3 +1,4 @@
 language: node_js
-node_js:
-- '6.11'
+install: npm install
+script: npm test
+dist: jammy
```

README.md (26 additions & 17 deletions)

````diff
@@ -14,30 +14,38 @@ npm i @nypl/nypl-streams-client --save
 
 ```js
 const NyplStreamsClient = require('@nypl/nypl-Streams-client')
-var streamsClient = new NyplStreamsClient({ nyplDataApiClientBase: 'http://example.com/api/v0.1/' })
+const streamsClient = new NyplStreamsClient({ nyplDataApiClientBase: 'https://example.com/api/v0.1/' })
 ```
 
 See [docs/usage.md](docs/usage.md) for complete documentation of Client methods and use.
 
 ### Example 1: Writing data to a stream
 
 To write a single record to a stream (encoded to "MyStream" schema):
+
 ```js
-streamsClient.write('MyStream', { id: 'id1', field1: 1, field2: 2 }).then((resp) => {
-  console.log('Finished writing to stream ' + resp.Records.length)
-}).catch((e) => console.error('Error writing to stream: ', e))
+try {
+  const response = await streamsClient.write('MyStream', { id: 'id1', field1: 1, field2: 2 })
+  console.log('Finished writing to stream ' + response.Records.length)
+} catch (e) {
+  console.error('Error writing to stream: ', e)
+}
 ```
 
 To write multiple records to a stream, batched and rate-limited to avoid write errors:
+
 ```js
-var records = [ { id: 'id1', field1: 1, field2: 2 }, { id: 'id2', field1: 1 }, ... ] // Array of any length
-var options = {
+const records = [ { id: 'id1', field1: 1, field2: 2 }, { id: 'id2', field1: 1 }, ... ] // Array of any length
+const options = {
   recordsPerSecond: 500 // This is the default and well below the 1000/s AWS constraint
 }
-streamsClient.write('MyStream', records, options).then((resp) => {
+try {
+  const response = await streamsClient.write('MyStream', records, options)
   console.log('Finished writing to stream ' + resp.Records.length)
   console.log(`Failed to write: ${resp.FailedRecordCount} record(s)`)
-}).catch((e) => console.error('Error writing to stream: ', e))
+} catch (e) {
+  console.error('Error writing to stream: ', e)
+}
 ```
 
 Above will resolve after `records.length / 500` seconds. The resolved value is a hash merged from the hashes returned from each putRecords call.
````
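The rate limiting just described amounts to splitting `records` into one batch per second. A minimal sketch of that arithmetic (illustrative only; `batchRecords` is a hypothetical helper, not part of the client):

```javascript
// Hypothetical helper illustrating the batching math described above:
// records are written in groups of `recordsPerSecond`, so a full write
// takes roughly records.length / recordsPerSecond seconds.
function batchRecords (records, recordsPerSecond = 500) {
  const batches = []
  for (let i = 0; i < records.length; i += recordsPerSecond) {
    batches.push(records.slice(i, i + recordsPerSecond))
  }
  return batches
}

// 1200 records at the default 500/s split into 3 batches, matching the
// `records.length / 500` seconds estimate above:
const records = Array.from({ length: 1200 }, (_, i) => ({ id: 'id' + i }))
const batches = batchRecords(records)
console.log(batches.length) // 3
console.log(batches[2].length) // 200
```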
````diff
@@ -49,18 +57,19 @@ The streams client can be used for decoding data obtained directly from a stream
 Example lambda handler with a kinesis trigger:
 
 ```js
-exports.handler = function (event, context, callback) {
+exports.handler = async (event, context, callback) => {
   // Initialize streams client:
   const streamsClient = new NyplStreamsClient({ nyplDataApiClientBase: 'http://example.com/api/v0.1/' })
   const record = event.Records[0]
 
   if (record.kinesis) {
-    const decodedKinesisData = streamsClient.decodeData('SchemaName', event.Records.map(record => record.kinesis.data));
+    const encoded = event.Records.map(record => record.kinesis.data)
 
-    // Resolve the Promise and do something with the decoded data
-    return decodedKinesisData
-      .then((result) => console.log('result:', result))
-      .catch((err) => console.log('rejected:', err));
+    try {
+      const decoded = await streamsClient.decodeData('SchemaName', encoded)
+    } catch (e) => {
+      console.error('Error decoding event: ', e)
+    }
   }
 }
 ```
````
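For context on the `record.kinesis.data` values mapped in the handler above: Kinesis delivers each record payload base64 encoded, so the strings must be decoded back to raw bytes before a schema can be applied. A minimal sketch of just the base64 step (the sample string is illustrative; the client's `decodeData` also handles the Avro layer):

```javascript
// A record payload as it would appear in record.kinesis.data (base64),
// and the raw bytes recovered from it. Avro decoding is omitted here.
const payload = Buffer.from('example-bytes').toString('base64')
const raw = Buffer.from(payload, 'base64')

console.log(payload) // ZXhhbXBsZS1ieXRlcw==
console.log(raw.toString()) // example-bytes
```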
````diff
@@ -71,13 +80,13 @@ The library includes a CLI for writing arbitary events to streams. Care should b
 
 For example, to write a `SierraBibRetrievalRequest` encoded event to the `SierraBibRetriever-qa` stream:
 ```
-cli/nypl-streams.js --envfile config/qa.env --profile nypl-digital-dev write SierraBibRetriever-qa --schemaName SierraBibRetrievalRequest '{ "id": "21747246" }'
+./cli/nypl-streams.js --envfile config/qa.env --profile nypl-digital-dev write SierraBibRetriever-qa --schemaName SierraBibRetrievalRequest '{ "id": "21747246" }'
 ```
 
 ## Git workflow
 
-- Cut feature branch from master.
-- Create PR to merge feature branch into master
+- Cut feature branch from main.
+- Create PR to merge feature branch into main
 - After PR approved by multiple co-workers, the author merges the PR.
 
 ### Publishing to NPMJS
````

cli/nypl-streams.js (6 additions & 17 deletions)

```diff
@@ -12,7 +12,7 @@
 */
 
 const dotenv = require('dotenv')
-const aws = require('aws-sdk')
+const { fromIni } = require('@aws-sdk/credential-providers')
 
 const Client = require('../index')
 const argv = require('minimist')(process.argv.slice(2))
@@ -21,20 +21,8 @@ if (!argv.envfile) throw new Error('Must specify --envfile; See config/sample.en
 
 if (!argv.profile) throw new Error('Must specify --profile')
 
-function setProfile (profile) {
-  // Set aws creds:
-  aws.config.credentials = new aws.SharedIniFileCredentials({
-    profile
-  })
-
-  // Set aws region:
-  let awsSecurity = { region: 'us-east-1' }
-  aws.config.update(awsSecurity)
-}
-
 function writeToStream (streamName, data) {
-  // Schema name is the stream name minus the env suffix:
-  const schemaName = argv.schemaName || streamName.replace(/-.*/, '')
+  const schemaName = argv.schemaName
 
   data = JSON.parse(data)
 
@@ -48,9 +36,10 @@ function writeToStream (streamName, data) {
 
 dotenv.config({ path: argv.envfile })
 
-setProfile(argv.profile)
-
-const client = new Client({ nyplDataApiClientBase: process.env.NYPL_API_BASE_URL })
+const client = new Client({
+  nyplDataApiClientBase: process.env.NYPL_API_BASE_URL,
+  awsClientOptions: { credentials: fromIni({ profile: argv.profile }) }
+})
 
 switch (argv._[0]) {
 case 'write':
```
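The CLI rewrite above swaps global `aws.config` mutation for a credentials provider passed through `awsClientOptions`. A sketch of the resulting options shape, using a stand-in provider since the real `fromIni` lives in the external `@aws-sdk/credential-providers` package:

```javascript
// Stand-in for fromIni({ profile }), purely illustrative: AWS SDK v3
// credential providers are functions that resolve credentials on demand,
// rather than values installed on a global config object.
function fakeFromIni ({ profile }) {
  return async () => ({ accessKeyId: 'EXAMPLE', secretAccessKey: 'EXAMPLE' })
}

// Shape of the options the CLI now hands to the Client constructor:
const clientOptions = {
  nyplDataApiClientBase: 'https://example.com/api/v0.1/',
  awsClientOptions: { credentials: fakeFromIni({ profile: 'nypl-digital-dev' }) }
}

console.log(typeof clientOptions.awsClientOptions.credentials) // function
```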

docs/usage.md (36 additions & 55 deletions)

```diff
@@ -5,19 +5,25 @@
 <dd></dd>
 </dl>
 
+## Functions
+
+<dl>
+<dt><a href="#AvroValidationError">AvroValidationError()</a></dt>
+<dd><p>A AvroValidationError is thrown when avsc fails to encode/decode data.</p>
+</dd>
+</dl>
+
 ## Typedefs
 
 <dl>
 <dt><a href="#ClientConstructorOptions">ClientConstructorOptions</a> : <code>Object</code></dt>
 <dd></dd>
+<dt><a href="#AwsClientOptions">AwsClientOptions</a> : <code>Object</code></dt>
+<dd></dd>
 <dt><a href="#WriteOptions">WriteOptions</a> : <code>Object</code></dt>
 <dd></dd>
 <dt><a href="#WriteResponse">WriteResponse</a> : <code>Object</code></dt>
 <dd></dd>
-<dt><a href="#CreateStreamOptions">CreateStreamOptions</a> : <code>Object</code></dt>
-<dd></dd>
-<dt><a href="#DeleteStreamOptions">DeleteStreamOptions</a> : <code>Object</code></dt>
-<dd></dd>
 </dl>
 
 <a name="Client"></a>
@@ -28,14 +34,13 @@
 * [Client](#Client)
     * [new Client(options)](#new_Client_new)
     * [.write(streamName, data, options)](#Client+write) ⇒ <code>Promise.&lt;WriteReponse&gt;</code>
-    * [.createStream(name, options)](#Client+createStream) ⇒ <code>Promise</code>
-    * [.deleteStream(name, options)](#Client+deleteStream) ⇒ <code>Promise</code>
     * [.kinesisClient()](#Client+kinesisClient) ⇒ <code>Promise.&lt;AWS.Kinesis&gt;</code>
     * [.dataApiClient()](#Client+dataApiClient) ⇒ <code>Promise.&lt;NyplClient&gt;</code>
    * [.encodeData(schemaName, data)](#Client+encodeData) ⇒ <code>Promise</code>
     * [.decodeData(schemaName, data)](#Client+decodeData) ⇒ <code>Promise</code>
     * [.decodeAvroBufferString(bufferString, avroObject, encodeType)](#Client+decodeAvroBufferString)
     * [.getAvroType()](#Client+getAvroType) ⇒ <code>Promise.&lt;avsc.Type&gt;</code>
+    * [._defaultSchema()](#Client+_defaultSchema)
 
 <a name="new_Client_new"></a>
 
@@ -61,32 +66,6 @@ Note, the `data` arg can be an object or array of objects.
 | data | <code>Object</code> \| <code>Array</code> | Object (or array of objects) to write. |
 | options | [<code>WriteOptions</code>](#WriteOptions) | |
 
-<a name="Client+createStream"></a>
-
-### client.createStream(name, options) ⇒ <code>Promise</code>
-Create a stream by name
-
-**Kind**: instance method of [<code>Client</code>](#Client)
-**Returns**: <code>Promise</code> - A promise that resolves on success.
-
-| Param | Type | Description |
-| --- | --- | --- |
-| name | <code>string</code> | Name of stream |
-| options | [<code>CreateStreamOptions</code>](#CreateStreamOptions) | |
-
-<a name="Client+deleteStream"></a>
-
-### client.deleteStream(name, options) ⇒ <code>Promise</code>
-Delete a stream by name
-
-**Kind**: instance method of [<code>Client</code>](#Client)
-**Returns**: <code>Promise</code> - A promise that resolves on success.
-
-| Param | Type | Description |
-| --- | --- | --- |
-| name | <code>string</code> | Name of stream |
-| options | <code>CreateOptions</code> | |
-
 <a name="Client+kinesisClient"></a>
 
 ### client.kinesisClient() ⇒ <code>Promise.&lt;AWS.Kinesis&gt;</code>
@@ -145,6 +124,18 @@ Returns an avro type instance by schema name
 
 **Kind**: instance method of [<code>Client</code>](#Client)
 **Returns**: <code>Promise.&lt;avsc.Type&gt;</code> - A Promise that resolves an avsc.Type instance
+<a name="Client+_defaultSchema"></a>
+
+### client.\_defaultSchema()
+Given a stream name (e.g. MyEventStream-qa) returns the conventional schema name (MyEventStream)
+
+**Kind**: instance method of [<code>Client</code>](#Client)
+<a name="AvroValidationError"></a>
+
+## AvroValidationError()
+A AvroValidationError is thrown when avsc fails to encode/decode data.
+
+**Kind**: global function
 <a name="ClientConstructorOptions"></a>
 
 ## ClientConstructorOptions : <code>Object</code>
```
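The `_defaultSchema` convention documented above mirrors the `streamName.replace(/-.*/, '')` fallback the CLI previously inlined: everything from the first hyphen (the `-qa` / `-production` suffix) is dropped. A minimal sketch of that convention, not the client's exact code:

```javascript
// Conventional schema name: the stream name minus its environment suffix,
// i.e. everything from the first hyphen onward is dropped.
function defaultSchema (streamName) {
  return streamName.replace(/-.*/, '')
}

console.log(defaultSchema('MyEventStream-qa')) // MyEventStream
console.log(defaultSchema('SierraBibRetriever-production')) // SierraBibRetriever
```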
```diff
@@ -159,7 +150,18 @@
 | waitBetweenDescribeCallsInSeconds | <code>number</code> | How many seconds to pause between describe calls (i.e. when waiting for active stream). Default 4 |
 | maxDescribeCallRetries | <code>number</code> | Maximum describe calls to make before giving up (i.e. when waiting for active stream). Default 10. |
 | logLevel | <code>string</code> | Set [log level](https://github.com/pimterry/loglevel) (i.e. info, error, warn, debug). Default env.LOG_LEVEL or 'error' |
-| awsRegion | <code>string</code> | AWS region to use. Default us-east-1 |
+| awsClientOptions | [<code>AwsClientOptions</code>](#AwsClientOptions) | AWS client options |
+
+<a name="AwsClientOptions"></a>
+
+## AwsClientOptions : <code>Object</code>
+**Kind**: global typedef
+**Properties**
+
+| Name | Type | Description |
+| --- | --- | --- |
+| region | <code>string</code> | AWS region to use. Default us-east-1 |
+| profile | <code>string</code> | Named profile to use for from local credentials file. |
 
 <a name="WriteOptions"></a>
 
@@ -170,7 +172,7 @@
 | Name | Type | Description |
 | --- | --- | --- |
 | avroEncode | <code>boolean</code> | Whether or not to Name of avro schema to use to encode. |
-| avroSchemaName | <code>string</code> | Name of avro schema to use to encode. Defaults to `streamName`. |
+| avroSchemaName | <code>string</code> | Name of avro schema to use to encode. Defaults to `streamName` (with -qa/-production suffix removed). |
 
 <a name="WriteResponse"></a>
 
@@ -184,24 +186,3 @@
 | FailedRecordCount | <code>number</code> | Number of records that failed |
 | unmergedResponses | <code>Array</code> | Raw AWS responses (for debugging mult. batch jobs) |
 
-<a name="CreateStreamOptions"></a>
-
-## CreateStreamOptions : <code>Object</code>
-**Kind**: global typedef
-**Properties**
-
-| Name | Type | Default | Description |
-| --- | --- | --- | --- |
-| shards | <code>number</code> | <code>1</code> | Number of shards to attach to stream |
-| failIfExists | <code>boolean</code> | <code>false</code> | Whether to throw error if stream already exists. |
-
-<a name="DeleteStreamOptions"></a>
-
-## DeleteStreamOptions : <code>Object</code>
-**Kind**: global typedef
-**Properties**
-
-| Name | Type | Default | Description |
-| --- | --- | --- | --- |
-| yesIKnowThisIsPotentiallyDisastrous | <code>boolean</code> | <code>false</code> | Flag that must be set to true to allow call to succeed. |
```
