Commit 320dbf7

update node readme (#691)
1 parent 8332ae0 commit 320dbf7

2 files changed: 190 additions & 16 deletions


packages/core/src/analytics/dispatch-emit.ts

Lines changed: 10 additions & 2 deletions
````diff
@@ -1,8 +1,16 @@
+import { CoreContext } from '../context'
 import { dispatch } from './dispatch'
 
+type DispatchAndEmitFn = (
+  ...args: Parameters<typeof dispatch>
+) => Promise<CoreContext | undefined>
+
 /* Dispatch function, but swallow promise rejections and use event emitter instead */
-export const dispatchAndEmit = async (
-  ...[event, queue, emitter, options]: Parameters<typeof dispatch>
+export const dispatchAndEmit: DispatchAndEmitFn = async (
+  event,
+  queue,
+  emitter,
+  options
 ) => {
   try {
     const ctx = await dispatch(event, queue, emitter, options)
````
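The net effect for callers of `dispatchAndEmit`: a failed dispatch no longer rejects; the error is surfaced on the emitter instead, and the call resolves to `undefined`. A minimal sketch of that pattern, for illustration only (`doDispatch` and the `'error'` event name are stand-ins, not the package's internals):

```ts
import { EventEmitter } from 'events'

// Sketch of the "swallow the rejection, emit instead" pattern shown above.
async function dispatchAndEmitSketch<T>(
  emitter: EventEmitter,
  doDispatch: () => Promise<T>
): Promise<T | undefined> {
  try {
    return await doDispatch()
  } catch (err) {
    // Callers subscribe to 'error' rather than wrapping every call in try/catch.
    emitter.emit('error', err)
    return undefined
  }
}
```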

packages/node/README.md

Lines changed: 180 additions & 14 deletions
````diff
@@ -22,12 +22,12 @@ import { Analytics } from '@segment/analytics-node'
 
 const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })
 
-
 app.post('/login', (req, res) => {
   analytics.identify({
     userId: req.body.userId,
     previousId: req.body.previousId
   })
+  res.sendStatus(200)
 })
 
 app.post('/cart', (req, res) => {
````
````diff
@@ -36,24 +36,76 @@ app.post('/cart', (req, res) => {
     event: 'Add to cart',
     properties: { productId: '123456' }
   })
+  res.sendStatus(200)
+});
+```
+## Regional configuration
+
+For Business plans with access to Regional Segment, you can use the host configuration parameter to send data to the desired region:
+
+- Oregon (Default): api.segment.io/v1
+- Dublin: events.eu1.segmentapis.com
+An example of setting the host to the EU endpoint using the Node library would be:
+
+```ts
+const analytics = new Analytics('YOUR_WRITE_KEY', {
+  host: "https://events.eu1.segmentapis.com"
 });
 ```
 
 ## Complete Settings / Configuration
 See complete list of settings in the [AnalyticsSettings interface](src/app/settings.ts).
 ```ts
-new Analytics({
+const analytics = new Analytics({
   writeKey: '<MY_WRITE_KEY>',
+  plugins: [plugin1, plugin2],
   host: 'https://api.segment.io',
   path: '/v1/batch',
+  maxRetries: 3,
+  maxEventsInBatch: 15,
   flushInterval: 10000,
-  plugins: [plugin1, plugin2],
   // ... and more!
 })
 
 ```
 
-## Graceful Shutdown
+## Batching
+Our libraries are built to support high performance environments. That means it is safe to use our Node library on a web server that's serving thousands of requests per second.
+
+Calling a method does not result in an HTTP request right away; the message is queued in memory instead. Messages are then flushed in batches in the background, which allows for much faster operation.
+
+By default, our library will flush:
+
+- The very first time it gets a message.
+- Every 15 messages (controlled by `settings.maxEventsInBatch`).
+- If 10 seconds have passed since the last flush (controlled by `settings.flushInterval`).
+
+There is a maximum of 500KB per batch request and 32KB per call.
+
+If you don't want to batch messages, you can turn batching off by setting `maxEventsInBatch` to 1, like so:
+```ts
+const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>', maxEventsInBatch: 1 })
+```
+Batching means that your message might not get sent right away. But every method call takes an optional callback, which you can use to know when a particular message is flushed from the queue, like so:
+
+```ts
+analytics.track({
+  userId: '019mr8mf4r',
+  event: 'Ultimate Played',
+  callback: (ctx) => console.log(ctx)
+})
+```
````
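The two flush thresholds above can also be tuned together rather than disabled. A minimal sketch with illustrative values (flush after 50 queued events or after 5 seconds, whichever comes first):

```ts
import { Analytics } from '@segment/analytics-node'

// Illustrative values only: batch up to 50 events, but never wait more than 5 seconds.
const analytics = new Analytics({
  writeKey: '<MY_WRITE_KEY>',
  maxEventsInBatch: 50,
  flushInterval: 5000,
})
```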
````diff
+## Error Handling
+Subscribe and log all event delivery errors.
+```ts
+const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })
+
+analytics.on('error', (err) => console.error(err))
+```
+
+
+## Graceful Shutdown (Long or short running processes)
+
 ### Avoid losing events on exit!
 * Call `.closeAndFlush()` to stop collecting new events and flush all existing events.
 * If a callback on an event call is included, this also waits for all callbacks to be called, and any of their subsequent promises to be resolved.
@@ -70,25 +122,28 @@ import express from 'express'
 const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })
 
 const app = express()
+
 app.post('/cart', (req, res) => {
   analytics.track({
     userId: req.body.userId,
     event: 'Add to cart',
     properties: { productId: '123456' }
   })
-});
+  res.sendStatus(200)
+})
 
 const server = app.listen(3000)
 
-
 const onExit = async () => {
-  console.log("Gracefully closing server...");
   await analytics.closeAndFlush() // flush all existing events
-  server.close(() => process.exit());
-};
+  server.close(() => {
+    console.log("Gracefully closing server...")
+    process.exit()
+  })
+};
+
+['SIGINT', 'SIGTERM'].forEach((code) => process.on(code, onExit))
 
-process.on("SIGINT", onExit);
-process.on("SIGTERM", onExit);
 ```
````
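For a short-lived process (the other case the heading covers, such as a one-off script or job), the same call is all that is needed before exiting. A minimal sketch:

```ts
import { Analytics } from '@segment/analytics-node'

const analytics = new Analytics({ writeKey: '<MY_WRITE_KEY>' })

async function main() {
  analytics.track({ userId: '019mr8mf4r', event: 'Job Completed' })

  // Flush everything that is still queued before the process exits.
  await analytics.closeAndFlush()
}

main()
```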
````diff
 
 #### Collecting unflushed events
@@ -104,11 +159,122 @@ console.log(unflushedEvents) // all events that came in after closeAndFlush was
 ```
 
 
-## Event Emitter
+## Event Emitter Interface
 ```ts
-// subscribe to delivery errors
-analytics.on('error', (err) => console.error(err))
+// subscribe to identify calls
+analytics.on('identify', (ctx) => console.log(ctx))
 
 // subscribe to a specific event
 analytics.on('track', (ctx) => console.log(ctx))
 ```
+
+
+## Multiple Clients
+Different parts of your application may require different types of batching, or even sending to multiple Segment sources. In that case, you can initialize multiple instances of Analytics with different settings:
+
+```ts
+import { Analytics } from '@segment/analytics-node'
+
+const marketingAnalytics = new Analytics({ writeKey: 'MARKETING_WRITE_KEY' })
+const appAnalytics = new Analytics({ writeKey: 'APP_WRITE_KEY' })
+```
+
+## Troubleshooting
+1. Double check that you've followed all the steps in the Quick Start.
+
+2. Make sure that you're calling a Segment API method once the library is successfully installed: identify, track, etc.
+
+3. Log events and errors with the event emitter:
+```js
+['initialize', 'call_after_close',
+ 'screen', 'identify', 'group',
+ 'track', 'ready', 'alias',
+ 'page', 'error', 'register',
+ 'deregister'].forEach((event) => analytics.on(event, console.log))
+```
+
+
+## Differences from legacy analytics-node / Migration Guide
+
+
+- Named imports.
+```ts
+// old
+import Analytics from 'analytics-node'
+
+// new
+import { Analytics } from '@segment/analytics-node'
+```
+
+- Instantiation requires an object
+```ts
+// old
+
+var analytics = new Analytics('YOUR_WRITE_KEY');
+
+// new
+const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY' });
+
+```
+- Graceful shutdown (see the Graceful Shutdown section)
+```ts
+// old
+await analytics.flush(function(err, batch) {
+  console.log('Flushed, and now this program can exit!');
+});
+
+// new
+await analytics.closeAndFlush()
+```
+
+Other differences:
+
+- The `enable` configuration option has been removed (see the "Disabling Analytics" section).
+- The `errorHandler` configuration option has been removed (see the "Error Handling" section).
+- The `flushAt` configuration option is now `maxEventsInBatch` (see the sketch after this list).
+- The `callback` option has moved into the event object:
+```ts
+// old
+analytics.track({
+  userId: '019mr8mf4r',
+  event: 'Ultimate Played'
+}, function(err, batch){
+  if (err) {
+    console.error(err)
+  }
+});
+
+// new
+analytics.track({
+  userId: '019mr8mf4r',
+  event: 'Ultimate Played',
+  callback: (ctx) => {
+    if (ctx.failedDelivery()) {
+      console.error(ctx)
+    }
+  }
+})
+
+```
````
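A sketch of migrating the `flushAt` and `errorHandler` options called out in the list above; the legacy option names are as documented for `analytics-node`, and the values are illustrative:

```ts
// old
const legacyAnalytics = new Analytics('YOUR_WRITE_KEY', {
  flushAt: 20,
  errorHandler: (err) => console.error(err)
})

// new
const analytics = new Analytics({ writeKey: 'YOUR_WRITE_KEY', maxEventsInBatch: 20 })
analytics.on('error', (err) => console.error(err))
```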
````diff
+
+
+## Development / Disabling Analytics
+- If you want to disable analytics for unit tests, you can use something like [nock](https://github.com/nock/nock) or [jest mocks](https://jestjs.io/docs/manual-mocks).
+
+You should prefer mocking. However, if you need to intercept the request, you can do:
+
+```ts
+// Note: nock will _not_ work if you polyfill fetch with something like undici, as nock uses the http module. Undici has its own interception method.
+import nock from 'nock'
+
+const mockApiHost = 'https://foo.bar'
+const mockPath = '/foo'
+
+nock(mockApiHost) // using regex matching in nock changes the perf profile quite a bit
+  .post(mockPath, (body) => true)
+  .reply(201)
+  .persist()
+
+const analytics = new Analytics({ writeKey: 'foo', host: mockApiHost, path: mockPath })
+
+```
````
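If you take the mocking route instead, a minimal jest sketch using jest's automock (the test name and assertions are illustrative):

```ts
import { Analytics } from '@segment/analytics-node'

// Automock the package so no real network calls are made.
jest.mock('@segment/analytics-node')

test('tracks an add-to-cart event', () => {
  const analytics = new Analytics({ writeKey: 'test' })
  analytics.track({ userId: '123', event: 'Add to cart' })

  // With the module automocked, `track` is a jest.fn() and can be asserted on.
  expect(analytics.track).toHaveBeenCalledWith(
    expect.objectContaining({ event: 'Add to cart' })
  )
})
```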
