
Commit 340b531

chore: update js-lint to 0.2.11 and delete .prettierrc since it's no longer needed
1 parent 3b02c97 commit 340b531

4 files changed: +95 -45 lines changed

.prettierrc

Lines changed: 0 additions & 7 deletions
This file was deleted.

README.md

Lines changed: 90 additions & 33 deletions
@@ -88,7 +88,8 @@ Any of the callers or handlers can be added from the below `Call Types` section.
 
 #### Unary
 
-In Unary calls, the client sends a single request to the server and receives a single response back, much like a regular async function call.
+In Unary calls, the client sends a single request to the server and receives a
+single response back, much like a regular async function call.
 
 ##### Handler
 
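
In other words, the exchange reduces to a single async function from one input to one output. A minimal illustrative sketch (the names are placeholders, not the README's actual handler, which is not shown in this diff):

```ts
// Illustrative shape only: one JSON-compatible input in, one JSON-compatible
// output back, much like a plain async function call.
async function squaredNumberHandle(
  input: { value: number },
): Promise<{ value: number }> {
  return { value: input.value * input.value };
}
```
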
@@ -125,7 +126,9 @@ const squaredNumber = new UnaryCaller<
 
 ##### Call-Site
 
-The client initiates a unary RPC call by invoking a method that returns a promise. It passes the required input parameters as arguments to the method. The client then waits for the promise to resolve, receiving the output.
+The client initiates a unary RPC call by invoking a method that returns a
+promise. It passes the required input parameters as arguments to the method. The
+client then waits for the promise to resolve, receiving the output.
 
 ```ts
 await rpcClient.methods.squaredNumber({ value: 3 });
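
A usage sketch of the call-site above. The declared client shape is an assumption inferred from the snippet, not the library's actual typings:

```ts
// Assumed shape of the client, implied by the call-site shown in the diff; the
// real RPCClient type is not visible here.
declare const rpcClient: {
  methods: {
    squaredNumber: (input: { value: number }) => Promise<{ value: number }>;
  };
};

// Hypothetical wrapper around the unary call.
async function squareOf(value: number): Promise<number> {
  const result = await rpcClient.methods.squaredNumber({ value });
  return result.value; // assumes the handler replies with { value: number }
}
```
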
@@ -134,11 +137,18 @@ await rpcClient.methods.squaredNumber({ value: 3 });
 
 #### Client Streaming
 
-In Client Streaming calls, the client can write multiple messages to a single stream, while the server reads from that stream and then returns a single response. This pattern is useful when the client needs to send a sequence of data to the server, after which the server processes the data and replies with a single result. This pattern is good for scenarios like file uploads.
+In Client Streaming calls, the client can write multiple messages to a single
+stream, while the server reads from that stream and then returns a single
+response. This pattern is useful when the client needs to send a sequence of
+data to the server, after which the server processes the data and replies with a
+single result. This pattern is good for scenarios like file uploads.
 
 ##### Handler
 
-On the server side, the handle function is an asynchronous function that takes an AsyncIterableIterator as input, representing the stream of incoming messages from the client. It returns a promise that resolves to the output that will be sent back to the client.
+On the server side, the handle function is an asynchronous function that takes
+an AsyncIterableIterator as input, representing the stream of incoming messages
+from the client. It returns a promise that resolves to the output that will be
+sent back to the client.
 
 ```ts
 import type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';
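
The shape being described (an async function consuming an `AsyncIterableIterator` and resolving to a single output) can be sketched as follows; the names are illustrative, since the README's actual handler class is truncated in this diff:

```ts
// Illustrative accumulator: consume every message the client streams in, then
// resolve a single result once the client's stream ends.
async function accumulateHandle(
  input: AsyncIterableIterator<{ value: number }>,
): Promise<{ value: number }> {
  let total = 0;
  for await (const message of input) {
    total += message.value;
  }
  return { value: total };
}
```
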
@@ -177,7 +187,9 @@ const accumulate = new ClientCaller<
 
 ##### Call-Site
 
-The client initiates a client streaming RPC call using a method that returns a writable stream and a promise. The client writes to the writable stream and awaits the output promise to get the response.
+The client initiates a client streaming RPC call using a method that returns a
+writable stream and a promise. The client writes to the writable stream and
+awaits the output promise to get the response.
 
 ```ts
 const { output, writable } = await rpcClient.methods.accumulate();
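
Continuing the snippet above, a usage sketch. It assumes `writable` is a standard WHATWG `WritableStream` of JSON values, that `output` resolves once the server sends its single response, and that `rpcClient` is in scope as in the README's own snippets:

```ts
// Hypothetical usage of the client-streaming call-site.
const { output, writable } = await rpcClient.methods.accumulate();
const writer = writable.getWriter();
await writer.write({ value: 1 });
await writer.write({ value: 2 });
await writer.write({ value: 3 });
await writer.close(); // ends the client's stream
console.log(await output); // e.g. { value: 6 } if the handler sums its inputs
```
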
@@ -193,16 +205,19 @@ await output;
 
 #### Server Streaming
 
-In Server Streaming calls,
-the client sends a single request and receives multiple responses in a read-only stream from the server.
-The server can keep pushing messages as long as it needs, allowing real-time updates from the server to the client.
-This is useful for things like monitoring,
-where the server needs to update the client in real-time based on events or data changes.
-In this example, the client sends a number and the server responds with the squares of all numbers up to that number.
+In Server Streaming calls, the client sends a single request and receives
+multiple responses in a read-only stream from the server. The server can keep
+pushing messages as long as it needs, allowing real-time updates from the server
+to the client. This is useful for things like monitoring, where the server needs
+to update the client in real-time based on events or data changes. In this
+example, the client sends a number and the server responds with the squares of
+all numbers up to that number.
 
 ##### Handler
 
-On the server side, the handle function is an asynchronous generator function that takes a single input parameter from the client. It yields multiple messages that will be sent back to the client through the readable stream.
+On the server side, the handle function is an asynchronous generator function
+that takes a single input parameter from the client. It yields multiple messages
+that will be sent back to the client through the readable stream.
 
 ```ts
 import type { JSONRPCParams, JSONRPCResult, JSONValue } from '@matrixai/rpc';
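
The generator shape described above can be sketched as follows; the names are placeholders, since the README's actual handler class is truncated in this diff:

```ts
// Illustrative async generator: for an input n, yield the squares of 1..n,
// one message per yield, which the library streams back to the client.
async function* countHandle(
  input: { value: number },
): AsyncGenerator<{ value: number }> {
  for (let i = 1; i <= input.value; i++) {
    yield { value: i * i };
  }
}
```
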
@@ -236,7 +251,9 @@ const count = new ServerCaller<CallerTypes['input'], CallerTypes['output']>();
 
 ##### Call-Site
 
-The client initiates a server streaming RPC call using a method that takes input parameters and returns a readable stream. The client writes a single message and then reads multiple messages from the readable stream.
+The client initiates a server streaming RPC call using a method that takes input
+parameters and returns a readable stream. The client writes a single message and
+then reads multiple messages from the readable stream.
 
 ```ts
 const callerInterface = await rpcClient.methods.count({ value: 5 });
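
A reading sketch for the call-site above. Whether the returned interface is itself a WHATWG `ReadableStream` or merely exposes one is not shown in this diff, so the sketch assumes the former, consistent with the `while (true)` read loop visible in the next hunk's context:

```ts
// Hypothetical usage; `rpcClient` is assumed to be in scope as in the README.
const callerInterface = await rpcClient.methods.count({ value: 5 });
const reader = callerInterface.getReader();
while (true) {
  const { done, value } = await reader.read();
  if (done) break; // server closed its stream
  console.log(value); // e.g. { value: 1 }, { value: 4 }, ..., { value: 25 }
}
```
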
@@ -251,7 +268,10 @@ while (true) {
 
 #### Duplex Stream
 
-A Duplex Stream enables both the client and the server to read and write messages in their respective streams independently of each other. Both parties can read and write multiple messages in any order. It's useful in scenarios that require ongoing communication in both directions, like chat applications.
+A Duplex Stream enables both the client and the server to read and write
+messages in their respective streams independently of each other. Both parties
+can read and write multiple messages in any order. It's useful in scenarios that
+require ongoing communication in both directions, like chat applications.
 
 ##### Handler
 
@@ -287,7 +307,9 @@ const echo = new ServerCaller<CallerTypes['input'], CallerTypes['output']>();
 
 ##### Call-Site
 
-The client initiates a duplex streaming RPC call using a method that returns both a readable and a writable stream. The client can read from the readable stream and write to the writable stream.
+The client initiates a duplex streaming RPC call using a method that returns
+both a readable and a writable stream. The client can read from the readable
+stream and write to the writable stream.
 
 ```ts
 // Initialize the duplex call
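
A usage sketch of the duplex call-site above. It assumes the call returns paired WHATWG `readable`/`writable` streams, that the method is callable as `rpcClient.methods.echo` (the name `echo` appears in the hunk context above), and that the echo handler writes each received message back:

```ts
// Hypothetical usage; `rpcClient` is assumed to be in scope as in the README.
const { readable, writable } = await rpcClient.methods.echo();
const writer = writable.getWriter();
const reader = readable.getReader();
await writer.write({ value: 'hello' });
const readResult = await reader.read(); // expected: { done: false, value: { value: 'hello' } }
await writer.close();
```
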
@@ -308,7 +330,12 @@ const readResult = await reader.read();
 
 #### Raw Streams
 
-Raw Streams are designed for low-level handling of RPC calls, enabling granular control over data streaming. Unlike other patterns, Raw Streams allow both the server and client to work directly with raw data, providing a more flexible yet complex way to handle communications. This is especially useful when the RPC protocol itself needs customization or when handling different types of data streams within the same connection.
+Raw Streams are designed for low-level handling of RPC calls, enabling granular
+control over data streaming. Unlike other patterns, Raw Streams allow both the
+server and client to work directly with raw data, providing a more flexible yet
+complex way to handle communications. This is especially useful when the RPC
+protocol itself needs customization or when handling different types of data
+streams within the same connection.
 
 ##### Handler
 
@@ -400,25 +427,40 @@ while (true) {
 
 ### Timeouts
 
-Whenever the time between the initial message and the following subsequent message of an RPC call exceeds a defined timeout time, the RPC call will have timed out.
+Whenever the time between the initial message and the following subsequent
+message of an RPC call exceeds a defined timeout time, the RPC call will have
+timed out.
 
-For Unary calls, this is similar to the timeout of a response after sending a request.
+For Unary calls, this is similar to the timeout of a response after sending a
+request.
 
-If the client were to time out, the stream is forcibly closed and `ErrorRPCTimedOut` is thrown from the call.
+If the client were to time out, the stream is forcibly closed and
+`ErrorRPCTimedOut` is thrown from the call.
 
-If the server were to time out, is is advisory. Meaning that the server may choose to optionally eagerly throw `ErrorRPCTimedOut`, or continue processing as normal.
+If the server were to time out, the timeout is advisory, meaning that the
+server may choose to eagerly throw `ErrorRPCTimedOut` or continue processing
+as normal.
 
-After the client receives the subsequent message from the server, the timeout timer is cancelled.
+After the client receives the subsequent message from the server, the timeout
+timer is cancelled.
 
-Likewise on the server, the timeout timer is cancelled after the first message is sent to the client.
+Likewise on the server, the timeout timer is cancelled after the first message
+is sent to the client.
 
-This means that the timeout for Streaming calls acts as a Proof of Life, and after it is established, the timeout no longer applies. This allows for long-running Streaming calls.
+This means that the timeout for Streaming calls acts as a Proof of Life, and
+after it is established, the timeout no longer applies. This allows for
+long-running Streaming calls.
 
-Note that when supplying a `Timer` instance to the call-site in `RPCClient`, the timeout timer will not be cancelled. As it is expected for the library to not mutate the passed-in `Timer`, and for the user to expect that receiving a messsage will have meaned that the timer no longer matters.
+Note that when supplying a `Timer` instance to the call-site in `RPCClient`,
+the timeout timer will not be cancelled. The library is expected not to mutate
+the passed-in `Timer`, and the user is expected to treat receiving a message
+as meaning that the timer no longer matters.
 
 #### Throwing Timeouts Server-Side
 
-By default, a timeout will not cause an RPC call to automatically throw, this must be manually done by the handler when it receives the abort signal from `ctx.signal`. An example of this is like so:
+By default, a timeout will not cause an RPC call to automatically throw; this
+must be done manually by the handler when it receives the abort signal from
+`ctx.signal`. For example:
 
 ```ts
 class TestMethod extends UnaryHandler {
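
The body of `TestMethod` is truncated by the hunk above. One generic way to realize the described behaviour, using only standard Web APIs (none of the names below are taken from this library), is to race the handler's work against the abort signal:

```ts
// Sketch only: reject as soon as the signal aborts (the abort reason is
// expected to carry ErrorRPCTimedOut when the timeout fired), otherwise
// resolve with the completed work.
const withAbort = async <T>(work: Promise<T>, signal: AbortSignal): Promise<T> => {
  if (signal.aborted) throw signal.reason;
  const aborted = new Promise<never>((_resolve, reject) => {
    signal.addEventListener('abort', () => reject(signal.reason), { once: true });
  });
  return Promise.race([work, aborted]);
};
```

A handler's `handle` implementation could then wrap its work as `return withAbort(compute(input), ctx.signal)`, where `compute` stands in for the real processing.
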
@@ -440,9 +482,12 @@ class TestMethod extends UnaryHandler {
 
 #### Priority of Timeout Options
 
-A `timeoutTime` can be passed both to the constructors of `RPCServer` and `RPCClient`. This is the default `timeoutTime` for all callers/handlers.
+A `timeoutTime` can be passed both to the constructors of `RPCServer` and
+`RPCClient`. This is the default `timeoutTime` for all callers/handlers.
 
-In the case of `RPCServer`, a `timeout` can be specified when extending any `Handler` class. This will override the default `timeoutTime` set on `RPCServer` for that handler only.
+In the case of `RPCServer`, a `timeout` can be specified when extending any
+`Handler` class. This will override the default `timeoutTime` set on `RPCServer`
+for that handler only.
 
 ```ts
 class TestMethodArbitraryTimeout extends UnaryHandler {
@@ -458,28 +503,40 @@ class TestMethodArbitraryTimeout extends UnaryHandler {
 }
 ```
 
-In the case of `RPCClient`, a `ctx` with the property `timer` can be supplied with a `Timer` instance or `number` when making making an RPC call. This will override the default `timeoutTime` set on `RPCClient` for that call only.
+In the case of `RPCClient`, a `ctx` with the property `timer` can be supplied
+with a `Timer` instance or `number` when making an RPC call. This will override
+the default `timeoutTime` set on `RPCClient` for that call only.
 
 ```ts
 await rpcClient.methods.testMethod({}, { timer: 100 });
 await rpcClient.methods.testMethod({}, { timer: new Timer(undefined, 100) });
 ```
 
-However, it's important to note that any of these timeouts may ultimately be overridden by the shortest timeout of the server and client combined using the timeout middleware below.
+However, it's important to note that any of these timeouts may ultimately be
+overridden by the shortest timeout of the server and client combined using the
+timeout middleware below.
 
 #### Timeout Middleware
 
-The `timeoutMiddleware` sets an RPCServer's timeout based on the lowest timeout between the Client and the Server. This is so that handlers can eagerly time out and stop processing as soon as it is known that the client has timed out.
+The `timeoutMiddleware` sets an RPCServer's timeout based on the lowest timeout
+between the Client and the Server. This is so that handlers can eagerly time out
+and stop processing as soon as it is known that the client has timed out.
 
-This case can be seen in the first diagram, where the server is able to stop the processing of the handler, and close the associated stream of the RPC call based on the shorter timeout sent by the client:
+This case can be seen in the first diagram, where the server is able to stop the
+processing of the handler, and close the associated stream of the RPC call based
+on the shorter timeout sent by the client:
 
 ![RPCServer sets timeout based on RPCClient](images/timeoutMiddlewareClientTimeout.svg)
 
-Where the `RPCClient` sends a timeout that is longer than that set on the `RPCServer`, it will be rejected. This is as the timeout of the client should never be expected to exceed that of the server, so that the server's timeout is an absolute limit.
+Where the `RPCClient` sends a timeout that is longer than that set on the
+`RPCServer`, it will be rejected. This is because the timeout of the client
+should never be expected to exceed that of the server, so that the server's
+timeout is an absolute limit.
 
 ![RPCServer rejects longer timeout sent by RPCClient](images/timeoutMiddlewareServerTimeout.svg)
 
-The `timeoutMiddleware` is enabled by default, and uses the `.metadata.timeout` property on a JSON-RPC request object for the client to send it's timeout.
+The `timeoutMiddleware` is enabled by default, and uses the `.metadata.timeout`
+property on a JSON-RPC request object for the client to send its timeout.
 
 ## Development
 
package-lock.json

Lines changed: 4 additions & 4 deletions
Some generated files are not rendered by default.

package.json

Lines changed: 1 addition & 1 deletion
@@ -58,7 +58,7 @@
     "@streamparser/json": "^0.0.17"
   },
   "devDependencies": {
-    "@matrixai/lint": "^0.2.6",
+    "@matrixai/lint": "^0.2.11",
     "@fast-check/jest": "^2.1.0",
     "@swc/core": "1.3.82",
     "@swc/jest": "^0.2.29",
