
Commit 962e82b: edfs (1 parent 639f6bb)

File tree

1 file changed: +148 -3 lines

packages/web/docs/src/content/gateway/subscriptions.mdx

Lines changed: 148 additions & 3 deletions

</Tabs>

## Event-Driven Federated Subscriptions (EDFS)

Hive Gateway supports event-driven federated subscriptions, allowing you to publish events to a
message broker (NATS, Kafka, Redis, etc.) and have those events automatically routed to the
appropriate Hive Gateway subscribers.

If you are not yet familiar with Event-Driven Federated Subscriptions (EDFS), please refer to
[this great article by Wundergraph](https://wundergraph.com/blog/distributed_graphql_subscriptions_with_nats_and_event_driven_architecture).

Let's go over how you would set up EDFS with [Mesh Compose](https://the-guild.dev/graphql/mesh) and
Hive Gateway, using Redis as the message broker.
805+
### Composing the Supergraph With Mesh Compose

Let's compose our supergraph with [Mesh Compose](https://the-guild.dev/graphql/mesh) and add the
subscription fields to the schema.

First, we need to make sure we have a "products" subgraph ready and running on
`http://localhost:3000/graphql`:
```gql filename="products.graphql"
type Query {
  hello: String!
}

type Product @key(fields: "id") {
  id: ID!
  name: String!
  price: Float!
}
```
And then we need to add the subscription fields like this:

```ts filename="mesh.config.ts"
import { defineConfig, loadGraphQLHTTPSubgraph } from '@graphql-mesh/compose-cli'

export const composeConfig = defineConfig({
  subgraphs: [
    {
      sourceHandler: loadGraphQLHTTPSubgraph('products', {
        endpoint: `http://localhost:3000/graphql`
      })
    }
  ],
  additionalTypeDefs: /* GraphQL */ `
    extend schema {
      subscription: Subscription
    }
    type Subscription {
      newProduct: Product! @resolveTo(pubsubTopic: "new_product", sourceName: "products")
    }
  `
})
```
The composed supergraph schema will now contain a `newProduct` subscription field that makes the
gateway subscribe to the `new_product` topic. This is done through the `@resolveTo` directive: the
`pubsubTopic` argument specifies the topic to subscribe to, and the `sourceName` argument specifies
the subgraph that owns the `Product` type, i.e. the subgraph used to resolve missing fields.
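With the config in place, you generate the supergraph file that the gateway will consume. The exact CLI invocation below is an assumption (flag names may differ between Mesh Compose versions), so double-check the Mesh Compose docs:

```sh npm2yarn
npx mesh-compose -o supergraph.graphql
```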
### Configuring Hive Gateway With Redis PubSub

The next step is to configure Hive Gateway to use Redis PubSub as the message broker and consume the
supergraph generated by Mesh Compose. The configuration is shown below.

Redis PubSub does not come bundled with Hive Gateway; you first have to install the
`@graphql-hive/pubsub` package together with its `ioredis` peer dependency:
```sh npm2yarn
npm i @graphql-hive/pubsub ioredis
```
```ts filename="gateway.config.ts"
import Redis from 'ioredis'
import { defineConfig } from '@graphql-hive/gateway'
import { RedisPubSub } from '@graphql-hive/pubsub/redis'

/**
 * When a Redis connection enters "subscriber mode" (after calling SUBSCRIBE), it can only execute
 * subscriber commands (SUBSCRIBE, UNSUBSCRIBE, etc.), meaning it cannot execute other commands
 * like PUBLISH. To avoid this, we use two separate Redis clients: one for publishing and one for
 * subscribing.
 */
const pub = new Redis()
const sub = new Redis()

export const gatewayConfig = defineConfig({
  supergraph: 'supergraph.graphql', // the supergraph generated by Mesh Compose
  pubsub: new RedisPubSub(
    { pub, sub },
    {
      // use the same channel prefix across all gateway instances so they share the same channels:
      // every gateway using this prefix will publish and subscribe to the same topics
      channelPrefix: 'edfs'
    }
  )
})
```
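You can then start the gateway with the Hive Gateway CLI. The invocation below is an assumption (it presumes the CLI picks up `gateway.config.ts` from the working directory); check the Hive Gateway docs for your setup:

```sh npm2yarn
npx hive-gateway supergraph
```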
### Subscribing and Publishing Events

We're now ready to subscribe to the `newProduct` subscription field and publish events to the
`new_product` topic. The publishing of events can happen from **anywhere**; it doesn't have to be
from within Hive Gateway or any particular subgraph. You can, for example, implement a separate
service that is only responsible for emitting subscription events.
You can subscribe to the `newProduct` subscription from a client using any of the
[transports supported by Hive Gateway](#configure-client-subscriptions). Let's subscribe with this
query:
```graphql
subscription {
  newProduct {
    name
    price
  }
}
```
and then emit an event to the Redis instance on the `new_product` topic with the `edfs` prefix like
this:

```redis
PUBLISH edfs:new_product '{"id":"roomba70x"}'
```
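For programmatic publishing from an emitter service, you only need to reproduce the prefixed channel name and the minimal key-field payload. The helper names below are hypothetical, introduced for illustration; the `<prefix>:<topic>` channel format is inferred from the `PUBLISH edfs:new_product` example above, and the final `ioredis` call is left as a comment since it requires a live Redis instance:

```typescript
// Build the Redis channel name from the gateway's channelPrefix and the pubsub topic.
// (Assumption for illustration: the gateway listens on `<prefix>:<topic>`.)
function channelFor(prefix: string, topic: string): string {
  return `${prefix}:${topic}`
}

// The event payload only needs the entity's @key fields; the gateway resolves the rest.
function newProductEvent(id: string): string {
  return JSON.stringify({ id })
}

console.log(channelFor('edfs', 'new_product')) // edfs:new_product
console.log(newProductEvent('roomba70x')) // {"id":"roomba70x"}

// With a live Redis instance, publishing would look like:
//   import Redis from 'ioredis'
//   const pub = new Redis()
//   await pub.publish(channelFor('edfs', 'new_product'), newProductEvent('roomba70x'))
```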
The subscriber will then receive the following event:

```json
{
  "data": {
    "newProduct": {
      "name": "Roomba 70x",
      "price": 279.99
    }
  }
}
```
Note that the event payload contains only the `id` field, which is all that is required to resolve
the `Product` type; Hive Gateway will fetch the missing fields from the "products" subgraph.
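This works because the "products" subgraph can resolve a `Product` entity from just its key fields. A minimal sketch of such a lookup, with a hypothetical in-memory catalog and function name (in a real subgraph this logic would be wired up as the `Product` reference resolver of a federation-aware server):

```typescript
interface Product {
  id: string
  name: string
  price: number
}

// Hypothetical in-memory catalog standing in for the subgraph's data source.
const catalog: Record<string, Product> = {
  roomba70x: { id: 'roomba70x', name: 'Roomba 70x', price: 279.99 }
}

// Resolves a Product from the representation the gateway sends, e.g. { id: "roomba70x" },
// which is exactly the payload published to the `new_product` topic.
function resolveProductReference(ref: { id: string }): Product | undefined {
  return catalog[ref.id]
}

console.log(resolveProductReference({ id: 'roomba70x' }))
// { id: 'roomba70x', name: 'Roomba 70x', price: 279.99 }
```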
## PubSub
Hive Gateway internally uses a PubSub system to handle subscriptions. By default, an in-memory
In case you have distributed instances of Hive Gateway, using a distributed PubSub is
required to make sure all instances are aware of all active subscriptions and can publish events to
the correct subscribers.

For example, when using the
[webhooks transport for subscriptions](https://the-guild.dev/graphql/mesh/v1/subscriptions-webhooks),
the subgraph will send events to only one instance of Hive Gateway. If that instance doesn't have
any active subscription for the topic, the event will be lost. Using a distributed PubSub engine
solves this problem.

Redis PubSub does not come bundled with Hive Gateway; you first have to install the
`@graphql-hive/pubsub` package together with its `ioredis` peer dependency:
