
Commit b063822

StreamingFast docs on Substreams-powered subgraphs
1 parent 6ed5cba commit b063822

4 files changed: +140 -2 lines changed

website/pages/en/sps/sps_intro.mdx

Lines changed: 19 additions & 0 deletions
---
title: Introduction
---

By leveraging a Substreams package (`.yaml`) as a data source, your subgraph gains access to pre-extracted, indexed blockchain data, enabling more efficient and scalable data handling, especially when dealing with large or complex blockchain networks.

This technology opens up more efficient and versatile indexing for diverse blockchain environments. For more information on how to build a Substreams-powered subgraph, [click here](./triggers.mdx). You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end-to-end project quickly:

- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana)
- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm)
- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective)

**Public Substreams packages**

A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs.
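As a rough sketch of how such a package is referenced as a data source in a subgraph manifest (the project name, network, module name, and `.spkg` file below are illustrative; a complete Solana example appears later in this commit):

```yaml
dataSources:
  - kind: substreams
    name: my_project # illustrative project name
    network: mainnet # illustrative network
    source:
      package:
        moduleName: map_events # illustrative: a module exposed by the package
        file: ./my-project-v0.1.0.spkg # the precompiled Substreams package
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```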
Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs.

website/pages/en/sps/triggers.mdx

Lines changed: 10 additions & 0 deletions
---
title: Substreams Triggers
---

Substreams triggers allow you to integrate Substreams data directly into your subgraph. By importing the [Protobuf definitions](https://substreams.streamingfast.io/documentation/develop/creating-protobuf-schemas#protobuf-overview) emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This enables efficient and streamlined data handling within the subgraph framework.
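As a minimal sketch of that flow (the Protobuf import path and the `Events` type below are illustrative and depend on your module; the coded example linked below shows a complete handler), the handler receives the raw bytes and decodes them with the generated definitions:

```tsx
import { Protobuf } from "as-proto/assembly";
// Generated AssemblyScript bindings for the Protobuf message your Substreams module emits
// (path and type name are illustrative)
import { Events } from "./pb/sf/solana/spl/token/v1/Events";

// graph-node invokes this handler with the raw bytes emitted by the Substreams module
export function handleTriggers(bytes: Uint8Array): void {
  // Decode the bytes into the generated Protobuf object
  const events: Events = Protobuf.decode<Events>(bytes, Events.decode);
  // ...iterate over `events` and create/save your subgraph entities here
}
```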
> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps_intro) to scaffold your first project in the devcontainer.

To go through a coded example of a trigger-based subgraph, [click here](./triggers_example).
Lines changed: 107 additions & 0 deletions
Here you’ll walk through a Solana-based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project.

Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to `subgraph.yaml`, using the SPL Token program ID:
```yaml
specVersion: v0.1.0
package:
  name: my_project_sol
  version: v0.1.0

imports: # Pass your spkg of interest
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg

modules:
  - name: map_spl_transfers
    use: solana:map_block # Select corresponding modules available within your spkg
    initialBlock: 260000082

  - name: map_transactions_by_programid
    use: solana:transactions_by_programid_without_votes

network: solana-mainnet-beta

params: # Modify the param fields to meet your needs
  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```
Now see the corresponding subgraph manifest **(`subgraph.yaml`)** using a Substreams package as the data source:
```yaml
specVersion: 1.0.0
description: my-project-sol Substreams-powered-Subgraph
indexerHints:
  prune: auto
schema:
  file: ./schema.graphql
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```
Once your manifests are created, define the data fields you’d like saved in your subgraph entities in `schema.graphql`:
```graphql
type MyTransfer @entity {
  id: ID!
  amount: String!
  source: String!
  designation: String!
  signers: [String!]!
}
```
The Protobuf object is generated in AssemblyScript by running `npm run protogen` (after `substreams codegen subgraph` in the devcontainer), so you can import it in the subgraph code. Then, transform your decoded Substreams data within the `src/mappings.ts` file, just like in a standard subgraph:
```tsx
import { Protobuf } from "as-proto/assembly";
import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events";
import { MyTransfer } from "../generated/schema";

export function handleTriggers(bytes: Uint8Array): void {
  // Decode the raw bytes emitted by the Substreams module into the generated Protobuf object
  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode);

  for (let i = 0; i < input.data.length; i++) {
    const event = input.data[i];

    // Only transfer events are saved as entities in this example
    if (event.transfer != null) {
      let entity_id: string = `${event.txnId}-${i}`;
      const entity = new MyTransfer(entity_id);
      entity.amount = event.transfer!.instruction!.amount.toString();
      entity.source = event.transfer!.accounts!.source;
      entity.designation = event.transfer!.accounts!.destination;

      // A transfer can be signed by a single account or by a multisig
      if (event.transfer!.accounts!.signer!.single != null) {
        entity.signers = [event.transfer!.accounts!.signer!.single!.signer];
      } else if (event.transfer!.accounts!.signer!.multisig != null) {
        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers;
      }
      entity.save();
    }
  }
}
```
Here's what you’re seeing in `mappings.ts`:

1. The bytes containing Substreams data are decoded into the generated `Events` object, which can then be used like any other AssemblyScript object.
2. The handler loops over the decoded events.
3. A new subgraph entity is created and saved for every transfer.
> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`.

website/pages/en/substreams.mdx

Lines changed: 4 additions & 2 deletions
![Substreams Logo](/img/substreams-logo.png)

- Substreams is a powerful blockchain indexing technology developed for The Graph Network. It enables developers to write Rust modules, compose data streams alongside the community, and provide extremely high-performance indexing due to parallelization in a streaming-first approach.
+ Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network. It offers the following features:

- With Substreams, developers can quickly extract data from different blockchains (Ethereum, BNB, Solana, etc.) and send it to various locations of their choice, such as a Postgres database, a Mongo database, or a Subgraph. Additionally, Substreams packages enable developers to specify which data they want to extract from the blockchain.
+ - **Accelerated Indexing**: Substreams reduces subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing.
+ - **Multi-Chain Support**: Substreams expands indexing capabilities beyond EVM-based chains, supporting ecosystems like Solana, Injective, Starknet, and Vara.
+ - **Multi-Sink Support**: Subgraph, Postgres database, ClickHouse, MongoDB.

## How Substreams Works in 4 Steps