Commit 5fffe03

Giuliano-1, enoldev, and benface authored
Streamingfast/docs (#765)
- StreamingFast docs on Substreams-powered subgraphs
- added a title
- Add SpS documentation to the menu
- Fix format
- Addressing feedback
- editing the _meta.js
- fix
- trying to fix prettier
- fixing prettier
- Fix title
- `sps-intro` => `introduction`
- Duplicate new pages in every language

Co-authored-by: Enol Álvarez <[email protected]>
Co-authored-by: benface <[email protected]>
1 parent d3d4099 commit 5fffe03


100 files changed (+4858, -39 lines)


website/pages/ar/sps/_meta.js

Lines changed: 5 additions & 0 deletions
import meta from '../../en/sps/_meta.js'

export default {
  ...meta,
}

website/pages/ar/sps/introduction.mdx

Lines changed: 19 additions & 0 deletions
---
title: Introduction to Substreams-powered Subgraphs
---

By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.

There are two methods of enabling this technology:

- Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
- Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly in graph-node, where the Substreams data is used to create your subgraph entities.

It is really a matter of where you put your logic, in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized execution model, whereas triggers are consumed linearly in graph-node.
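As a rough sketch of the trigger route, a handler simply decodes the module's bytes and writes entities. All names below (the `Events` binding, its import path, and the `ExampleTransfer` entity) are illustrative placeholders rather than part of any specific package; the How-To Guides linked below generate working versions of these files for you:

```ts
import { Protobuf } from 'as-proto/assembly'
// Placeholder bindings: the real ones are generated from your own spkg and schema.graphql.
import { Events } from './pb/example/v1/Events'
import { ExampleTransfer } from '../generated/schema'

export function handleTriggers(bytes: Uint8Array): void {
  // graph-node passes the raw module output to the handler; decode it with the generated bindings.
  const events = Protobuf.decode<Events>(bytes, Events.decode)

  // All entity logic lives in the subgraph and is applied linearly, block by block.
  for (let i = 0; i < events.data.length; i++) {
    const transfer = new ExampleTransfer(i.toString())
    transfer.save()
  }
}
```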
Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly:

- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana)
- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm)
- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective)
Lines changed: 137 additions & 0 deletions
---
title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
---

## Prerequisites

Before starting, make sure to:

- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.

## Step 1: Initialize Your Project

1. Open your Dev Container and run the following command to initialize your project:

```bash
substreams init
```

2. Select the "minimal" project option.

3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:

```yaml
specVersion: v0.1.0
package:
  name: my_project_sol
  version: v0.1.0

imports: # Pass your spkg of interest
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg

modules:
  - name: map_spl_transfers
    use: solana:map_block # Select corresponding modules available within your spkg
    initialBlock: 260000082

  - name: map_transactions_by_programid
    use: solana:solana:transactions_by_programid_without_votes

network: solana-mainnet-beta

params: # Modify the param fields to meet your needs
  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```

## Step 2: Generate the Subgraph Manifest

Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:

```bash
substreams codegen subgraph
```

You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:

```yaml
---
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers # Module defined in the substreams.yaml
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```

## Step 3: Define Entities in `schema.graphql`

Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:

```graphql
type MyTransfer @entity {
  id: ID!
  amount: String!
  source: String!
  designation: String!
  signers: [String!]!
}
```

This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.

## Step 4: Generate Protobuf Files

To generate Protobuf objects in AssemblyScript, run the following command:

```bash
npm run protogen
```

This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
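As a quick illustration, assuming the same generated paths used in Step 5 below, the bindings are plain AssemblyScript classes that decode the raw bytes handed to your handler:

```ts
import { Protobuf } from 'as-proto/assembly'
// Generated by `npm run protogen`; the path mirrors the Protobuf package layout of the imported spkg.
import { Events } from './pb/sf/solana/spl/token/v1/Events'

// Hypothetical helper (not produced by the tooling): decode the raw Substreams output
// that graph-node passes to the handler, as done directly in Step 5.
export function decodeEvents(bytes: Uint8Array): Events {
  return Protobuf.decode<Events>(bytes, Events.decode)
}
```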
## Step 5: Handle Substreams Data in `mappings.ts`

With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:

```ts
import { Protobuf } from 'as-proto/assembly'
import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
import { MyTransfer } from '../generated/schema'

export function handleTriggers(bytes: Uint8Array): void {
  // Decode the raw Substreams output into the generated Protobuf object.
  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)

  for (let i = 0; i < input.data.length; i++) {
    const event = input.data[i]

    if (event.transfer != null) {
      // Build a unique entity ID from the transaction ID and the event index.
      let entity_id: string = `${event.txnId}-${i}`
      const entity = new MyTransfer(entity_id)
      entity.amount = event.transfer!.instruction!.amount.toString()
      entity.source = event.transfer!.accounts!.source
      entity.designation = event.transfer!.accounts!.destination

      // A transfer can be signed by a single account or by a multisig.
      if (event.transfer!.accounts!.signer!.single != null) {
        entity.signers = [event.transfer!.accounts!.signer!.single.signer]
      } else if (event.transfer!.accounts!.signer!.multisig != null) {
        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
      }
      entity.save()
    }
  }
}
```

## Conclusion

You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.

For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).

website/pages/ar/sps/triggers.mdx

Lines changed: 37 additions & 0 deletions
---
title: Substreams Triggers
---

Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.

> Note: If you haven’t already, visit one of the How-To Guides found [here](./introduction) to scaffold your first project in the Development Container.

The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.

```tsx
import { log } from '@graphprotocol/graph-ts'
import { Transaction } from '../generated/schema'

export function handleTransactions(bytes: Uint8Array): void {
  // `assembly.eth.transaction.v1.Transactions` comes from the Protobuf bindings generated for the module.
  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
  if (transactions.length == 0) {
    log.info('No transactions found', [])
    return
  }

  for (let i = 0; i < transactions.length; i++) {
    // 2.
    let transaction = transactions[i]

    let entity = new Transaction(transaction.hash) // 3.
    entity.from = transaction.from
    entity.to = transaction.to
    entity.save()
  }
}
```

Here's what you’re seeing in the `mappings.ts` file:

1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
2. Looping over the transactions
3. Create a new subgraph entity for every transaction

To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).

website/pages/cs/sps/_meta.js

Lines changed: 5 additions & 0 deletions
import meta from '../../en/sps/_meta.js'

export default {
  ...meta,
}

website/pages/cs/sps/introduction.mdx

Lines changed: 19 additions & 0 deletions
---
title: Introduction to Substreams-powered Subgraphs
---

By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.

There are two methods of enabling this technology:

- Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
- Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly in graph-node, where the Substreams data is used to create your subgraph entities.

It is really a matter of where you put your logic, in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized execution model, whereas triggers are consumed linearly in graph-node.

Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly:

- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana)
- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm)
- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective)
Lines changed: 137 additions & 0 deletions
---
title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
---

## Prerequisites

Before starting, make sure to:

- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.

## Step 1: Initialize Your Project

1. Open your Dev Container and run the following command to initialize your project:

```bash
substreams init
```

2. Select the "minimal" project option.

3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:

```yaml
specVersion: v0.1.0
package:
  name: my_project_sol
  version: v0.1.0

imports: # Pass your spkg of interest
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg

modules:
  - name: map_spl_transfers
    use: solana:map_block # Select corresponding modules available within your spkg
    initialBlock: 260000082

  - name: map_transactions_by_programid
    use: solana:solana:transactions_by_programid_without_votes

network: solana-mainnet-beta

params: # Modify the param fields to meet your needs
  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```

## Step 2: Generate the Subgraph Manifest

Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:

```bash
substreams codegen subgraph
```

You will generate a `subgraph.yaml` manifest, which imports the Substreams package as a data source:

```yaml
---
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers # Module defined in the substreams.yaml
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```

## Step 3: Define Entities in `schema.graphql`

Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:

```graphql
type MyTransfer @entity {
  id: ID!
  amount: String!
  source: String!
  designation: String!
  signers: [String!]!
}
```

This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.

## Step 4: Generate Protobuf Files

To generate Protobuf objects in AssemblyScript, run the following command:

```bash
npm run protogen
```

This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.

## Step 5: Handle Substreams Data in `mappings.ts`

With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:

```ts
import { Protobuf } from 'as-proto/assembly'
import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
import { MyTransfer } from '../generated/schema'

export function handleTriggers(bytes: Uint8Array): void {
  // Decode the raw Substreams output into the generated Protobuf object.
  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)

  for (let i = 0; i < input.data.length; i++) {
    const event = input.data[i]

    if (event.transfer != null) {
      // Build a unique entity ID from the transaction ID and the event index.
      let entity_id: string = `${event.txnId}-${i}`
      const entity = new MyTransfer(entity_id)
      entity.amount = event.transfer!.instruction!.amount.toString()
      entity.source = event.transfer!.accounts!.source
      entity.designation = event.transfer!.accounts!.destination

      // A transfer can be signed by a single account or by a multisig.
      if (event.transfer!.accounts!.signer!.single != null) {
        entity.signers = [event.transfer!.accounts!.signer!.single.signer]
      } else if (event.transfer!.accounts!.signer!.multisig != null) {
        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
      }
      entity.save()
    }
  }
}
```

## Conclusion

You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.

For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
