
Commit 34b2a0c

Addressing feedback
1 parent 4b0f462

File tree

6 files changed: +133, -96 lines


website/pages/en/new-chain-integration.mdx

Lines changed: 1 addition & 30 deletions
@@ -76,33 +76,4 @@ Graph Node should be syncing the deployed subgraph if there are no errors. Give
 
 ## Substreams-powered Subgraphs
 
-For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams-powered subgraph codegen tools are included (check out [Injective](https://substreams.streamingfast.io/documentation/intro-getting-started/intro-injective/injective-first-sps) for an example).
-
-There are two options to consume Substreams data through a subgraph:
-
-- **Using Substreams triggers:** Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph.
-- **Using EntityChanges:** By writing more of the logic into Substreams, you can consume the module's output directly into `graph-node`. In `graph-node`, you can use the Substreams data to create your subgraph entities.
-
-It is really a matter of where you put your logic, in the subgraph or the Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers will be linearly consumed in `graph-node`. Consider the following example implementing a subgraph handler:
-
-```ts
-export function handleTransactions(bytes: Uint8Array): void {
-  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
-  if (transactions.length == 0) {
-    log.info('No transactions found', [])
-    return
-  }
-
-  for (let i = 0; i < transactions.length; i++) {
-    // 2.
-    let transaction = transactions[i]
-
-    let entity = new Transaction(transaction.hash) // 3.
-    entity.from = transaction.from
-    entity.to = transaction.to
-    entity.save()
-  }
-}
-```
-
-The `handleTransactions` function is a subgraph handler that receives the raw Substreams bytes as a parameter and decodes them into a `Transactions` object. Then, for every transaction, a new subgraph entity is created. For more information about Substreams triggers, visit the [StreamingFast documentation](https://substreams.streamingfast.io/documentation/consume/subgraph/triggers) or check out community modules at [substreams.dev](https://substreams.dev/).
+For StreamingFast-led Firehose/Substreams integrations, basic support for foundational Substreams modules (e.g. decoded transactions, logs and smart-contract events) and Substreams codegen tools are included. These tools enable [Substreams-powered subgraphs](./sps/sps-intro). Follow the [How-To Guide](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application) and run `substreams codegen subgraph` to experience the codegen tools for yourself.
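
As a quick sketch, the codegen flow referenced above comes down to two commands run inside the Dev Container described in the How-To Guide (both commands appear in the tutorial later in this changeset):

```bash
# Scaffold a new Substreams project (choose the "minimal" option when prompted)
substreams init

# Generate a Substreams-powered subgraph (manifest, schema, and mappings scaffolding) from your modules
substreams codegen subgraph
```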

website/pages/en/sps/_meta.js

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 export default {
-  'sps-intro': '',
+  'sps-intro': 'Introduction',
   triggers: '',
   'triggers-example': '',
 }

website/pages/en/sps/sps-intro.mdx

Lines changed: 12 additions & 9 deletions
@@ -1,17 +1,20 @@
 ---
-title: Introduction
+title: Introduction to Substreams-powered Subgraphs
 ---
 
-By leveraging a Substreams package (`.yaml`) as a data source, your subgraph gains access to pre-extracted, indexed blockchain data, enabling more efficient and scalable data handling, especially when dealing with large or complex blockchain networks.
+By using a Substreams package (`.spkg`) as a data source, your subgraph gains access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
 
-This technology opens up more efficient and versatile indexing for diverse blockchain environments. For more information on how to build a Substreams-powered subgraph, [click here](./triggers.mdx). You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end to end project quickly:
+There are two methods of enabling this technology:
 
-- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana)
-- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm)
-- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective)
+- Using Substreams [triggers](./triggers): Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
+- Using [Entity Changes](https://substreams.streamingfast.io/documentation/consume/subgraph/graph-out): By writing more of the logic into Substreams, you can consume the module's output directly in `graph-node` and use the Substreams data to create your subgraph entities.
 
-**Public Substreams packages**
+It is really a matter of where you put your logic, in the subgraph or in Substreams. Keep in mind that having more of your logic in Substreams benefits from a parallelized model, whereas triggers are consumed linearly in `graph-node`.
 
-A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs.
 
-Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs.
+Visit the following links for How-To Guides on using code-generation tooling to build your first end-to-end project quickly:
+
+- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana)
+- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm)
+- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective)
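
For orientation, a minimal sketch of what an `.spkg` data source looks like in a subgraph manifest; module and file names are illustrative, borrowed from the Solana tutorial below, and the mapping `kind` is an assumption for Substreams-powered subgraphs:

```yaml
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers # module exported by the imported .spkg
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities # assumed mapping kind for Substreams-powered subgraphs
      file: ./src/mappings.ts
      handler: handleTriggers
```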
Lines changed: 83 additions & 50 deletions
@@ -1,52 +1,70 @@
 ---
-title: Example Susbtreams Trigger
+title: "Tutorial: Set Up a Substreams-Powered Subgraph on Solana"
 ---
 
-Here you’ll walk through a Solana based example for setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project.
+## Prerequisites
 
-Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to the `subgraph.yaml`, using the SPL token program Id:
+Before starting, make sure to:
 
-```graphql
+- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
+- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
+
+## Step 1: Initialize Your Project
+
+1. Open your Dev Container and run the following command to initialize your project:
+
+```bash
+substreams init
+```
+
+2. Select the "minimal" project option.
+3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
+
+```yaml
 specVersion: v0.1.0
 package:
   name: my_project_sol
   version: v0.1.0
 
-imports: #Pass your spkg of interest
+imports: # Pass your spkg of interest
   solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
 
 modules:
 
   - name: map_spl_transfers
-    use: solana:map_block #Select corresponding modules available within your spkg
+    use: solana:map_block # Select corresponding modules available within your spkg
     initialBlock: 260000082
 
   - name: map_transactions_by_programid
     use: solana:solana:transactions_by_programid_without_votes
 
 network: solana-mainnet-beta
 
-params: #Modify the param fields to meet your needs
-  #For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
+params: # Modify the param fields to meet your needs
+  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
   map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
 ```
 
-Now see the corresponding subgraph manifest **(`subgraph.yaml`)** using a Substreams package as the data source:
+## Step 2: Generate the Subgraph Manifest
+
+Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
+
+```bash
+substreams codegen subgraph
+```
+
+This generates a `subgraph.yaml` manifest which imports the Substreams package as a data source:
 
 ```yaml
-specVersion: 1.0.0
-description: my-project-sol Substreams-powered-Subgraph
-indexerHints:
-  prune: auto
-schema:
-  file: ./schema.graphql
+...
+
 dataSources:
   - kind: substreams
     name: my_project_sol
     network: solana-mainnet-beta
     source:
       package:
-        moduleName: map_spl_transfers
+        moduleName: map_spl_transfers # Module defined in the substreams.yaml
         file: ./my-project-sol-v0.1.0.spkg
     mapping:
       apiVersion: 0.0.7
@@ -55,53 +73,68 @@ dataSources:
       handler: handleTriggers
 ```
 
-Once your manifests are created, define in the `schema.graphql` the data fields you’d like saved in your subgraph entities:
+## Step 3: Define Entities in `schema.graphql`
+
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
 
 ```graphql
 type MyTransfer @entity {
   id: ID!
   amount: String!
   source: String!
   designation: String!
   signers: [String!]!
 }
 ```
 
-The Protobuf object is generated in AssemblyScript by running `npm run protogen` after running `substreams codegen subgraph` in the devcontainer, so you can import it in the subgraph code. Then transform your decoded Substreams data within the `src/mappings.ts` file, just like in a standard subgraph:
+This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
 
-```tsx
-import { Protobuf } from 'as-proto/assembly'
-import { Events as protoEvents } from './pb/sf/solana/spl/token/v1/Events'
-import { MyTransfer } from '../generated/schema'
+## Step 4: Generate Protobuf Files
 
-export function handleTriggers(bytes: Uint8Array): void {
-  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode)
+To generate Protobuf objects in AssemblyScript, run the following command:
+
+```bash
+npm run protogen
+```
+
+This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
+
+## Step 5: Handle Substreams Data in `mappings.ts`
 
-  for (let i = 0; i < input.data.length; i++) {
-    const event = input.data[i]
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory. The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
 
+```tsx
+import { Protobuf } from "as-proto/assembly";
+import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events";
+import { MyTransfer } from "../generated/schema";
+
+export function handleTriggers(bytes: Uint8Array): void {
+  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode);
+
+  for (let i = 0; i < input.data.length; i++) {
+    const event = input.data[i];
+
     if (event.transfer != null) {
-      let entity_id: string = `${event.txnId}-${i}`
-      const entity = new MyTransfer(entity_id)
-      entity.amount = event.transfer!.instruction!.amount.toString()
-      entity.source = event.transfer!.accounts!.source
-      entity.designation = event.transfer!.accounts!.destination
-
-      if (event.transfer!.accounts!.signer!.single != null) {
-        entity.signers = [event.transfer!.accounts!.signer!.single.signer]
-      } else if (event.transfer!.accounts!.signer!.multisig != null) {
-        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers
+      let entity_id: string = `${event.txnId}-${i}`;
+      const entity = new MyTransfer(entity_id);
+      entity.amount = event.transfer!.instruction!.amount.toString();
+      entity.source = event.transfer!.accounts!.source;
+      entity.designation = event.transfer!.accounts!.destination;
+
+      if (event.transfer!.accounts!.signer!.single != null) {
+        entity.signers = [event.transfer!.accounts!.signer!.single!.signer];
+      } else if (event.transfer!.accounts!.signer!.multisig != null) {
+        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers;
       }
-      entity.save()
+      entity.save();
     }
   }
 }
 ```
 
-Here's what you’re seeing in the `mappings.ts`:
+## Conclusion
 
-1. The bytes containing Substreams data are decoded into the generated `Transactions` object, this object is used like any other AssemblyScript object
-2. Looping over the transactions
-3. Create a new subgraph entity for every transaction
+You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
 
-> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`.
+For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
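
Once deployed, the `MyTransfer` entities defined in Step 3 are exposed through The Graph's auto-generated GraphQL API. A sketch of a query you could run against this subgraph; the plural `myTransfers` field name follows The Graph's naming convention for entity collections:

```graphql
{
  myTransfers(first: 5) {
    id
    amount
    source
    designation
    signers
  }
}
```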

website/pages/en/sps/triggers.mdx

Lines changed: 31 additions & 5 deletions
@@ -1,11 +1,37 @@
-----
+---
 title: Substreams Triggers
 ---
 
--
+Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+
+> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps-intro) to scaffold your first project in the Development Container.
+
+The following example demonstrates how to implement a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
+
+```tsx
+export function handleTransactions(bytes: Uint8Array): void {
+  let transactions = assembly.eth.transaction.v1.Transactions.decode(bytes.buffer).transactions // 1.
+  if (transactions.length == 0) {
+    log.info('No transactions found', [])
+    return
+  }
+
+  for (let i = 0; i < transactions.length; i++) {
+    // 2.
+    let transaction = transactions[i]
+
+    let entity = new Transaction(transaction.hash) // 3.
+    entity.from = transaction.from
+    entity.to = transaction.to
+    entity.save()
+  }
+}
+```
 
-Substreams triggers allow you to integrate Substreams data directly into your subgraph. By importing the [Protobuf definitions](https://substreams.streamingfast.io/documentation/develop/creating-protobuf-schemas#protobuf-overview) emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This enables efficient and streamlined data handling within the subgraph framework.
+Here's what you’re seeing in the `mappings.ts` file:
 
-> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps_intro) to scaffold your first project in the devcontainer.
+1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object can be used like any other AssemblyScript object
+2. The handler loops over the transactions
+3. A new subgraph entity is created for every transaction
 
-To go through a coded example of a trigger based Subgraph, [click here](./triggers_example).
+To go through a detailed example of a trigger-based subgraph, [click here](./triggers-example).
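
For completeness, a minimal `schema.graphql` sketch that would back the `handleTransactions` example above; the entity name and fields mirror what the handler sets, and the field types are assumptions:

```graphql
type Transaction @entity {
  id: ID!       # set from transaction.hash in the handler
  from: String! # assumed type; could also be modeled as Bytes
  to: String!
}
```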

website/pages/en/substreams.mdx

Lines changed: 5 additions & 1 deletion
@@ -6,7 +6,7 @@ title: Substreams
 
 Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network. It offers the following features:
 
-- **Accelerated Indexing**: Substreams reduces subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing.
+- **Accelerated Indexing**: Substreams reduce subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing.
 - **Multi-Chain Support**: Substreams expand indexing capabilities beyond EVM-based chains, supporting ecosystems like Solana, Injective, Starknet, and Vara.
 - **Multi-Sink Support:** Subgraph, Postgres database, Clickhouse, Mongo database
 
@@ -46,3 +46,7 @@ To learn about the latest version of Substreams CLI, which enables developers to
 ### Expand Your Knowledge
 
 - Take a look at the [Ethereum Explorer Tutorial](https://substreams.streamingfast.io/tutorials/evm) to learn about the basic transformations you can create with Substreams.
+
+### Substreams Registry
+
+A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain, similar to the `mapping.ts` file in traditional subgraphs. Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks.
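
As a rough sketch of how you might inspect a registry package before wiring it into a subgraph (using the SPL token package referenced earlier in this changeset; CLI output may vary by version):

```bash
# List the modules exposed by a published Substreams package (.spkg)
substreams info https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
```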
