1 change: 1 addition & 0 deletions website/pages/en/_meta.js
@@ -33,6 +33,7 @@ export default {
title: 'Substreams',
},
substreams: '',
sps: 'Substreams-powered Subgraphs',
'---4': {
type: 'separator',
},
5 changes: 5 additions & 0 deletions website/pages/en/sps/_meta.js
@@ -0,0 +1,5 @@
export default {
'sps-intro': '',
'triggers': '',
'triggers-example': ''
}
18 changes: 18 additions & 0 deletions website/pages/en/sps/sps-intro.mdx
@@ -0,0 +1,18 @@
---
title: Introduction
---

By leveraging a Substreams package (`.yaml`) as a data source, your subgraph gains access to pre-extracted, indexed blockchain data, enabling more efficient and scalable data handling, especially when dealing with large or complex blockchain networks.
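
Concretely, a minimal sketch of such a data source in `subgraph.yaml` might look like the following; the package file and module names here are placeholders, and the full working example appears in the triggers example page:

```yaml
dataSources:
  - kind: substreams # the Substreams package is the data source
    name: my_project # placeholder name
    network: mainnet # placeholder network
    source:
      package:
        moduleName: map_events # hypothetical module inside the package
        file: ./my-package-v0.1.0.spkg # hypothetical precompiled package
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```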

This technology opens up more efficient and versatile indexing for diverse blockchain environments. For more information on how to build a Substreams-powered subgraph, [click here](./triggers). You can also visit the following links for How-To Guides on using code-generation tooling to scaffold your first end-to-end project quickly:

- [Solana](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/solana)
- [EVM](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/evm)
- [Injective](https://substreams.streamingfast.io/documentation/how-to-guides/intro-your-first-application/injective)

**Public Substreams packages**

A Substreams package is a precompiled binary file that defines the specific data you want to extract from the blockchain—similar to the `mapping.ts` file in traditional subgraphs.

Visit [substreams.dev](https://substreams.dev/) to explore a growing collection of ready-to-use Substreams packages across various blockchain networks that can be easily integrated into your subgraph. If you can’t find a suitable Substreams package and want to build your own, click [here](https://thegraph.com/docs/en/substreams/) for detailed instructions on creating a custom package tailored to your needs.
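
As a quick illustration of how a published package plugs in, it is referenced from the `imports` section of your Substreams manifest (`substreams.yaml`); this sketch reuses the SPL token package URL from the triggers example, and the `solana` alias is arbitrary:

```yaml
imports:
  # Modules from the imported package are then selected with the alias as a prefix, e.g. solana:map_block
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg
```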

111 changes: 111 additions & 0 deletions website/pages/en/sps/triggers-example.mdx
@@ -0,0 +1,111 @@
---
title: Substreams Trigger Example
---

Here you’ll walk through a Solana-based example of setting up your Substreams-powered subgraph project. If you haven’t already, first check out the [Getting Started Guide](https://github.com/streamingfast/substreams/blob/enol/how-to-guides/docs/new/how-to-guides/intro-how-to-guides.md) for more information on how to initialize your project.

Consider the following example of a Substreams manifest (`substreams.yaml`), a configuration file similar to `subgraph.yaml`, that uses the SPL Token program ID:

```yaml
specVersion: v0.1.0
package:
  name: my_project_sol
  version: v0.1.0

imports: # Pass your spkg of interest
  solana: https://github.com/streamingfast/substreams-solana-spl-token/raw/master/tokens/solana-spl-token-v0.1.0.spkg

modules:
  - name: map_spl_transfers
    use: solana:map_block # Select corresponding modules available within your spkg
    initialBlock: 260000082

  - name: map_transactions_by_programid
    use: solana:solana:transactions_by_programid_without_votes

network: solana-mainnet-beta

params: # Modify the param fields to meet your needs
  # For program_id: TokenkegQfeZyiNwAJbNbGKPFXCWuBvf9Ss623VQ5DA
  map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```



Now see the corresponding subgraph manifest **(`subgraph.yaml`)** using a Substreams package as the data source:

```yaml
specVersion: 1.0.0
description: my-project-sol Substreams-powered-Subgraph
indexerHints:
  prune: auto
schema:
  file: ./schema.graphql
dataSources:
  - kind: substreams
    name: my_project_sol
    network: solana-mainnet-beta
    source:
      package:
        moduleName: map_spl_transfers
        file: ./my-project-sol-v0.1.0.spkg
    mapping:
      apiVersion: 0.0.7
      kind: substreams/graph-entities
      file: ./src/mappings.ts
      handler: handleTriggers
```

Once your manifests are created, define in the `schema.graphql` the data fields you’d like saved in your subgraph entities:

```graphql
type MyTransfer @entity {
  id: ID!
  amount: String!
  source: String!
  designation: String!
  signers: [String!]!
}
```

After running `substreams codegen subgraph` in the devcontainer, run `npm run protogen` to generate the Protobuf objects in AssemblyScript so they can be imported in the subgraph code. Then transform your decoded Substreams data within the `src/mappings.ts` file, just as you would in a standard subgraph:

```tsx
import { Protobuf } from "as-proto/assembly";
import { Events as protoEvents } from "./pb/sf/solana/spl/token/v1/Events";
import { MyTransfer } from "../generated/schema";

export function handleTriggers(bytes: Uint8Array): void {
  const input: protoEvents = Protobuf.decode<protoEvents>(bytes, protoEvents.decode);

  for (let i = 0; i < input.data.length; i++) {
    const event = input.data[i];

    if (event.transfer != null) {
      let entity_id: string = `${event.txnId}-${i}`;
      const entity = new MyTransfer(entity_id);
      entity.amount = event.transfer!.instruction!.amount.toString();
      entity.source = event.transfer!.accounts!.source;
      entity.designation = event.transfer!.accounts!.destination;

      if (event.transfer!.accounts!.signer!.single != null) {
        entity.signers = [event.transfer!.accounts!.signer!.single!.signer];
      } else if (event.transfer!.accounts!.signer!.multisig != null) {
        entity.signers = event.transfer!.accounts!.signer!.multisig!.signers;
      }
      entity.save();
    }
  }
}
```

Here's what you’re seeing in `mappings.ts`:

1. The bytes containing Substreams data are decoded into the generated `Events` object, which is used like any other AssemblyScript object
2. The handler loops over the transactions
3. A new subgraph entity is created for every transfer

> Note: It's beneficial to have more of your logic in Substreams, as it allows for a parallelized model, whereas triggers are linearly consumed in `graph-node`.
9 changes: 9 additions & 0 deletions website/pages/en/sps/triggers.mdx
@@ -0,0 +1,9 @@
---
title: Substreams Triggers
---

Substreams triggers allow you to integrate Substreams data directly into your subgraph. By importing the [Protobuf definitions](https://substreams.streamingfast.io/documentation/develop/creating-protobuf-schemas#protobuf-overview) emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This enables efficient and streamlined data handling within the subgraph framework.
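
As a rough sketch of the shape of such a handler, assuming Protobuf definitions generated under `./pb/...` with an `Events` message (as in the triggers example), the subgraph receives the raw module output as bytes and decodes it with the generated code:

```tsx
import { Protobuf } from "as-proto/assembly";
// Generated Protobuf definitions; the path and message name are illustrative
import { Events } from "./pb/sf/solana/spl/token/v1/Events";

export function handleTriggers(bytes: Uint8Array): void {
  // Decode the raw Substreams output into the generated Protobuf object
  const input: Events = Protobuf.decode<Events>(bytes, Events.decode);
  // From here, map the decoded data onto subgraph entities and save them
}
```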

> Note: If you haven’t already, visit one of the How-To Guides found [here](./sps-intro) to scaffold your first project in the devcontainer.

To go through a coded example of a trigger-based subgraph, [click here](./triggers-example).
6 changes: 4 additions & 2 deletions website/pages/en/substreams.mdx
@@ -4,9 +4,11 @@ title: Substreams

![Substreams Logo](/img/substreams-logo.png)

Substreams is a powerful blockchain indexing technology developed for The Graph Network. It enables developers to write Rust modules, compose data streams alongside the community, and provide extremely high-performance indexing due to parallelization in a streaming-first approach.
Substreams is a powerful blockchain indexing technology designed to enhance performance and scalability within The Graph Network. It offers the following features:

With Substreams, developers can quickly extract data from different blockchains (Ethereum, BNB, Solana, ect.) and send it to various locations of their choice, such as a Postgres database, a Mongo database, or a Subgraph. Additionally, Substreams packages enable developers to specify which data they want to extract from the blockchain.
- **Accelerated Indexing**: Substreams reduces subgraph indexing time thanks to a parallelized engine, enabling faster data retrieval and processing.
- **Multi-Chain Support**: Substreams expands indexing capabilities beyond EVM-based chains, supporting ecosystems like Solana, Injective, Starknet, and Vara.
- **Multi-Sink Support**: Subgraphs, Postgres databases, Clickhouse, and Mongo databases.

## How Substreams Works in 4 Steps
