Commit 48598b0

Giuliano-1, idalithb, and benface authored
Updates to Substreams docs and to New Chain Integrations (#849)
* Updates to Substreams docs and to New Chain Integrations
* First round of copy, formatting, and build edits.
* quick-start, intro, and publishing done
* Dev Container, Solana, Trans copy edits done. Found several broken links need to be fixed, but need to identify source.
* Solana Account Changes & Sinks done (more broken links need fixing)
* adding sps section (as requested) & some copy edits
* Updating copy & some links
* adjusting for all languages
* removing one solana doc
* Update website/pages/en/_meta.js
  Co-authored-by: Benoît Rouleau <[email protected]>
* fixes
* Updating sps links
* Fixed links
* fixed minimal explanation
* fix
* fix to add studio link

Co-authored-by: Idalith Bustos <[email protected]>
Co-authored-by: benface <[email protected]>
Co-authored-by: Idalith <[email protected]>
1 parent db4fd82 commit 48598b0

File tree: 458 files changed (+14422, -11171 lines)


website/pages/ar/sps/_meta.js

Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+import meta from '../../en/sps/_meta.js'
+
+export default {
+  ...meta,
+}

website/pages/ar/sps/introduction.mdx

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@
+---
+title: Introduction to Substreams-Powered Subgraphs
+---
+
+Boost your subgraph’s efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.
+
+## Overview
+
+Use a Substreams package (`.spkg`) as a data source to give your subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.
+
+### Specifics
+
+There are two methods of enabling this technology:
+
+1. **Using Substreams [triggers](/substreams/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a subgraph handler, and move all your logic into the subgraph. This method creates the subgraph entities directly in the subgraph.
+
+2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly in [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your subgraph entities.
+
+You can choose where to place your logic, either in the subgraph or in Substreams. However, consider what aligns with your data needs: Substreams has a parallelized model, while triggers are consumed linearly in graph-node.
+
+### Additional Resources
+
+Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:
+
+- [Solana](https://docs.substreams.dev/tutorials/intro-to-tutorials/solana)
+- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
+- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
+- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
+- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
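The second method above (Entity Changes) can be sketched in plain TypeScript. The shape below is only loosely modeled on the substreams-sink-entity-changes output and is not the exact Protobuf schema; all names (`EntityChange`, `applyEntityChanges`, the field layout) are hypothetical stand-ins for how graph-node applies a batch of changes to its store.

```typescript
// Illustrative sketch, NOT the real substreams-sink-entity-changes schema:
// the Substreams module emits a stream of entity changes, and graph-node
// applies them to the store to materialize subgraph entities.

type Operation = 'CREATE' | 'UPDATE' | 'DELETE'

interface EntityChange {
  entity: string                 // entity type from schema.graphql
  id: string                     // entity ID
  operation: Operation
  fields: Record<string, string> // field name -> new value
}

// graph-node-side sketch: apply a batch of changes to an in-memory store.
const store = new Map<string, Record<string, string>>()

function applyEntityChanges(changes: EntityChange[]): void {
  for (const change of changes) {
    const key = change.entity + ':' + change.id
    if (change.operation === 'DELETE') {
      store.delete(key)
    } else {
      // CREATE and UPDATE both upsert: merge new field values over old ones.
      const existing = store.get(key) || {}
      store.set(key, { ...existing, ...change.fields })
    }
  }
}
```

Because the changes arrive as a stream, applying them in order is all graph-node needs to keep entities consistent, even though the Substreams side computed them in parallel.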

website/pages/en/substreams/sps/sps-faq.mdx renamed to website/pages/ar/sps/sps-faq.mdx

Lines changed: 6 additions & 4 deletions
@@ -4,15 +4,17 @@ title: Substreams-Powered Subgraphs FAQ
 
 ## What are Substreams?
 
-Developed by [StreamingFast](https://www.streamingfast.io/), Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. Substreams allow you to refine and shape blockchain data for fast and seamless digestion by end-user applications. More specifically, Substreams is a blockchain-agnostic, parallelized, and streaming-first engine, serving as a blockchain data transformation layer. Powered by the [Firehose](https://firehose.streamingfast.io/), it enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) their data anywhere.
+Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.
 
-Go to the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
+Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/), and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/sinks/) their data anywhere.
+
+Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
 
 ## What are Substreams-powered subgraphs?
 
-[Substreams-powered subgraphs](/subgraphs/cookbook/substreams-powered-subgraphs/) combine the power of Substreams with the queryability of subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations, can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs), which are compatible with subgraph entities.
+[Substreams-powered subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of subgraphs. When publishing a Substreams-powered subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with subgraph entities.
 
-If you are already familiar with subgraph development, then note that Substreams-powered subgraphs can then be queried, just as if it had been produced by the AssemblyScript transformation layer, with all the Subgraph benefits, like providing a dynamic and flexible GraphQL API.
+If you are already familiar with subgraph development, note that Substreams-powered subgraphs can be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of subgraphs, including a dynamic and flexible GraphQL API.
 
 ## How are Substreams-powered subgraphs different from subgraphs?

website/pages/en/substreams/sps/triggers.mdx renamed to website/pages/ar/sps/triggers.mdx

Lines changed: 14 additions & 4 deletions
@@ -2,9 +2,15 @@
 title: Substreams Triggers
 ---
 
-Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
+Use Custom Triggers to enable full use of the GraphQL layer.
 
-> Note: If you haven’t already, visit one of the How-To Guides found [here](/substreams/sps/introduction/) to scaffold your first project in the Development Container.
+## Overview
+
+Custom Triggers allow you to send data directly into your subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.
+
+By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This ensures efficient and streamlined data management within the subgraph framework.
+
+### Defining `handleTransactions`
 
 The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.
 
@@ -28,10 +34,14 @@ export function handleTransactions(bytes: Uint8Array): void {
 }
 ```
 
-Here's what youre seeing in the `mappings.ts` file:
+Here's what you're seeing in the `mappings.ts` file:
 
 1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
 2. Looping over the transactions
 3. Create a new subgraph entity for every transaction
 
-To go through a detailed example of a trigger-based subgraph, [click here](/substreams/sps/tutorial/).
+To go through a detailed example of a trigger-based subgraph, [check out the tutorial](/sps/tutorial/).
+
+### Additional Resources
+
+To scaffold your first project in the Development Container, check out one of the [How-To Guides](/sps/introduction/).
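The three numbered steps in the triggers doc above can be mirrored in plain TypeScript. The real handler is AssemblyScript using as-proto and graph-ts; the `decodeTransactions` function and the `entities` map below are illustrative stand-ins (the "wire format" here is just ASCII JSON for demonstration, not real Protobuf).

```typescript
// Plain-TypeScript sketch of the handler's three steps. The generated
// `Transactions` Protobuf class and the graph-node store are replaced by
// illustrative stand-ins; the real code uses as-proto and graph-ts.

interface Transaction {
  hash: string
}

interface Transactions {
  transactions: Transaction[]
}

// Stand-in for `Protobuf.decode<Transactions>(bytes, Transactions.decode)`:
// here the incoming bytes are assumed to be ASCII JSON.
function decodeTransactions(bytes: Uint8Array): Transactions {
  let json = ''
  for (let i = 0; i < bytes.length; i++) {
    json += String.fromCharCode(bytes[i])
  }
  return JSON.parse(json) as Transactions
}

// Stand-in entity store, keyed by transaction hash.
const entities = new Map<string, { id: string }>()

function handleTransactions(bytes: Uint8Array): void {
  const decoded = decodeTransactions(bytes) // 1. decode the raw bytes
  for (const tx of decoded.transactions) {  // 2. loop over the transactions
    entities.set(tx.hash, { id: tx.hash })  // 3. create one entity per transaction
  }
}
```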

website/pages/de/substreams/sps/tutorial.mdx renamed to website/pages/ar/sps/tutorial.mdx

Lines changed: 26 additions & 13 deletions
@@ -2,16 +2,20 @@
 title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
 ---
 
-## Prerequisites
+Successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token.
+
+## Get Started
+
+For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).
+
+### Prerequisites
 
 Before starting, make sure to:
 
 - Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
 - Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.
 
-## Step 1: Initialize Your Project
-
-<VideoEmbed youtube="RmKi-Nq9E_A" />
+### Step 1: Initialize Your Project
 
 1. Open your Dev Container and run the following command to initialize your project:
 
@@ -20,7 +24,6 @@ Before starting, make sure to:
 ```
 
 2. Select the "minimal" project option.
-
 3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:
 
 ```yaml
@@ -47,7 +50,7 @@ params: # Modify the param fields to meet your needs
 map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
 ```
 
-## Step 2: Generate the Subgraph Manifest
+### Step 2: Generate the Subgraph Manifest
 
 Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:
 
@@ -74,9 +77,11 @@ dataSources:
 handler: handleTriggers
 ```
 
-## Step 3: Define Entities in `schema.graphql`
+### Step 3: Define Entities in `schema.graphql`
 
-Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
+Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file.
+
+Here is an example:
 
 ```graphql
 type MyTransfer @entity {
@@ -90,9 +95,11 @@ type MyTransfer @entity {
 
 This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.
 
-## Step 4: Handle Substreams Data in `mappings.ts`
+### Step 4: Handle Substreams Data in `mappings.ts`
+
+With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file, found in the `./src` directory.
 
-With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract to subgraph entities the non-derived transfers associated to the Orca account id:
+The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:
 
 ```ts
 import { Protobuf } from 'as-proto/assembly'
@@ -123,7 +130,7 @@ export function handleTriggers(bytes: Uint8Array): void {
 }
 ```
 
-## Step 5: Generate Protobuf Files
+### Step 5: Generate Protobuf Files
 
 To generate Protobuf objects in AssemblyScript, run the following command:
 
@@ -133,8 +140,14 @@ npm run protogen
 
 This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.
 
-## Conclusion
+### Conclusion
+
+Congratulations! You've successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.
+
+### Video Tutorial
+
+<VideoEmbed youtube="RmKi-Nq9E_A" />
 
-You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
+### Additional Resources
 
 For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).
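The filtering that Step 4 of the tutorial performs can be sketched in plain TypeScript: keep only non-derived transfers that touch the tracked Orca account. The `Transfer` shape and field names (`isDerived`, `source`, `destination`) are hypothetical stand-ins for the generated Protobuf objects, not the real module output.

```typescript
// Hypothetical sketch of the Step 4 filter; field names are illustrative
// stand-ins for the generated Protobuf objects.

interface Transfer {
  source: string
  destination: string
  amount: string
  isDerived: boolean // derived-account transfers are skipped
}

// The Orca account tracked by the tutorial's substreams.yaml params.
const ORCA_ACCOUNT = 'orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE'

// Keep only non-derived transfers whose source or destination matches
// the tracked account.
function filterTransfers(transfers: Transfer[], account: string): Transfer[] {
  return transfers.filter(
    (t) => !t.isDerived && (t.source === account || t.destination === account),
  )
}
```

In the real handler, each transfer that survives this filter becomes one `MyTransfer` entity.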
