5 changes: 5 additions & 0 deletions website/pages/ar/sps/_meta.js
@@ -0,0 +1,5 @@
import meta from '../../en/sps/_meta.js'

export default {
...meta,
}
29 changes: 29 additions & 0 deletions website/pages/ar/sps/introduction.mdx
@@ -0,0 +1,29 @@
---
title: Introduction to Substreams-Powered Subgraphs
---

Boost your subgraph’s efficiency and scalability by using [Substreams](/substreams/introduction/) to stream pre-indexed blockchain data.

## Overview

Use a Substreams package (`.spkg`) as a data source to give your subgraph access to a stream of pre-indexed blockchain data. This enables more efficient and scalable data handling, especially with large or complex blockchain networks.

### Specifics

There are two methods of enabling this technology:

1. **Using Substreams [triggers](/sps/triggers/)**: Consume from any Substreams module by importing the Protobuf model through a subgraph handler and move all your logic into a subgraph. This method creates the subgraph entities directly in the subgraph (see the sketch below).

2. **Using [Entity Changes](https://docs.substreams.dev/how-to-guides/sinks/subgraph/graph-out)**: By writing more of the logic into Substreams, you can consume the module's output directly into [graph-node](/indexing/tooling/graph-node/). In graph-node, you can use the Substreams data to create your subgraph entities.

You can choose where to place your logic, either in the subgraph or in Substreams. However, consider what aligns with your data needs: Substreams has a parallelized model, while triggers are consumed linearly in graph-node.
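
As a rough sketch of what method 1 looks like on the subgraph side, the handler below decodes the bytes emitted by a Substreams module into a generated Protobuf object and creates entities from it. All names and import paths here are illustrative assumptions, not a prescribed layout:

```ts
import { Protobuf } from 'as-proto/assembly'
// Hypothetical generated Protobuf binding and schema type; the real
// names and paths depend on your project.
import { Transactions } from './pb/assembly/mydata/v1/Transactions'
import { MyEntity } from '../generated/schema'

export function handleMyModule(bytes: Uint8Array): void {
  // Decode the raw bytes sent by the Substreams module.
  const data = Protobuf.decode<Transactions>(bytes, Transactions.decode)
  for (let i = 0; i < data.transactions.length; i++) {
    // Method 1: entities are created directly in the subgraph handler.
    const entity = new MyEntity(data.transactions[i].hash)
    entity.save()
  }
}
```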

### Additional Resources

Visit the following links for tutorials on using code-generation tooling to build your first end-to-end Substreams project quickly:

- [Solana](https://docs.substreams.dev/tutorials/intro-to-tutorials/solana)
- [EVM](https://docs.substreams.dev/tutorials/intro-to-tutorials/evm)
- [Starknet](https://docs.substreams.dev/tutorials/intro-to-tutorials/starknet)
- [Injective](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/injective)
- [MANTRA](https://docs.substreams.dev/tutorials/intro-to-tutorials/on-cosmos/mantra)
@@ -4,15 +4,17 @@ title: Substreams-Powered Subgraphs FAQ

## What are Substreams?

Developed by [StreamingFast](https://www.streamingfast.io/), Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. Substreams allow you to refine and shape blockchain data for fast and seamless digestion by end-user applications. More specifically, Substreams is a blockchain-agnostic, parallelized, and streaming-first engine, serving as a blockchain data transformation layer. Powered by the [Firehose](https://firehose.streamingfast.io/), it enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](https://substreams.streamingfast.io/reference-and-specs/manifests#sink) their data anywhere.
Substreams is an exceptionally powerful processing engine capable of consuming rich streams of blockchain data. It allows you to refine and shape blockchain data for fast and seamless digestion by end-user applications.

Go to the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.
Specifically, it's a blockchain-agnostic, parallelized, and streaming-first engine that serves as a blockchain data transformation layer. It's powered by [Firehose](https://firehose.streamingfast.io/) and enables developers to write Rust modules, build upon community modules, provide extremely high-performance indexing, and [sink](/substreams/developing/sinks/sinks/) their data anywhere.

Substreams is developed by [StreamingFast](https://www.streamingfast.io/). Visit the [Substreams Documentation](/substreams/introduction/) to learn more about Substreams.

## What are Substreams-powered subgraphs?

[Substreams-powered subgraphs](/subgraphs/cookbook/substreams-powered-subgraphs/) combine the power of Substreams with the queryability of subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations, can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs), which are compatible with subgraph entities.
[Substreams-powered subgraphs](/sps/introduction/) combine the power of Substreams with the queryability of subgraphs. When publishing a Substreams-powered Subgraph, the data produced by the Substreams transformations can [output entity changes](https://github.com/streamingfast/substreams-sink-entity-changes/blob/develop/substreams-entity-change/src/tables.rs) that are compatible with subgraph entities.

If you are already familiar with subgraph development, then note that Substreams-powered subgraphs can then be queried, just as if it had been produced by the AssemblyScript transformation layer, with all the Subgraph benefits, like providing a dynamic and flexible GraphQL API.
If you are already familiar with subgraph development, note that Substreams-powered subgraphs can then be queried just as if they had been produced by the AssemblyScript transformation layer. This provides all the benefits of subgraphs, including a dynamic and flexible GraphQL API.

## How are Substreams-powered subgraphs different from subgraphs?

@@ -2,9 +2,15 @@
title: Substreams Triggers
---

Custom triggers allow you to send data directly into your subgraph mappings file and entities (similar to tables and fields), enabling full use of the GraphQL layer. By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data within your subgraph’s handler, ensuring efficient and streamlined data management within the subgraph framework.
Use Custom Triggers and enable the full use of GraphQL.

> Note: If you haven’t already, visit one of the How-To Guides found [here](/substreams/sps/introduction/) to scaffold your first project in the Development Container.
## Overview

Custom Triggers allow you to send data directly into your subgraph mappings file and entities, which are similar to tables and fields. This enables you to fully use the GraphQL layer.

By importing the Protobuf definitions emitted by your Substreams module, you can receive and process this data in your subgraph's handler. This ensures efficient and streamlined data management within the subgraph framework.

### Defining `handleTransactions`

The following code demonstrates how to define a `handleTransactions` function in a subgraph handler. This function receives raw Substreams bytes as a parameter and decodes them into a `Transactions` object. For each transaction, a new subgraph entity is created.

@@ -28,10 +34,14 @@ export function handleTransactions(bytes: Uint8Array): void {
}
```
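
The body of `handleTransactions` is collapsed in this diff, so as a point of reference, a minimal version consistent with the description above might look like the following sketch. The Protobuf import path and the `MyTransaction` entity name are assumptions for illustration:

```ts
import { Protobuf } from 'as-proto/assembly'
// Assumed paths and names; the actual generated bindings and schema
// types in the project may differ.
import { Transactions as protoTransactions } from './pb/assembly/transactions/v1/Transactions'
import { MyTransaction } from '../generated/schema'

export function handleTransactions(bytes: Uint8Array): void {
  // 1. Decode the raw Substreams bytes into the generated Transactions object.
  const input = Protobuf.decode<protoTransactions>(bytes, protoTransactions.decode)

  // 2. Loop over the transactions.
  for (let i = 0; i < input.transactions.length; i++) {
    const tx = input.transactions[i]

    // 3. Create a new subgraph entity for every transaction.
    const entity = new MyTransaction(tx.hash)
    entity.save()
  }
}
```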

Here's what youre seeing in the `mappings.ts` file:
Here's what you're seeing in the `mappings.ts` file:

1. The bytes containing Substreams data are decoded into the generated `Transactions` object; this object is used like any other AssemblyScript object
2. The handler loops over the transactions
3. A new subgraph entity is created for every transaction

To go through a detailed example of a trigger-based subgraph, [click here](/substreams/sps/tutorial/).
To go through a detailed example of a trigger-based subgraph, [check out the tutorial](/sps/tutorial/).

### Additional Resources

To scaffold your first project in the Development Container, check out one of the [How-To Guides](/sps/introduction/).
@@ -2,16 +2,20 @@
title: 'Tutorial: Set Up a Substreams-Powered Subgraph on Solana'
---

## Prerequisites
In this tutorial, you'll set up a trigger-based Substreams-powered subgraph for a Solana SPL token.

## Get Started

For a video tutorial, check out [How to Index Solana with a Substreams-powered Subgraph](/sps/tutorial/#video-tutorial).

### Prerequisites

Before starting, make sure to:

- Complete the [Getting Started Guide](https://github.com/streamingfast/substreams-starter) to set up your development environment using a Dev Container.
- Be familiar with The Graph and basic blockchain concepts such as transactions and Protobufs.

## Step 1: Initialize Your Project

<VideoEmbed youtube="RmKi-Nq9E_A" />
### Step 1: Initialize Your Project

1. Open your Dev Container and run the following command to initialize your project:

@@ -20,7 +24,6 @@ Before starting, make sure to:
```

2. Select the "minimal" project option.

3. Replace the contents of the generated `substreams.yaml` file with the following configuration, which filters transactions for the Orca account on the SPL token program ID:

```yaml
@@ -47,7 +50,7 @@ params: # Modify the param fields to meet your needs
map_spl_transfers: token_contract:orcaEKTdK7LKz57vaAYr9QeNsVEPfiu6QeMU1kektZE
```

## Step 2: Generate the Subgraph Manifest
### Step 2: Generate the Subgraph Manifest

Once the project is initialized, generate a subgraph manifest by running the following command in the Dev Container:

@@ -74,9 +77,11 @@ dataSources:
handler: handleTriggers
```

## Step 3: Define Entities in `schema.graphql`
### Step 3: Define Entities in `schema.graphql`

Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file. Here is an example:
Define the fields you want to save in your subgraph entities by updating the `schema.graphql` file.

Here is an example:

```graphql
type MyTransfer @entity {
@@ -90,9 +95,11 @@ type MyTransfer @entity {

This schema defines a `MyTransfer` entity with fields such as `id`, `amount`, `source`, `designation`, and `signers`.

## Step 4: Handle Substreams Data in `mappings.ts`
### Step 4: Handle Substreams Data in `mappings.ts`

With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory.

With the Protobuf objects generated, you can now handle the decoded Substreams data in your `mappings.ts` file found in the `./src` directory. The example below demonstrates how to extract to subgraph entities the non-derived transfers associated to the Orca account id:
The example below demonstrates how to extract the non-derived transfers associated with the Orca account ID into subgraph entities:

```ts
import { Protobuf } from 'as-proto/assembly'
@@ -123,7 +130,7 @@ export function handleTriggers(bytes: Uint8Array): void {
}
```
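
Most of the handler body is collapsed in this diff. Purely as an illustrative sketch, assuming generated bindings for the transfer Protobuf and the `MyTransfer` entity from Step 3 (exact module paths, field names, and types may differ), it could look like this:

```ts
import { Protobuf } from 'as-proto/assembly'
// Assumed import paths and message shape; check the generated ./pb
// output for the real names.
import { Transfers as protoTransfers } from './pb/assembly/transfers/v1/Transfers'
import { MyTransfer } from '../generated/schema'

export function handleTriggers(bytes: Uint8Array): void {
  // Decode the raw Substreams bytes into the generated Transfers object.
  const input = Protobuf.decode<protoTransfers>(bytes, protoTransfers.decode)

  for (let i = 0; i < input.transfers.length; i++) {
    const transfer = input.transfers[i]

    // Keep only non-derived transfers, per the description above
    // (the `derived` flag is an assumed field on the message).
    if (transfer.derived) continue

    // Use the transaction signature (assumed) as the entity ID and copy
    // over the schema fields, assumed string-typed here.
    const entity = new MyTransfer(transfer.signature)
    entity.amount = transfer.amount
    entity.source = transfer.source
    entity.designation = transfer.designation
    entity.save()
  }
}
```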

## Step 5: Generate Protobuf Files
### Step 5: Generate Protobuf Files

To generate Protobuf objects in AssemblyScript, run the following command:

@@ -133,8 +140,14 @@ npm run protogen

This command converts the Protobuf definitions into AssemblyScript, allowing you to use them in the subgraph's handler.

## Conclusion
### Conclusion

Congratulations! You've successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can take the next step by customizing your schema, mappings, and modules to fit your specific use case.

### Video Tutorial

<VideoEmbed youtube="RmKi-Nq9E_A" />

You’ve successfully set up a trigger-based Substreams-powered subgraph for a Solana SPL token. You can now further customize your schema, mappings, and modules to suit your specific use case.
### Additional Resources

For more advanced customization and optimizations, check out the official [Substreams documentation](https://substreams.streamingfast.io/tutorials/solana).