29 commits
8c785ff
Add transmission example
Popov72 May 20, 2025
9ecbff3
Add doc for the motion blur post-process
Popov72 May 24, 2025
3c5c872
Add geometry texture VAT example to frame graph doc
Popov72 Jun 10, 2025
26e9d11
Merge branch 'master' of https://github.com/BabylonJS/Documentation i…
Popov72 Jun 10, 2025
cc09ab9
Merge branch 'master' of https://github.com/BabylonJS/Documentation i…
Popov72 Oct 21, 2025
ed8ee83
Update the TAA block doc
Popov72 Oct 22, 2025
1b19cf8
Add an example to create a custom post-process
Popov72 Oct 24, 2025
0ed481d
Merge branch 'master' of https://github.com/BabylonJS/Documentation i…
Popov72 Oct 24, 2025
0fd6516
Add doc about the class framework
Popov72 Oct 28, 2025
feaba5c
Merge branch 'master' of https://github.com/BabylonJS/Documentation i…
Popov72 Oct 28, 2025
3fa9015
Merge branch 'master' of https://github.com/BabylonJS/Documentation i…
Popov72 Nov 1, 2025
e8410aa
Add more frame graph doc
Popov72 Nov 1, 2025
a050dc2
More update
Popov72 Nov 5, 2025
520b8f8
More docs
Popov72 Nov 8, 2025
b3fa17a
Merge branch 'master' of https://github.com/BabylonJS/Documentation i…
Popov72 Nov 8, 2025
2e340f7
Fix typo
Popov72 Nov 9, 2025
cc8534a
Change structure
Popov72 Nov 10, 2025
5bb36b7
Add description for new properties
Popov72 Nov 12, 2025
0d9058c
Fix doc wrt cull passes
Popov72 Nov 13, 2025
6f2d6a4
Address comments
Popov72 Nov 13, 2025
cb09614
Address comments
Popov72 Nov 13, 2025
e571de1
Fix PG
Popov72 Nov 13, 2025
84062d5
Revert change
Popov72 Nov 17, 2025
a6de89a
Fix PGs
Popov72 Nov 18, 2025
c91e7a9
Fix pg
Popov72 Nov 18, 2025
8fef068
Update doc according to latest changes in code
Popov72 Nov 20, 2025
df5a72f
More updates
Popov72 Nov 20, 2025
b827750
Fix VAT PGs
Popov72 Nov 21, 2025
2a1d7e9
Fix doc VAT
Popov72 Nov 21, 2025
20 changes: 18 additions & 2 deletions configuration/structure.json
@@ -659,16 +659,32 @@
        },
        "content": "features/featuresDeepDive/frameGraph/frameGraphBasicConcepts"
      },
      "frameGraphClassFramework": {
        "friendlyName": "Frame Graph Framework Description",
        "children": {
          "frameGraphClassOverview": {
            "friendlyName": "Introduction to Frame Graph classes",
            "children": {},
            "content": "features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphClassOverview"
          },
          "frameGraphTaskList": {
            "friendlyName": "Frame Graph Task List",
            "children": {},
            "content": "features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphTaskList"
          }
        },
        "content": "features/featuresDeepDive/frameGraph/frameGraphClassFramework"
      },
      "frameGraphBlocks": {
        "friendlyName": "Node Render Graph Blocks",
        "children": {
          "frameGraphBlocksGeneralNotes": {
            "friendlyName": "General notes about Render Graph Blocks",
            "children": {},
            "content": "features/featuresDeepDive/frameGraph/frameGraphBlocks/frameGraphBlocksGeneralNotes"
          },
          "frameGraphBlocksDescription": {
            "friendlyName": "Render Graph Blocks Description",
            "children": {},
            "content": "features/featuresDeepDive/frameGraph/frameGraphBlocks/frameGraphBlocksDescription"
          }
@@ -23,14 +23,22 @@ Each task declares its input and output resources.

Resources can be textures, lists of renderable meshes, cameras, or lights. Textures are allocated and managed by the frame graph subsystem rather than directly by the tasks; this allows the framework to optimize texture allocation and reuse textures during the execution of a graph, saving GPU memory.

By default, there is no persistence of resources between each execution of a render graph, unless a resource is specifically labeled as “persistent” (think of a texture that must be reused from one frame to the next). In our implementation, persistent textures are used to implement “ping-pong” textures, where we change the read and write textures with each frame (used to implement the temporal antialiasing task, for example).

To make these ideas concrete, here is a simple graph:

![Basic graph](/img/frameGraph/basic_graph.jpg)

The “Color Texture”, “Depth Texture”, “Camera” and “Object List” nodes are input resources (respectively of type texture, depth texture, camera and object list). “Clear” and “Main Rendering” are two tasks: the first clears a color/depth texture, and the second renders objects into a texture. “Output” is the output buffer (think of it as the screen output).

As a user, the process of creating and using a frame graph is as follows:
* Create a frame graph, either by using the `FrameGraphXXX` classes (see [Frame Graph Framework Description](/features/featuresDeepDive/frameGraph/frameGraphClassFramework)), or by loading a node render graph (see [Node Render Graph Blocks](/features/featuresDeepDive/frameGraph/frameGraphBlocks)).
* Build the frame graph (`await FrameGraph.buildAsync()` or `await NodeRenderGraph.buildAsync()`). Calling these functions also waits until the graph is ready to be displayed (that is, until all tasks and internal states are ready).
<br/>
At this point, the frame graph can be safely executed: call `FrameGraph.execute()`, or simply set `scene.frameGraph = myFrameGraph`, in which case the call to `execute` will be performed by the scene's rendering loop.

Note that in this scenario, you never have to manage passes: you will only need to create passes when you create your own tasks. As a user, you simply use the existing tasks (see [Frame Graph Task List](/features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphTaskList) for the list of existing tasks in the framework), creating an instance of the task, setting its input parameters to reasonable values, and adding the task to the frame graph. See [Introduction to Frame Graph classes](/features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphClassOverview) for more information.
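
To make these steps concrete, here is a minimal sketch of the flow when using the `FrameGraphXXX` classes directly (task creation is elided; the pages that follow walk through complete examples):

```javascript
const frameGraph = new BABYLON.FrameGraph(scene, false); // second parameter: no debug textures

// ... create tasks, set their input parameters and call frameGraph.addTask(...) for each ...

// Build the graph and wait until all tasks and internal states are ready.
await frameGraph.buildAsync();

// Either let the scene's render loop call execute() for you...
scene.frameGraph = frameGraph;
// ...or call frameGraph.execute() yourself at the appropriate time.
```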

### Benefits

A frame graph provides high-level knowledge of the whole frame, which:
@@ -39,7 +47,7 @@
* simplifies asynchronous computation and resource barriers. This is an advantage for native implementations, but for the web, we don't (yet?) have asynchronous computation and we don't have to manage resource barriers ourselves.
* allows for autonomous and efficient rendering modules
* overcomes some of the limitations of our hard-coded pipeline.

<br/>
The last advantage is particularly important, as it allows things that are not possible in our current fixed pipeline, which can be described as follows (this is what the existing `Scene.render` method does):
1. Animate
1. Update cameras
@@ -50,7 +58,7 @@
1. Rendering of RTT declared at camera level
1. Rendering of active meshes
1. Apply post-processing to the camera

<br/>
We have a number of observables that allow you to inject code between these stages, and internal components that add functionality at key points (such as shadows, layers, effect layers, etc.), but the order of tasks is always fixed and strongly centered on the camera, as you can see.
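
For example, a minimal injection point using one of these observables (`onBeforeRenderObservable` is one of them):

```javascript
scene.onBeforeRenderObservable.add(() => {
    // code added here runs every frame, just before the scene is rendered
});
```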

With a frame graph, nothing is defined in advance; you simply create tasks and their interconnections. The camera has no particular status; it is a resource that you can use to construct the graph, in the same way that you can use textures or lists of objects.
@@ -74,7 +82,7 @@ In this mode, the execution of a frame graph largely replaces the flow of operat…
* You must define `Scene.cameraToUseForPointers` for the camera to be used for pointer operations. By default, if nothing is defined in this property, `Scene.activeCamera` is used; but since that property is now `null` most of the time, pointer operations will not work as expected unless you set `Scene.cameraToUseForPointers`.
* You will not be able to define the parameters of a certain number of components via the inspector (such as rendering pipelines, effect layers, post-processing) because these are now simple tasks within a frame graph. As a workaround, if you use a node render graph to generate the frame graph, you will be able to set parameters in the node render graph editor. We will also look at how to update the inspector (when possible), so that we can adjust the settings from this tool.
* Most of the existing observables notified by `Scene.render` are no longer notified. As explained above, the execution of the frame graph replaces much of `Scene.render` (see below), so a lot of code is no longer executed.

<br/>
Regarding the last point, this is what `Scene.render` does when `Scene.frameGraph` is defined:
1. Notifies `Scene.onBeforeAnimationsObservable`
1. Calls `Scene.animate`
@@ -84,7 +92,7 @@
1. Animates the particle systems
1. Executes the frame graph
1. Notifies `Scene.onAfterRenderObservable`

<br/>
As you can see, only 3 observables are notified in this case.

### Use a frame graph in addition to the existing scene render loop
@@ -22,17 +22,13 @@ We now create the frame graph and import the post-process texture into it, so th…
```javascript
const frameGraph = new BABYLON.FrameGraph(scene, true);

const passPostProcessHandle = frameGraph.textureManager.importTexture("pass post-process", passPostProcess.inputTexture.texture);

passPostProcess.onSizeChangedObservable.add(() => {
    frameGraph.textureManager.importTexture("pass post-process", passPostProcess.inputTexture.texture, passPostProcessHandle);
});
```
Note that when the size of the post-process changes (due to a resizing of the window, for example), the texture is recreated, and we therefore have to re-import it into the frame graph. We can pass a texture handle to `importTexture()` so that the texture passed as the second parameter replaces the texture associated with this handle. This way, we don't have to update the frame graph: the post-process pass texture always remains associated with `passPostProcessHandle`.

The next step is to create the bloom, black-and-white and copy-to-backbuffer tasks, add them to the frame graph, and build the graph:
```javascript
// … (bloom and black-and-white task creation collapsed in the diff view: @@ -48,20 +44,24 @@) …
const copyToBackbufferTask = new BABYLON.FrameGraphCopyToBackbufferColorTask("co… // (line truncated in the diff view)
copyToBackbufferTask.sourceTexture = bnwTask.outputTexture;
frameGraph.addTask(copyToBackbufferTask);

engine.onResizeObservable.add(async () => {
    await frameGraph.buildAsync();
});

await frameGraph.buildAsync();
```
There is not much to say here; the code should be self-explanatory.

Refer to [Frame Graph Task List](/features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphTaskList) for detailed explanations of the various frame graph tasks used in the code snippet above.

Finally, we need to execute the frame graph each frame. Since we use the regular rendering output of the scene, the best place is inside a `Scene.onAfterRenderObservable` observer:
```javascript
scene.onAfterRenderObservable.add(() => {
    frameGraph.execute();
});
```

The full PG: <Playground id="#RM56RY#12" title="Frame Graph basic example" description="Basic frame graph example in addition to the scene render loop (manual use of the frame graph classes)"/>
The full PG: <Playground id="#RM56RY#27" image="/img/playgroundsAndNMEs/pg-RM56RY-12.png" title="Frame Graph basic example" description="Basic frame graph example in addition to the scene render loop (manual use of the frame graph classes)"/>

## Using a node render graph

@@ -85,25 +85,25 @@

```javascript
const nrg = await BABYLON.NodeRenderGraph.ParseFromSnippetAsync("#FAPQIH#1", sce… // (line truncated in the diff view)

const frameGraph = nrg.frameGraph;

const setExternalTexture = async () => {
    nrg.getBlockByName("Texture").value = passPostProcess.inputTexture.texture;
    await nrg.buildAsync(false, true, false);
};

passPostProcess.onSizeChangedObservable.add(async () => {
    await setExternalTexture();
});

await setExternalTexture();

scene.onAfterRenderObservable.add(() => {
    frameGraph.execute();
});
```
As above, we create a “pass” post-process so that the scene is rendered into a texture. This texture is set as the value of the block named “Texture”, which is our input texture in the graph.

Note that we have deactivated the automatic building of the graph when the engine is resized (the **rebuildGraphOnEngineResize** parameter in the call to `ParseFromSnippetAsync()`), because when the screen is resized we must first update the texture of the “Texture” block before rebuilding the graph: `passPostProcess.onSizeChangedObservable` replaces `engine.onResizeObservable`.

The rest of the code should be simple to understand.
Also note the parameters passed to `nrg.buildAsync(dontBuildFrameGraph = false, waitForReadiness = true, setAsSceneFrameGraph = false)`: the first two parameters have their default values, but the third is set to *false* so that the frame graph is not defined at the scene level!

The full PG: <Playground id="#RM56RY#21" title="Frame Graph basic example" description="Basic frame graph example in addition to the scene render loop (node render graph)"/>
The full PG: <Playground id="#RM56RY#28" image="/img/playgroundsAndNMEs/pg-RM56RY-21.png" title="Frame Graph basic example" description="Basic frame graph example in addition to the scene render loop (node render graph)"/>
@@ -16,6 +16,8 @@

```javascript
// … (beginning of this snippet collapsed in the diff view) …
scene.cameraToUseForPointers = camera;
```
The second parameter of the `FrameGraph` constructor instructs the framework to create debugging textures for the textures created by the frame graph, which you can browse in the inspector. This can help you debug and understand what's going on in your code!

We set `scene.frameGraph = frameGraph` so that the frame graph is used instead of the usual scene rendering loop.
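
For reference, both points correspond to two lines of the snippet collapsed above; the constructor call has the same form as in the previous example:

```javascript
const frameGraph = new BABYLON.FrameGraph(scene, true); // true: create debug textures, browsable in the inspector
scene.frameGraph = frameGraph; // the frame graph now drives rendering instead of the usual scene render loop
```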

Let's create a color and depth texture, which will be used to render the meshes:
```javascript
const colorTexture = frameGraph.textureManager.createRenderTargetTexture("color", {
    // … (remainder of this snippet collapsed in the diff view) …
```

@@ -50,6 +52,8 @@ You can see that we create the textures through the `frameGraph.textureManager`…

This is why `frameGraph.textureManager.createRenderTargetTexture` returns a texture handle (a number) and not a real texture object: most of the frame graph framework methods that deal with textures use texture handles!

Refer to [Introduction to Frame Graph classes](/features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphClassOverview#texture-handles) for more information about texture handles.

Note that you can import an existing texture into a frame graph by calling `frameGraph.textureManager.importTexture()`, and this method will return a texture handle that you can use as input for frame graph tasks. There is also the inverse method `frameGraph.textureManager.getTextureFromHandle()` which allows you to obtain the real texture object from a texture handle (useful when you use a frame graph at the same time as the scene render loop - see the following section for more information).
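
As a short sketch of this round trip, using only the two methods named above (`existingTexture` stands for a texture you already own):

```javascript
// Importing a texture returns a texture handle (a plain number)...
const handle = frameGraph.textureManager.importTexture("imported texture", existingTexture);

// ...and getTextureFromHandle() is the inverse operation, returning the real texture object.
const texture = frameGraph.textureManager.getTextureFromHandle(handle);
```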

An important property of the object passed to the `createRenderTargetTexture` method is `sizeIsPercentage`: if it is `true`, it means that the size values are percentages instead of fixed pixel sizes. These percentages are related to the size of the output screen. If you set `width=height=100`, this means that the texture will be created with the same size as the output screen. If you set these values to `50`, the texture will be created with half the size of the screen. Most of the time, you will want to set `sizeIsPercentage=true` to keep your frame graph independent of the output size.
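
For illustration, a texture at half the screen resolution could be declared as follows — a sketch that assumes the same creation-options shape as the (collapsed) snippet above:

```javascript
const halfResHandle = frameGraph.textureManager.createRenderTargetTexture("halfRes", {
    size: { width: 50, height: 50 }, // interpreted as percentages of the output screen...
    sizeIsPercentage: true, // ...because of this flag
});
```
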
Expand All @@ -67,6 +71,8 @@ frameGraph.addTask(clearTask);
```
This code creates a "clear" task, configures it and adds it to the frame graph.

Refer to [Frame Graph Task List](/features/featuresDeepDive/frameGraph/frameGraphClassFramework/frameGraphTaskList) for detailed explanations of the various frame graph tasks used in the code snippets on this page.

The main task is the rendering of the meshes:
```javascript
const rlist = {
    // … (remainder of this snippet, and the text that follows it, collapsed in the diff view) …
```

@@ -94,22 +100,18 @@

```javascript
copyToBackbufferTask.sourceTexture = renderTask.outputTexture;

frameGraph.addTask(copyToBackbufferTask);
```
Once all tasks have been added to the frame graph, you must build the graph by calling `await FrameGraph.buildAsync()`. This creates the various passes that will be executed when `FrameGraph.execute()` is called and ensures that everything is ready before the graph can be executed (among other things, it allocates textures).

Finally, don't forget to handle screen resizing:
```javascript
engine.onResizeObservable.add(async () => {
    await frameGraph.buildAsync();
});

await frameGraph.buildAsync();
```

Here's the PG corresponding to this example: <Playground id="#6HFJ0J" image="/img/playgroundsAndNMEs/pg-9YU4C5-12.png" title="Frame Graph basic example" description="Basic frame graph example in replacement of the scene render loop (manual use of the frame graph classes)"/>

## Using a node render graph

@@ -122,15 +124,13 @@ The javascript code:
```javascript
const nrg = await BABYLON.NodeRenderGraph.ParseFromSnippetAsync("#CCDXLX", scene);

await nrg.buildAsync();
```
That's all you need to make it work with a node render graph!

The full PG: <Playground id="#9YU4C5#11" title="Frame Graph basic example" description="Basic frame graph example in replacement of the scene render loop (node render graph)"/>
The full PG: <Playground id="#9YU4C5#113" image="/img/playgroundsAndNMEs/pg-9YU4C5-11.png" title="Frame Graph basic example" description="Basic frame graph example in replacement of the scene render loop (node render graph)"/>

By default, calling `nrg.buildAsync()` will also assign the frame graph to `scene.frameGraph`.

For more complicated examples, you may need to pass a third parameter to `NodeRenderGraph.ParseFromSnippetAsync()` to configure the node render graph:
```javascript
// … (snippet collapsed in the diff view) …
```
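
Since that snippet is collapsed in this view, here is a hedged sketch of what such a configured call could look like. The **rebuildGraphOnEngineResize** option name comes from the previous page; treating it as a field of an options object (and that object's overall shape) is an assumption — check the API reference for the exact signature:

```javascript
const nrg = await BABYLON.NodeRenderGraph.ParseFromSnippetAsync("#CCDXLX", scene, {
    rebuildGraphOnEngineResize: false, // rebuild manually instead of on every engine resize
});

await nrg.buildAsync();
```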