
Commit a176e90

HomelessDinosaur authored and davemooreuws committed
fix min width indentation issues and add a feature image
1 parent 258b35c commit a176e90


3 files changed: +16 -5 lines changed


docs/guides/python/blender-render.mdx

Lines changed: 15 additions & 4 deletions
@@ -3,6 +3,11 @@ description: Use the Nitric framework to build a service for rendering Blender s
 tags:
   - API
   - AI & Machine Learning
+image: /docs/images/guides/blender-render/featured.png
+image_alt: 'AI Podcast Part 1 Banner'
+featured:
+  image: /docs/images/guides/blender-render/featured.png
+  image_alt: 'AI Podcast Part 1 featured image'
 languages:
   - python
 published_at: 2024-11-07
@@ -11,7 +16,7 @@ updated_at: 2024-11-07
 
 # Use Cloud GPUs for rendering your Blender projects
 
-This example shows how you can create a remote Blender rendering application using Blender's Python interface.
+This example shows how you can create a remote [Blender](https://www.blender.org/) rendering application using Blender's Python interface.
 
 By using the cloud you can render your Blender scenes on infrastructure that scales and with CPU or GPU resources you might not have access to locally.

@@ -200,7 +205,7 @@ async def write_render(ctx: HttpContext):
 
     return ctx
 
-Nitric.run()
+Nitric.run()
 ```
 
 We will add a storage listener which will be triggered by files being added to the `blend_bucket`. This is so we can trigger the rendering job when the rendering metadata and the `.blend` file are added to the bucket. By making this start from the listener instead of the API, we can set up workflows where rendering could be triggered from adding files to buckets manually.
@@ -846,13 +851,19 @@ We can test our application locally using:
 nitric run
 ```
 
-You can then use any HTTP client capable of sending binary data with the request, like the Nitric [local dashboard](/get-started/foundations/projects/local-development#local-dashboard).
+We can then use any HTTP client capable of sending binary data with the request, like the Nitric [local dashboard](/get-started/foundations/projects/local-development#local-dashboard). Start by making a request using a static `.blend` scene:
 
 ```bash
 curl --request PUT --data-binary "@cube.blend" http://localhost:4001/cube
 ```
 
-To render an animation:
+We can then use the following request to render an animation. We have modified the render settings by setting
+
+- animate: true
+- device: GPU
+- engine: CYCLES
+- fps: 30
+- file_format: FFMPEG
 
 ```bash
 curl --request PUT --data-binary "@animation.blend" "http://localhost:4001/animation?animate=true&device=GPU&engine=CYCLES&fps=30&file_format=FFMPEG"
Featured image (featured.png, 20.1 KB): binary file not shown.
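For context on the storage listener mentioned in the diff above, here is a minimal sketch of what a Nitric Python bucket notification handler looks like. It assumes the Nitric Python SDK's bucket `on()` notification API; the bucket name `blend-bucket` and the handler body are illustrative placeholders rather than the guide's actual code.

```python
from nitric.resources import bucket
from nitric.application import Nitric

# Bucket that receives the .blend file and its rendering metadata
# (name is illustrative, not taken from the guide)
blend_bucket = bucket('blend-bucket')

# Fires whenever a file is written to the bucket, whether it was uploaded
# through the API or added to the bucket manually.
@blend_bucket.on('write', '*')
async def start_render(ctx):
    print(f'File written to blend bucket: {ctx.req.key}')
    # Placeholder: start the Blender render for this key here
    return ctx

Nitric.run()
```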

src/components/code/annotations/collapse.tsx

Lines changed: 1 addition & 1 deletion
@@ -52,7 +52,7 @@ export const collapseTrigger: AnnotationHandler = {
     const icon = props.data?.icon as React.ReactNode
     return (
       <div className="table-row">
-        <span className="table-cell w-5 text-center">{icon}</span>
+        <span className="table-cell min-w-5 text-center">{icon}</span>
         <div className="table-cell">
           <InnerLine merge={props} />
         </div>
