Commit 087154a
Update README.md
1 parent 2aea7bf commit 087154a
README.md: 1 file changed, 37 additions & 37 deletions
# Getting Started

Diffusion Studio is an open-source, browser-based video editing library that allows developers to automate video editing workflows at scale, build custom editing applications, or seamlessly integrate video processing capabilities into existing projects.

## Documentation

Visit https://docs.diffusion.studio to view the full documentation.

## Why Use Diffusion Studio

💻 100% **client-side**<br/>
📦 Fully **extensible** with [Pixi.js](https://pixijs.com/)<br/>
🩸 Blazingly **fast** WebGPU/WebGL renderer<br/>
🏎️ **Cutting edge** WebCodecs export<br/>

## Getting Started

```sh
npm install @diffusionstudio/core
```

## Basic Usage

Let's take a look at an example:
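The example itself falls outside this excerpt. As a rough, self-contained sketch of the kind of declarative model described here (clips with start/stop frames added to a composition), all class names below are hypothetical illustrations rather than the real `@diffusionstudio/core` API:

```typescript
// Minimal mock of a declarative composition model (hypothetical names,
// NOT the real @diffusionstudio/core API): clips carry a start/stop in
// frames, and the composition's duration is the latest stop time.
class Clip {
  constructor(public name: string, public start: number, public stop: number) {}
}

class Composition {
  clips: Clip[] = [];

  add(clip: Clip): this {
    this.clips.push(clip);
    return this;
  }

  // Total duration is the largest stop time across all clips.
  get duration(): number {
    return this.clips.reduce((max, c) => Math.max(max, c.stop), 0);
  }
}

const composition = new Composition();
composition.add(new Clip("video", 0, 150));
composition.add(new Clip("title", 0, 90));

console.log(composition.duration); // 150
```

The point of the declarative style is that clips describe *when* they appear; the renderer decides *how* to draw each frame.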

![Composition Visualization](./assets/composition.png)

Each track contains zero or more clips of a single type in ascending chronological order.

A track will be created implicitly with `composition.add(clip)`; however, you can also create them manually like this:

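The manual-track snippet falls outside this excerpt; as a self-contained illustration of the rule just described (clips of one type, kept in ascending order, never overlapping), using hypothetical names rather than the library's API:

```typescript
// Sketch of the track invariant described above (hypothetical names, not
// the library API): a track keeps clips in ascending chronological order
// and rejects any clip that would overlap an existing one.
interface TimedClip {
  start: number; // frames
  stop: number;  // frames
}

class Track<T extends TimedClip> {
  clips: T[] = [];

  add(clip: T): void {
    const overlaps = this.clips.some(
      (c) => clip.start < c.stop && c.start < clip.stop
    );
    if (overlaps) throw new Error("Clips within a track cannot overlap");
    this.clips.push(clip);
    // Keep ascending chronological order regardless of insertion order.
    this.clips.sort((a, b) => a.start - b.start);
  }
}

const track = new Track<TimedClip>();
track.add({ start: 60, stop: 120 });
track.add({ start: 0, stop: 60 });   // fine: touching edges do not overlap
// track.add({ start: 30, stop: 90 }); // would throw: overlaps both clips

console.log(track.clips.map((c) => c.start).join(",")); // "0,60"
```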
https://github.com/user-attachments/assets/7a943407-e916-4d9f-b46a-3163dbff44c3

## How does Diffusion Studio compare to Remotion and Motion Canvas?

**Remotion** is a React-based video creation tool that transforms the entire DOM into videos. It's particularly suited for beginners, as web developers can start creating videos using the skills they already have.

**Motion Canvas** uses a Canvas 2D implementation for rendering. It is intended as a standalone editor for creating production-quality animations. It features a unique imperative API that adds elements to the timeline procedurally, rather than relying on keyframes like traditional video editing tools. This makes Motion Canvas ideal for crafting detailed, animated videos.

In contrast, **Diffusion Studio** is not a framework with a visual editing interface but a video editing library that can be integrated into existing projects. It operates entirely on the client-side, eliminating the need for additional backend infrastructure. Diffusion Studio is also dedicated to supporting the latest rendering technologies, including WebGPU, WebGL, and WebCodecs. If a feature you need isn't available, you can easily extend it using [Pixi.js](https://github.com/pixijs/pixijs).

## Current features

* **Video/Audio** trim and offset
* **Html & Image** rendering
* **Text** with multiple styles
* Web & Local **Fonts**
* **Custom Clips** based on Pixi.js
* **Filters**
* **Keyframe** animations
* **Numbers, Degrees and Colors**
* **Hardware accelerated** encoding via WebCodecs
* **Dynamic render resolution and framerate**

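The keyframe animations mentioned in the feature list can be illustrated with a generic linear-interpolation sketch (a conceptual example only, not the library's actual keyframe API):

```typescript
// Generic linear keyframe interpolation over numbers (an illustration of
// the concept, not the library's keyframe API): given (frame, value)
// pairs, sample the animated value at any frame.
type Keyframe = { frame: number; value: number };

function sample(keyframes: Keyframe[], frame: number): number {
  const sorted = [...keyframes].sort((a, b) => a.frame - b.frame);
  if (frame <= sorted[0].frame) return sorted[0].value;
  const last = sorted[sorted.length - 1];
  if (frame >= last.frame) return last.value;
  // Find the surrounding pair and interpolate linearly between them.
  for (let i = 0; i < sorted.length - 1; i++) {
    const a = sorted[i];
    const b = sorted[i + 1];
    if (frame >= a.frame && frame <= b.frame) {
      const t = (frame - a.frame) / (b.frame - a.frame);
      return a.value + t * (b.value - a.value);
    }
  }
  return last.value;
}

// Fade opacity in over 30 frames, hold, then fade out again.
const opacity: Keyframe[] = [
  { frame: 0, value: 0 },
  { frame: 30, value: 1 },
  { frame: 60, value: 1 },
  { frame: 90, value: 0 },
];

console.log(sample(opacity, 15)); // 0.5
console.log(sample(opacity, 75)); // 0.5
```

The same idea extends to degrees and colors by interpolating each channel separately.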
## Contributing
Contributions to Diffusion Studio are welcome and highly appreciated. Simply fork this repository and run:

```sh
npm install
```

Before checking in a pull request, please verify that all unit tests are still green by running:

```sh
npm run test
```

## Background

This project began in March 2023 with the mission of creating the "video processing toolkit for the era of AI." As someone passionate about video editing for over a decade, I saw Chrome’s release of WebCodecs and WebGPU without a feature flag as the perfect moment to build something new.

Currently, most browser-based video editors rely on server-side rendering, requiring time-consuming uploads and downloads of large video files. With WebCodecs, video processing can now be handled directly in the browser, making it faster and more efficient.

I’m excited to be part of the next generation of video editing technology.

## Compatibility

✅ Supported
| Mp3 |||
| Ogg |||
| Wav || N/A |
