# Getting Started
Diffusion Studio is an open-source, browser-based video editing library that allows developers to automate video editing workflows at scale, build custom editing applications, or seamlessly integrate video processing capabilities into existing projects.
## Documentation
Visit https://docs.diffusion.studio to view the full documentation.
## Why Use Diffusion Studio
💻 100% **client-side**<br/>
📦 Fully **extensible** with [Pixi.js](https://pixijs.com/)<br/>
🩸 Blazingly **fast** WebGPU/WebGL renderer<br/>
🏎️ **Cutting edge** WebCodecs export<br/>
## Getting Started
```sh
npm install @diffusionstudio/core
```
## Basic Usage
Let's take a look at an example:
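The example itself is not reproduced here, so the following is an illustrative sketch only: apart from `composition.add(clip)`, which this README uses elsewhere, every class and method name (`Composition`, `VideoClip`, `VideoSource.from`) is an assumption and may differ from the actual API — see the documentation for the real example.

```typescript
// Illustrative sketch only — names other than `composition.add(clip)`
// are assumptions, not the verified Diffusion Studio API.
import * as core from '@diffusionstudio/core';

// Load a video file (hypothetical loader) ...
const source = await core.VideoSource.from('/sample.mp4');

// ... create a composition and place a clip on it.
const composition = new core.Composition();
const clip = new core.VideoClip(source);
await composition.add(clip); // `composition.add(clip)` appears later in this README
```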
Whereas each track contains zero or more clips of a single type in ascending chronological order.
A track will be created implicitly with `composition.add(clip)`; however, you can also create them manually like this:
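As a hedged sketch of what manual track creation might look like — the `createTrack` name and its `'video'` argument are assumptions not confirmed by this README, so check the documentation for the real call:

```typescript
// Sketch only — `createTrack('video')` is an assumed name, not a verified API.
const track = composition.createTrack('video'); // create an empty video track
await track.add(clip); // then add clips to that specific track
```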
## How does Diffusion Studio compare to Remotion and Motion Canvas?
**Remotion** is a React-based video creation tool that transforms the entire DOM into videos. It's particularly suited for beginners, as web developers can start creating videos using the skills they already have.
**Motion Canvas** uses a Canvas 2D implementation for rendering. It is intended as a standalone editor for creating production-quality animations. It features a unique imperative API that adds elements to the timeline procedurally, rather than relying on keyframes like traditional video editing tools. This makes Motion Canvas ideal for crafting detailed, animated videos.
In contrast, **Diffusion Studio** is not a framework with a visual editing interface but a video editing library that can be integrated into existing projects. It operates entirely on the client side, eliminating the need for additional backend infrastructure. Diffusion Studio is also dedicated to supporting the latest rendering technologies, including WebGPU, WebGL, and WebCodecs. If a feature you need isn't available, you can easily extend it using [Pixi.js](https://github.com/pixijs/pixijs).
## Current Features
* **Video/Audio** trim and offset
* **Html & Image** rendering
* **Text** with multiple styles
* Web & Local **Fonts**
* **Custom Clips** based on Pixi.js
* **Filters**
* **Keyframe** animations
* **Numbers, Degrees and Colors**
* **Hardware accelerated** encoding via WebCodecs
* **Dynamic render resolution and framerate**
## Contributing
Contributions to Diffusion Studio are welcome and highly appreciated. Simply fork this repository and run:
```sh
npm install
```
Before submitting a pull request, please verify that all unit tests are still green by running:
```sh
npm run test
```
## Background
This project began in March 2023 with the mission of creating the "video processing toolkit for the era of AI." As someone passionate about video editing for over a decade, I saw Chrome’s release of WebCodecs and WebGPU without a feature flag as the perfect moment to build something new.
Currently, most browser-based video editors rely on server-side rendering, requiring time-consuming uploads and downloads of large video files. With WebCodecs, video processing can now be handled directly in the browser, making it faster and more efficient.
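Because WebCodecs is not yet available in every browser, client-side editors typically feature-detect it before choosing an export path. A minimal check using only the standard WebCodecs globals (no Diffusion Studio API involved):

```typescript
// Feature-detect the WebCodecs API before choosing a render/export path.
// `VideoEncoder` and `VideoDecoder` are the standard WebCodecs globals;
// in runtimes without WebCodecs (e.g. most Node.js versions) this returns false.
function hasWebCodecs(
  scope: Record<string, unknown> = globalThis as unknown as Record<string, unknown>
): boolean {
  return typeof scope.VideoEncoder === 'function'
      && typeof scope.VideoDecoder === 'function';
}
```

An application could fall back to a preview-only mode, or to server-side export, when this returns `false`.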
I’m excited to be part of the next generation of video editing technology.
## Compatibility
✅ Supported
| Mp3 | ✅ | ❌ |
| Ogg | ✅ | ❌ |
| Wav | ✅ | N/A |
0 commit comments