
Commit d4a32a4

Merge branch 'main' into add-vslice
2 parents 4acd791 + 1768b8b commit d4a32a4


60 files changed: +7951, -4467 lines

.github/FUNDING.yml

Lines changed: 0 additions & 3 deletions
This file was deleted.

.github/workflows/documentation.yml

Lines changed: 0 additions & 1 deletion
@@ -10,7 +10,6 @@ jobs:
   build:
     uses: huggingface/doc-builder/.github/workflows/build_main_documentation.yml@main
     with:
-      repo_owner: xenova
       commit_sha: ${{ github.sha }}
       package: transformers.js
       path_to_docs: transformers.js/docs/source

.github/workflows/pr-documentation.yml

Lines changed: 0 additions & 1 deletion
@@ -11,7 +11,6 @@ jobs:
   build:
     uses: huggingface/doc-builder/.github/workflows/build_pr_documentation.yml@main
     with:
-      repo_owner: xenova
       commit_sha: ${{ github.sha }}
       pr_number: ${{ github.event.number }}
       package: transformers.js

.github/workflows/tests.yml

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ jobs:
 
     strategy:
       matrix:
-        node-version: [18.x, latest, node]
+        node-version: [18, 20, 22]
 
     steps:
       - uses: actions/checkout@v4

.gitignore

Lines changed: 1 addition & 0 deletions
@@ -2,6 +2,7 @@ __pycache__
 .vscode
 node_modules
 .cache
+.DS_STORE
 
 # Do not track build artifacts/generated files
 /dist

README.md

Lines changed: 74 additions & 48 deletions
Large diffs are not rendered by default.

docs/scripts/build_readme.py

Lines changed: 10 additions & 20 deletions
@@ -13,33 +13,23 @@
 </p>
 
 <p align="center">
-    <a href="https://www.npmjs.com/package/@huggingface/transformers">
-        <img alt="NPM" src="https://img.shields.io/npm/v/@huggingface/transformers">
-    </a>
-    <a href="https://www.npmjs.com/package/@huggingface/transformers">
-        <img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@huggingface/transformers">
-    </a>
-    <a href="https://www.jsdelivr.com/package/npm/@huggingface/transformers">
-        <img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@huggingface/transformers">
-    </a>
-    <a href="https://github.com/xenova/transformers.js/blob/main/LICENSE">
-        <img alt="License" src="https://img.shields.io/github/license/xenova/transformers.js?color=blue">
-    </a>
-    <a href="https://huggingface.co/docs/transformers.js/index">
-        <img alt="Documentation" src="https://img.shields.io/website/http/huggingface.co/docs/transformers.js/index.svg?down_color=red&down_message=offline&up_message=online">
-    </a>
+    <a href="https://www.npmjs.com/package/@huggingface/transformers"><img alt="NPM" src="https://img.shields.io/npm/v/@huggingface/transformers"></a>
+    <a href="https://www.npmjs.com/package/@huggingface/transformers"><img alt="NPM Downloads" src="https://img.shields.io/npm/dw/@huggingface/transformers"></a>
+    <a href="https://www.jsdelivr.com/package/npm/@huggingface/transformers"><img alt="jsDelivr Hits" src="https://img.shields.io/jsdelivr/npm/hw/@huggingface/transformers"></a>
+    <a href="https://github.com/huggingface/transformers.js/blob/main/LICENSE"><img alt="License" src="https://img.shields.io/github/license/huggingface/transformers.js?color=blue"></a>
+    <a href="https://huggingface.co/docs/transformers.js/index"><img alt="Documentation" src="https://img.shields.io/website/http/huggingface.co/docs/transformers.js/index.svg?down_color=red&down_message=offline&up_message=online"></a>
 </p>
 
 {intro}
 
-## Quick tour
-
-{quick_tour}
-
 ## Installation
 
 {installation}
 
+## Quick tour
+
+{quick_tour}
+
 ## Examples
 
 {examples}
@@ -52,7 +42,7 @@
 
 Here is the list of all tasks and architectures currently supported by Transformers.js.
 If you don't see your task/model listed here or it is not yet supported, feel free
-to open up a feature request [here](https://github.com/xenova/transformers.js/issues/new/choose).
+to open up a feature request [here](https://github.com/huggingface/transformers.js/issues/new/choose).
 
 To find compatible models on the Hub, select the "transformers.js" library tag in the filter menu (or visit [this link](https://huggingface.co/models?library=transformers.js)).
 You can refine your search by selecting the task you're interested in (e.g., [text-classification](https://huggingface.co/models?pipeline_tag=text-classification&library=transformers.js)).

docs/snippets/0_introduction.snippet

Lines changed: 5 additions & 1 deletion
@@ -1,5 +1,9 @@
 
-State-of-the-art Machine Learning for the web. Run 🤗 Transformers directly in your browser, with no need for a server!
+<h3 align="center">
+    <p>State-of-the-art Machine Learning for the Web</p>
+</h3>
+
+Run 🤗 Transformers directly in your browser, with no need for a server!
 
 Transformers.js is designed to be functionally equivalent to Hugging Face's [transformers](https://github.com/huggingface/transformers) python library, meaning you can run the same pretrained models using a very similar API. These models support common tasks in different modalities, such as:
 - 📝 **Natural Language Processing**: text classification, named entity recognition, question answering, language modeling, summarization, translation, multiple choice, and text generation.

docs/snippets/1_quick-tour.snippet

Lines changed: 30 additions & 3 deletions
@@ -26,9 +26,9 @@ out = pipe('I love transformers!')
 import { pipeline } from '@huggingface/transformers';
 
 // Allocate a pipeline for sentiment-analysis
-let pipe = await pipeline('sentiment-analysis');
+const pipe = await pipeline('sentiment-analysis');
 
-let out = await pipe('I love transformers!');
+const out = await pipe('I love transformers!');
 // [{'label': 'POSITIVE', 'score': 0.999817686}]
 ```
 
@@ -40,5 +40,32 @@ let out = await pipe('I love transformers!');
 You can also use a different model by specifying the model id or path as the second argument to the `pipeline` function. For example:
 ```javascript
 // Use a different model for sentiment-analysis
-let pipe = await pipeline('sentiment-analysis', 'Xenova/bert-base-multilingual-uncased-sentiment');
+const pipe = await pipeline('sentiment-analysis', 'Xenova/bert-base-multilingual-uncased-sentiment');
+```
+
+By default, when running in the browser, the model will be run on your CPU (via WASM). If you would like
+to run the model on your GPU (via WebGPU), you can do this by setting `device: 'webgpu'`, for example:
+```javascript
+// Run the model on WebGPU
+const pipe = await pipeline('sentiment-analysis', 'Xenova/distilbert-base-uncased-finetuned-sst-2-english', {
+  device: 'webgpu',
+});
+```
+
+For more information, check out the [WebGPU guide](/guides/webgpu).
+
+> [!WARNING]
+> The WebGPU API is still experimental in many browsers, so if you run into any issues,
+> please file a [bug report](https://github.com/huggingface/transformers.js/issues/new?title=%5BWebGPU%5D%20Error%20running%20MODEL_ID_GOES_HERE&assignees=&labels=bug,webgpu&projects=&template=1_bug-report.yml).
+
+In resource-constrained environments, such as web browsers, it is advisable to use a quantized version of
+the model to lower bandwidth and optimize performance. This can be achieved by adjusting the `dtype` option,
+which allows you to select the appropriate data type for your model. While the available options may vary
+depending on the specific model, typical choices include `"fp32"` (default for WebGPU), `"fp16"`, `"q8"`
+(default for WASM), and `"q4"`. For more information, check out the [quantization guide](../guides/dtypes).
+```javascript
+// Run the model at 4-bit quantization
+const pipe = await pipeline('sentiment-analysis', 'Xenova/distilbert-base-uncased-finetuned-sst-2-english', {
+  dtype: 'q4',
+});
 ```
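
As a side note, the two options introduced by this snippet can be combined in a single `pipeline()` call. Below is a minimal, illustrative sketch (not part of this diff), assuming the same model id as above and that `device` and `dtype` are accepted together:

```javascript
import { pipeline } from '@huggingface/transformers';

// Illustrative only: run on the GPU via WebGPU *and* use 4-bit quantized weights.
// Option support can vary by model and browser.
const pipe = await pipeline('sentiment-analysis', 'Xenova/distilbert-base-uncased-finetuned-sst-2-english', {
  device: 'webgpu', // use the WebGPU backend instead of the default WASM (CPU) backend
  dtype: 'q4',      // 4-bit weights to reduce download size and memory use
});

const out = await pipe('I love transformers!');
console.log(out); // e.g. [{ label: 'POSITIVE', score: 0.99... }]
```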

docs/snippets/2_installation.snippet

Lines changed: 1 addition & 1 deletion
@@ -7,6 +7,6 @@ npm i @huggingface/transformers
 Alternatively, you can use it in vanilla JS, without any bundler, by using a CDN or static hosting. For example, using [ES Modules](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules), you can import the library with:
 ```html
 <script type="module">
-  import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.0-alpha.20';
+  import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers@3.0.2';
 </script>
 ```
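
Tying the installation snippet above to the quick tour, here is a minimal, illustrative sketch of in-browser usage via the CDN (not part of this commit). It assumes the default model for the task and uses an unpinned jsDelivr URL, whereas the snippet above pins an exact version:

```javascript
// Place this inside a <script type="module"> tag, as in the snippet above.
// The unpinned URL resolves to the latest published release; pinning a version is safer for production.
import { pipeline } from 'https://cdn.jsdelivr.net/npm/@huggingface/transformers';

// Allocate a sentiment-analysis pipeline with the default model for the task.
const classifier = await pipeline('sentiment-analysis');

const result = await classifier('Transformers.js runs entirely in the browser!');
console.log(result); // e.g. [{ label: 'POSITIVE', score: 0.99... }]
```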
