Commit 1cf82e5 — "update readme" (1 parent: bbe1658)
3 files changed: +32 additions, −59 deletions

README.md — 10 additions, 19 deletions

@@ -13,31 +13,17 @@
 [![Doc](https://img.shields.io/badge/docs-English-99cc2)](https://llmc-en.readthedocs.io/en/latest/)
 [![Doc](https://img.shields.io/badge/文档-中文-99cc2)](https://llmc-zhcn.readthedocs.io/en/latest/)
 
-</div>
-
 **\[ English | [中文](README_zh.md) | [日本語](README_ja.md) \]**
 
-**LLMC** is an off-the-shell tool designed for compressing LLM, leveraging state-of-the-art compression algorithms to enhance efficiency and reduce model size without compromising performance.
-
-**English doc** is [here](https://llmc-en.readthedocs.io/en/latest/).
-
-**Chinese doc** is [here](https://llmc-zhcn.readthedocs.io/en/latest/).
-
-**Docker hub** is [here](https://hub.docker.com/r/llmcompression/llmc).
-
-**Aliyun docker**: `registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:[tag]`
-
-You can download the Docker image that can run llmc with the following command. Users in mainland China are recommended to use Alibaba Cloud Docker.
+</div>
 
-docker hub
+**LLMC** is an off-the-shelf tool designed for compressing LLMs, leveraging state-of-the-art compression algorithms to enhance efficiency and reduce model size without compromising performance. You can download a Docker image that runs llmc with the commands below; users in mainland China are advised to use the Aliyun registry.
 
-```
+```shell
+# docker hub: https://hub.docker.com/r/llmcompression/llmc
 docker pull llmcompression/llmc:pure-latest
-```
-
-aliyun docker
 
-```
+# aliyun docker: registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:[tag]
 docker pull registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:pure-latest
 ```
 
@@ -46,6 +32,11 @@
 - [Discord Server](https://discord.com/invite/NfJzbkK3jY)
 - [Tencent QQ Group](http://qm.qq.com/cgi-bin/qm/qr?_wv=1027&k=I9IGPWWj8uuRXWH3_ELWjouf6gkIMgUl&authKey=GA3WbFAsm90ePJf%2FCbc7ZyXXq4ShQktlBaLxgqS5yuSPAsr3%2BDKMRdosUiLYoilO&noverify=0&group_code=526192592)
 
+**Docs**:
+
+- [English](https://llmc-en.readthedocs.io/en/latest/)
+- [Chinese](https://llmc-zhcn.readthedocs.io/en/latest/)
+
 ## Latest News
 
 - **May 12, 2025:** 🔥 We now fully support quantization for the **`Wan2.1`** series of video generation models and provide export of truly quantized **INT8/FP8** weights, compatible with the [lightx2v](https://github.com/ModelTC/lightx2v) inference framework. For details, please refer to the [lightx2v documentation](https://llmc-en.readthedocs.io/en/latest/backend/lightx2v.html).
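The commit folds both registries into a single fenced block in the README. As a usage sketch, the regional choice could be scripted like this (the `LLMC_REGION` switch is an illustrative assumption, not part of the commit or of any llmc tooling):

```shell
# Select the llmc image by region: the Aliyun mirror is recommended
# for users in mainland China, Docker Hub elsewhere.
LLMC_REGION="${LLMC_REGION:-global}"   # illustrative: set LLMC_REGION=cn for the Aliyun mirror

if [ "$LLMC_REGION" = "cn" ]; then
    IMAGE="registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:pure-latest"
else
    IMAGE="llmcompression/llmc:pure-latest"
fi

echo "image: $IMAGE"
# docker pull "$IMAGE"   # requires a local Docker daemon; uncomment to pull
```

Both tags point at the same `pure-latest` build, so the only difference is which registry serves the layers.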

README_ja.md — 11 additions, 20 deletions

@@ -13,31 +13,17 @@
 [![Doc](https://img.shields.io/badge/docs-English-99cc2)](https://llmc-en.readthedocs.io/en/latest/)
 [![Doc](https://img.shields.io/badge/文档-中文-99cc2)](https://llmc-zhcn.readthedocs.io/en/latest/)
 
-</div>
-
-**\[ English | [中文](README_zh.md) | [日本語](README_ja.md) \]**
-
-**LLMC** is a tool for compressing large language models (LLMs), leveraging state-of-the-art compression algorithms to improve efficiency and reduce model size without compromising performance.
-
-**English documentation** is [here](https://llmc-en.readthedocs.io/en/latest/).
-
-**Chinese documentation** is [here](https://llmc-zhcn.readthedocs.io/en/latest/).
-
-**Docker Hub** is [here](https://hub.docker.com/r/llmcompression/llmc).
-
-**Aliyun docker**: `registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:[tag]`
+**\[ [English](README.md) | [中文](README_zh.md) | 日本語 \]**
 
-You can download a Docker image that runs llmc with the following commands. Users in mainland China are advised to use the Aliyun registry.
+</div>
 
-docker hub
+**LLMC** is a tool for compressing large language models (LLMs), leveraging state-of-the-art compression algorithms to improve efficiency and reduce model size without compromising performance. You can download a Docker image that runs llmc with the commands below; users in mainland China are advised to use the Aliyun registry.
 
-```
+```shell
+# docker hub: https://hub.docker.com/r/llmcompression/llmc
 docker pull llmcompression/llmc:pure-latest
-```
-
-Aliyun Docker
 
-```
+# aliyun docker: registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:[tag]
 docker pull registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:pure-latest
 ```
 
@@ -46,6 +32,11 @@
 - [Discord Server](https://discord.com/invite/NfJzbkK3jY)
 - [Tencent QQ Group](http://qm.qq.com/cgi-bin/qm/qr?_wv=1027&k=I9IGPWWj8uuRXWH3_ELWjouf6gkIMgUl&authKey=GA3WbFAsm90ePJf%2FCbc7ZyXXq4ShQktlBaLxgqS5yuSPAsr3%2BDKMRdosUiLYoilO&noverify=0&group_code=526192592)
 
+**Docs**:
+
+- [English](https://llmc-en.readthedocs.io/en/latest/)
+- [Chinese](https://llmc-zhcn.readthedocs.io/en/latest/)
+
 ## Latest News
 
 - **May 12, 2025:** 🔥 Quantization for the **`Wan2.1`** series of video generation models is now fully supported, including export of truly quantized **INT8/FP8** weights compatible with the [lightx2v](https://github.com/ModelTC/lightx2v) inference framework. See the [lightx2v documentation](https://llmc-en.readthedocs.io/en/latest/backend/lightx2v.html) for details.

README_zh.md — 11 additions, 20 deletions

@@ -13,31 +13,17 @@
 [![Doc](https://img.shields.io/badge/docs-English-99cc2)](https://llmc-en.readthedocs.io/en/latest/)
 [![Doc](https://img.shields.io/badge/文档-中文-99cc2)](https://llmc-zhcn.readthedocs.io/en/latest/)
 
-</div>
-
-**\[ English | [中文](README_zh.md) | [日本語](README_ja.md) \]**
-
-**LLMC** is an out-of-the-box tool designed for compressing LLMs, using state-of-the-art compression algorithms to improve efficiency and reduce model size without degrading prediction accuracy.
-
-**English documentation** is [here](https://llmc-en.readthedocs.io/en/latest/).
-
-**Chinese documentation** is [here](https://llmc-zhcn.readthedocs.io/en/latest/).
-
-**Docker Hub** is [here](https://hub.docker.com/r/llmcompression/llmc).
-
-**Aliyun docker**: `registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:[tag]`
+**\[ [English](README.md) | 中文 | [日本語](README_ja.md) \]**
 
-You can download a Docker image that runs llmc with the following commands. Users in mainland China are advised to use the Aliyun registry.
+</div>
 
-docker hub
+**LLMC** is an out-of-the-box tool designed for compressing LLMs, using state-of-the-art compression algorithms to improve efficiency and reduce model size without degrading prediction accuracy. You can download a Docker image that runs llmc with the commands below; users in mainland China are advised to use the Aliyun registry.
 
-```
+```shell
+# docker hub: https://hub.docker.com/r/llmcompression/llmc
 docker pull llmcompression/llmc:pure-latest
-```
-
-Aliyun Docker
 
-```
+# aliyun docker: registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:[tag]
 docker pull registry.cn-hangzhou.aliyuncs.com/yongyang/llmcompression:pure-latest
 ```
 
@@ -46,6 +32,11 @@
 - [Discord Server](https://discord.com/invite/NfJzbkK3jY)
 - [Tencent QQ Group](http://qm.qq.com/cgi-bin/qm/qr?_wv=1027&k=I9IGPWWj8uuRXWH3_ELWjouf6gkIMgUl&authKey=GA3WbFAsm90ePJf%2FCbc7ZyXXq4ShQktlBaLxgqS5yuSPAsr3%2BDKMRdosUiLYoilO&noverify=0&group_code=526192592)
 
+**Docs**:
+
+- [English](https://llmc-en.readthedocs.io/en/latest/)
+- [Chinese](https://llmc-zhcn.readthedocs.io/en/latest/)
+
 ## Latest News
 
 - **May 12, 2025:** 🔥 We now fully support quantization of the **`Wan2.1`** series of video generation models, with export of truly quantized **INT8/FP8** weights compatible with the [lightx2v](https://github.com/ModelTC/lightx2v) inference framework. See the [lightx2v documentation](https://llmc-zhcn.readthedocs.io/en/latest/backend/lightx2v.html) for details.
