
Commit 41698fa (1 parent: 20aba1d)

[MoE] A Survey on Mixture of Experts(@hku)

1 file changed: README.md (1 addition, 1 deletion)

@@ -324,7 +324,7 @@ Awesome-LLM-Inference: A curated list of [📙Awesome LLM Inference Papers with
 |2024.01| [MoE-Mamba] MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts(@uw.edu.pl) | [[pdf]](https://arxiv.org/pdf/2401.04081.pdf)| ⚠️ |⭐️|
 |2024.04| [MoE Inference] Toward Inference-optimal Mixture-of-Expert Large Language Models(@UC San Diego etc)| [[pdf]](https://arxiv.org/pdf/2404.02852.pdf)| ⚠️ |⭐️|
 |2024.05| 🔥🔥🔥[DeepSeek-V2] DeepSeek-V2: A Strong, Economical, and Efficient Mixture-of-Experts Language Model(@DeepSeek-AI)|[[pdf]](https://arxiv.org/pdf/2405.04434) | [[DeepSeek-V2]](https://github.com/deepseek-ai/DeepSeek-V2) ![](https://img.shields.io/github/stars/deepseek-ai/DeepSeek-V2.svg?style=social)| ⭐️⭐️ |
-
+|2024.06| [MoE] A Survey on Mixture of Experts(@HKU) | [[pdf]](https://arxiv.org/pdf/2407.06204)| ⚠️ |⭐️|


 ### 📖CPU/Single GPU/FPGA/Mobile Inference ([©️back👆🏻](#paperlist))
