Add RWKV7 #2457

@pass-lin

Description

The RWKV family of LLMs is currently one of the most popular pure-RNN large language models. Compared with mainstream Transformer-based models such as Qwen and LLaMA, RWKV offers constant—and significantly lower—computational and memory costs, making it especially suitable for on-device deployment.
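To illustrate the constant-cost claim: a pure-RNN decoder carries a fixed-size state from token to token, whereas a Transformer's KV cache grows linearly with sequence length. Below is a minimal, illustrative NumPy sketch of a generic linear-recurrence update of this kind; it is not RWKV-7's actual kernel, and the variable names, decay values, and head dimension are placeholders.

```python
import numpy as np

d = 8  # head dimension (illustrative, not RWKV-7's real size)
rng = np.random.default_rng(0)

# Fixed-size recurrent state: O(d^2) memory, independent of sequence length.
S = np.zeros((d, d))
# Per-channel decay, a stand-in for a learned time-decay vector.
w = 0.9 * np.ones(d)

for t in range(1000):  # process 1000 tokens; state size never grows
    k = rng.standard_normal(d)  # key for this token
    v = rng.standard_normal(d)  # value for this token
    q = rng.standard_normal(d)  # query/receptance for this token
    # Decay the old state along the key dimension, then add the new
    # outer-product association v k^T -- a generic linear-attention step.
    S = S * w[None, :] + np.outer(v, k)
    # Per-token readout: contract the state with the query.
    o = S @ q

# After 1000 tokens the state is still a single d x d matrix,
# unlike a KV cache, which would hold 1000 key/value pairs.
```

The point of the sketch is only the memory profile: each token touches the same `d × d` state, so inference cost per token is constant, which is what makes RWKV-style models attractive for on-device deployment.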

From February 26, 2025 to today, RWKV-LM has gained 1,000 GitHub stars (from 13.1k to 14.1k), and 100 of those were added between October 16 and today. You can visualize this growth trend on Star History.

BTW, the Linux Foundation also highlights RWKV here, and GitHub now hosts more than 600 ecosystem projects around RWKV: search results.

RWKV has already been adopted by Microsoft Office as an on-device solution (you can find rwkv.cpp binaries shipped with Office) and by China’s State Grid Corporation and China Mobile—one of the country’s three major telecom carriers—as their edge-side AI engine.

Among all RWKV variants, RWKV-7 is the most advanced and best-performing model. We hope to integrate it into Keras Hub to make it even more researcher-friendly.
