RWKV
RWKV (Receptance Weighted Key Value, pronounced "RwaKuv") is an open-source artificial intelligence research project led by Bo Peng (also known by the pseudonym BlinkDL). It develops a non-attention architecture for large foundation models that combines characteristics of recurrent neural networks (RNNs) and Transformers. The project is structured as an open-research collaboration of volunteer contributors, coordinated by the RWKV Foundation with the Linux Foundation as an open-source-project hosting partner. As of April 2026, RWKV is one of the principal alternative-architecture open-research foundation-model projects, with open-weights releases through Hugging Face and cross-institution research cooperation that includes Chinese academic and industry partners.
At a glance
- Founded: Initial RWKV architecture proposed in 2021 by Bo Peng; RWKV Foundation established in 2023.
- Status: Open-research project. The RWKV Foundation is the principal coordination body, with the Linux Foundation as hosting partner.
- Funding: Volunteer contributions and donor support, with selected commercial partners and Chinese strategic investors backing adjacent commercial entities.
- Lead: Bo Peng (BlinkDL), Project Lead.
- Other notable leadership: Senior research and engineering volunteers across the broader RWKV community.
- Open weights: Yes. RWKV foundation models are released with open weights through Hugging Face.
- Flagship outputs: RWKV-4, RWKV-5, RWKV-6 (Eagle), and RWKV-7 (Goose) foundation-model variants; published research on the RWKV architecture; the RWKV-LM training infrastructure.
Origins
RWKV was proposed in 2021 by Bo Peng (BlinkDL) as a non-attention architecture combining characteristics of recurrent neural networks and Transformers, with emphasis on an RNN-equivalent model that can be trained in parallel like a Transformer. Growing open-research interest in alternative non-attention architectures over 2022 to 2023 (Mamba, RetNet, RWKV, and others) helped the RWKV research community expand.
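The parallel-trainable claim is concrete: the WKV mixing at the heart of the architecture can be expanded as a decayed sum over past tokens (parallelizable across the sequence during training) and evaluated as a recurrence with constant-size state at inference. Below is a minimal sketch of that recurrence, assuming a simplified single-channel RWKV-4-style formulation; the function and variable names are illustrative, and production code adds a numerical-stability shift and vectorizes across channels.

```python
import numpy as np

def wkv(k, v, w, u):
    """Sequential (RNN-style) evaluation of a simplified single-channel
    WKV mixing. k, v: length-T arrays of per-token keys and values;
    w: learned positive decay rate; u: learned bonus for the current token.
    The state is two accumulators, so inference memory is constant in T,
    unlike attention's growing key-value cache."""
    num, den = 0.0, 0.0          # weighted sums of values / weights over the past
    out = np.empty(len(k))
    for t in range(len(k)):
        # the current token enters the output with an extra bonus u
        # before being folded into the recurrent state
        out[t] = (num + np.exp(u + k[t]) * v[t]) / (den + np.exp(u + k[t]))
        num = np.exp(-w) * num + np.exp(k[t]) * v[t]  # decay past, absorb token t
        den = np.exp(-w) * den + np.exp(k[t])
    return out

print(wkv(np.array([0.1, -0.4, 0.7]), np.array([1.0, 2.0, 3.0]), w=0.5, u=0.3))
```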
The 2023 founding of the RWKV Foundation as the principal coordination body, together with the Linux Foundation hosting partnership, put the project on a sustainable footing. From 2023 through 2026 the project has iterated through RWKV-4, RWKV-5, RWKV-6 (Eagle), and RWKV-7 (Goose), each released with open weights.
Cross-institution research cooperation with Chinese academic and industry partners has supported Chinese-language foundation-model variant development from 2024 through 2026.
Mission and strategy
RWKV's mission is to advance the RWKV non-attention foundation-model architecture as an open-research alternative to attention-based architectures. The strategy combines two threads: iterating RWKV variants with open-weights releases, and coordinating the open-research community through the RWKV Foundation and Linux Foundation hosting.
Distribution channels include open weights on Hugging Face, the RWKV-LM training infrastructure on GitHub, and cross-institution research cooperation.
Models and products
- RWKV foundation-model variants: RWKV-4, RWKV-5, RWKV-6 (Eagle), and RWKV-7 (Goose), released with open weights through Hugging Face (see the loading sketch after this list).
- RWKV-LM training infrastructure: open-source training framework on GitHub.
- Published research on the RWKV architecture and alternative non-attention foundation-model approaches.
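As a concrete illustration of the open-weights channel, a minimal loading sketch using the Hugging Face `transformers` library, which ships support for RWKV-4 checkpoints, appears below. The repo id is an assumption for illustration; newer variants (RWKV-5/6/7) may require custom model code or the standalone RWKV runtime, so the model card on the Hub is authoritative.

```python
# Hedged sketch: assumes the `transformers` RWKV-4 integration and the
# repo id below ("RWKV/rwkv-4-169m-pile" is illustrative; check the RWKV
# organization on the Hugging Face Hub for current checkpoint names).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RWKV/rwkv-4-169m-pile"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The RWKV architecture is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```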
Benchmarks and standing
RWKV's standing rests on its open-research output, benchmark performance among alternative architectures, and open-weights distribution metrics. Industry coverage of alternative architectures characterizes RWKV as one of the principal non-attention foundation-model architectures, alongside Mamba, RetNet, and other research lines.
Leadership
As of April 2026, RWKV's project leadership includes:
- Bo Peng (BlinkDL), Project Lead.
- Senior research and engineering volunteers across the broader RWKV community.
Funding and backers
RWKV is funded by volunteer contributions and donor support through the RWKV Foundation, with selected commercial partners and Chinese strategic investors backing adjacent commercial entities.
Industry position
RWKV occupies a distinctive position as one of the principal alternative-architecture open-research foundation-model projects, combining the RWKV non-attention architecture, RWKV Foundation coordination, the Linux Foundation hosting partnership, and open-weights releases.
Competitive landscape
- Mamba (Tri Dao, Albert Gu). Alternative-architecture peer with a similar non-attention focus.
- Together AI. AI compute platform peer hosting RWKV variants.
- Hugging Face. Open-research distribution partner.
- EleutherAI, Allen Institute for AI (Ai2), LAION, BigScience, BigCode Project. Open-research peers with different architecture-family focuses.
- Chinese academic and industry partners. Cooperation includes BAAI, Shanghai AI Laboratory, and other Chinese AI research bodies.
Outlook
- Continued cadence of RWKV variant iteration through 2026 and 2027.
- Continued open-research community coordination through the RWKV Foundation.
- Continued cross-institution research cooperation with Chinese academic and industry partners.
- Continued development along the open-research, alternative-architecture trajectory.
Sources
- RWKV official site. Project reference.
- RWKV-LM on GitHub. Open-source training infrastructure.
- RWKV on Hugging Face. Open-weights model distribution.
- Bo Peng (BlinkDL) GitHub. Project lead reference.
- Linux Foundation. Hosting partner.