RWKV

RWKV (Receptance Weighted Key Value, pronounced "RwaKuv") is an open-source artificial intelligence research project led by Bo Peng (also known by the pseudonym BlinkDL). It develops a distinctive non-attention architecture for large foundation models that combines characteristics of recurrent neural networks (RNNs) and Transformers. The project is organized as an open-research collaboration of volunteer contributors, coordinated by the RWKV Foundation with the Linux Foundation as an open-source-project hosting partner. As of April 2026, RWKV is one of the principal alternative-architecture open-research foundation-model projects, releasing open weights through Hugging Face and cooperating across institutions, including Chinese academic and industry partners.

At a glance

  • Founded: Initial RWKV architecture proposed 2021 by Bo Peng. RWKV Foundation established 2023.
  • Status: Open-research project. RWKV Foundation as principal coordination body. Linux Foundation hosting partner.
  • Funding: Volunteer-contributor and donor support. Selected commercial partners and Chinese strategic investor support through adjacent commercial entities.
  • Lead: Bo Peng (BlinkDL), Project Lead.
  • Other notable leadership: Senior research and engineering volunteers across the broader RWKV community.
  • Open weights: Yes. RWKV foundation models are released with open weights through Hugging Face.
  • Flagship outputs: RWKV-4, RWKV-5, RWKV-6 (Eagle), RWKV-7 (Goose) foundation-model variants; published research output on the RWKV architecture; the RWKV-LM training infrastructure.

Origins

RWKV was initially proposed in 2021 by Bo Peng (BlinkDL) as a distinctive non-attention architecture combining characteristics of recurrent neural networks and Transformers, with emphasis on an RNN that can be trained in parallel like a Transformer yet run as a constant-memory recurrence at inference time. The 2022 to 2023 surge of open-research interest in non-attention alternatives (Mamba, RetNet, RWKV, and others) drove growth of the RWKV research community.
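The parallel-trainable yet recurrent character the paragraph above describes can be illustrated with the "WKV" operator at the core of RWKV-4. The sketch below is a simplified single-channel toy in NumPy, not the project's implementation: `w` is a per-channel decay and `u` a current-token bonus (names follow the RWKV-4 paper), and the numerically stabilized exponentials used in real implementations are omitted. It computes the same output two ways: as a naive O(T²) weighted sum (the parallelizable training view) and as an O(T) recurrence over a two-number state (the RNN inference view).

```python
import numpy as np

def wkv_parallel(k, v, w, u):
    """Naive O(T^2) form of the RWKV-4 WKV operator for one channel.

    out[t] = (sum_{i<t} e^{-(t-1-i)w + k[i]} v[i] + e^{u + k[t]} v[t])
             / (same sum with v[i] replaced by 1      + e^{u + k[t]})
    """
    T = len(k)
    out = np.empty(T)
    for t in range(T):
        num, den = 0.0, 0.0
        for i in range(t):
            weight = np.exp(-(t - 1 - i) * w + k[i])
            num += weight * v[i]
            den += weight
        bonus = np.exp(u + k[t])
        out[t] = (num + bonus * v[t]) / (den + bonus)
    return out

def wkv_recurrent(k, v, w, u):
    """Same operator as an O(T) recurrence over state (a, b)."""
    a, b = 0.0, 0.0  # running numerator / denominator
    out = []
    for t in range(len(k)):
        bonus = np.exp(u + k[t])
        out.append((a + bonus * v[t]) / (b + bonus))
        decay = np.exp(-w)  # one step of exponential decay
        a = decay * a + np.exp(k[t]) * v[t]
        b = decay * b + np.exp(k[t])
    return np.array(out)
```

Both functions return identical outputs on the same inputs, which is the sense in which the architecture is "RNN-equivalent": training can use the parallel form across the whole sequence, while inference carries only the small recurrent state forward.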

The RWKV Foundation was founded in 2023 as the project's principal coordination body, alongside a Linux Foundation hosting partnership, giving the project an organizational footing. From 2023 to 2026 the architecture has iterated through RWKV-4, RWKV-5, RWKV-6 (Eagle), and RWKV-7 (Goose), each released with open weights.

Cross-institution research cooperation with Chinese academic and industry partners has supported the development of Chinese-language foundation-model variants from 2024 to 2026.

Mission and strategy

RWKV's mission is to advance the RWKV non-attention foundation-model architecture as an open-research alternative to attention-based architectures. The strategy combines two threads: first, iteration on RWKV variants with open-weights releases; second, open-research community coordination through the RWKV Foundation and Linux Foundation hosting.

Distribution channels include open-weights distribution through Hugging Face, the RWKV-LM training infrastructure on GitHub, and cross-institution research-cooperation.

Models and products

  • RWKV foundation-model variants. RWKV-4, RWKV-5, RWKV-6 (Eagle), RWKV-7 (Goose). Open-weights through Hugging Face.
  • RWKV-LM training infrastructure. Open-source training framework on GitHub.
  • Published research output. On the RWKV architecture and alternative non-attention foundation-model approaches.


Benchmarks and standing

RWKV's evaluation framework focuses on open-research output, alternative-architecture benchmark performance, and open-weights distribution metrics. The RWKV architecture has been characterized in alternative-architecture industry coverage as one of the principal non-attention foundation-model architectures alongside Mamba, RetNet, and other alternative-architecture research lines.

Leadership

As of April 2026, RWKV's project leadership includes:

  • Bo Peng (BlinkDL), Project Lead.
  • Senior research and engineering volunteers across the broader RWKV community.

Funding and backers

The project is supported by volunteer contributors and donors through the RWKV Foundation, with selected commercial partners and Chinese strategic investors backing adjacent commercial entities.

Industry position

RWKV occupies a distinctive position as one of the principal alternative-architecture open-research foundation-model projects, combining a non-attention architecture, RWKV Foundation coordination, a Linux Foundation hosting partnership, and open-weights releases.

Outlook

  • Continued cadence of RWKV variant iteration through 2026 to 2027.
  • Continued open-research community-coordination through the RWKV Foundation.
  • Continued cross-institution research-cooperation across Chinese academic and industry partners.
  • Continued open-research work on alternative, non-attention architectures.

About the author
Nextomoro

AI Research Lab Intelligence

nextomoro tracks progress for AI research labs, models, and what's next.