OpenBMB / ModelBest
OpenBMB (开源大模型) is a Chinese open-source artificial-intelligence research initiative affiliated with the Tsinghua University Natural Language Processing Lab (THUNLP) and Tsinghua KEG. Its best-known research output is the MiniCPM family of efficient on-device foundation models. ModelBest (面壁智能) is the affiliated commercial company, headquartered in Beijing and founded in August 2022 by Liu Zhiyuan (a Tsinghua professor and one of China's principal NLP researchers) together with other THUNLP affiliates; it commercializes the MiniCPM line and related foundation-model research for a Chinese commercial customer base. As of April 2026, OpenBMB / ModelBest is one of the principal Chinese open-research foundation-model labs, with MiniCPM positioned as China's leading efficient on-device foundation-model family.
At a glance
- Founded: OpenBMB initiative founded 2021 at Tsinghua University; ModelBest commercial company founded August 2022.
- Status: OpenBMB nonprofit research initiative; ModelBest private commercial company.
- Funding: ModelBest has raised cumulative private capital from backers including Hillhouse Capital, Rongqun Investment, CITIC Securities, and other Chinese strategic investors.
- CEO: Li Dahai, Co-Founder and Chief Executive Officer of ModelBest.
- Other notable leadership: Liu Zhiyuan, Co-Founder and Chief Scientist; Tsinghua professor.
- Open weights: Yes. The MiniCPM family is released with open weights on Hugging Face.
- Flagship outputs: the MiniCPM family (MiniCPM-1B and MiniCPM-2B language models; MiniCPM-V multimodal variants, including MiniCPM-V-2.6, an efficient on-device multimodal model) and open-source research output.
Origins
OpenBMB was founded in 2021 as a Tsinghua University-affiliated open-source AI research initiative, with faculty participation from the Tsinghua NLP Lab (THUNLP) and Tsinghua KEG. Its emphasis on open-source large-foundation-model research established it early as a center of Chinese open-research work.
ModelBest, the commercial company affiliated with the OpenBMB initiative, was founded in August 2022 by Liu Zhiyuan and other THUNLP affiliates, with backing from Chinese strategic investors including Hillhouse Capital, Rongqun Investment, and CITIC Securities. The MiniCPM line, with its explicit emphasis on efficient on-device capability, anchors the company's product positioning.
The 2024 to 2026 period has seen continued MiniCPM iteration: MiniCPM-V-2.6, an efficient on-device multimodal model, drew particular commercial attention, while open-weights distribution and Chinese commercial customer expansion continued in parallel.
Mission and strategy
OpenBMB / ModelBest's mission is to advance efficient on-device foundation-model research with open-weights distribution. The strategy combines two threads: the MiniCPM efficient on-device foundation-model line, and Chinese commercial customer expansion across enterprise and consumer-electronics applications.
Distribution channels include open-weights releases on Hugging Face, the OpenBMB GitHub organization, and ModelBest's Chinese commercial customer relationships.
Models and products
- MiniCPM family. Efficient on-device foundation models, including MiniCPM-1B, MiniCPM-2B, and the MiniCPM-V multimodal variants (most recently MiniCPM-V-2.6).
- OpenBMB open-source research output. Open-source AI research tooling and models distributed through GitHub.
- ModelBest commercial products. Offerings built on the MiniCPM line for the Chinese commercial customer base.
Benchmarks and standing
OpenBMB / ModelBest's evaluation emphasis falls on efficient on-device foundation-model benchmarks (including parameter-efficiency metrics), open-weights research output, and Chinese commercial customer adoption. Industry coverage of efficient foundation models has consistently characterized MiniCPM as one of the principal efficient on-device model families from China.
Leadership
As of April 2026, OpenBMB / ModelBest's senior leadership includes:
- Li Dahai, Co-Founder and Chief Executive Officer of ModelBest.
- Liu Zhiyuan, Co-Founder and Chief Scientist. Tsinghua professor.
- Senior research and engineering staff across the MiniCPM and OpenBMB programs.
Funding and backers
ModelBest has raised cumulative private capital. Backers include Hillhouse Capital, Rongqun Investment, CITIC Securities, and other Chinese strategic investors.
Industry position
OpenBMB / ModelBest occupies a distinctive position as one of the principal Chinese open-research foundation-model labs, combining the MiniCPM efficient on-device foundation-model line, academic-collaboration ties with Tsinghua University, and a Chinese commercial customer base.
Competitive landscape
- Tsinghua KEG, Tsinghua IIIS. Tsinghua-affiliated peer initiatives with which OpenBMB cooperates.
- BAAI, Shanghai AI Laboratory. Chinese government-backed AI research peers.
- DeepSeek, Z.AI, Moonshot AI, MiniMax, Stepfun. Chinese commercial AI startup peers.
- Hugging Face. Open-research distribution partner.
- Allen Institute for AI (Ai2), EleutherAI, LAION. Non-Chinese open-research peers.
Outlook
- Continued cadence of MiniCPM iteration through 2026 to 2027.
- Continued Chinese commercial customer expansion.
- Continued cooperation through the Tsinghua affiliation.
Sources
- OpenBMB on GitHub. Open-source initiative.
- MiniCPM family. Efficient on-device foundation models.
- OpenBMB on Hugging Face. Open-weights model distribution.
- Tsinghua University NLP Lab (THUNLP). Affiliated academic research lab.
- ModelBest official site. Commercial company.