Preferred Networks
Preferred Networks (PFN) is an artificial intelligence and machine-learning company headquartered in Tokyo, Japan, founded in March 2014 by Toru Nishikawa and Daisuke Okanohara. It develops the PLaMo family of Japanese-language foundation models, the PFN-Hub open-weights distribution, the MN-Core custom AI accelerator chips, and industrial AI deployments with Toyota (autonomous driving and personal mobility), Hitachi, Fanuc, and other Japanese industrial partners. As of April 2026, PFN is one of the principal Japanese AI companies, with backing from Toyota, Mitsui & Co., NTT, and Hitachi, and an industrial-AI commercial position that complements the open-research output of Sakana AI in Japan.
At a glance
- Founded: March 2014 in Tokyo, Japan, by Toru Nishikawa and Daisuke Okanohara as a spinoff from Preferred Infrastructure (a search-engine and natural-language-processing company founded in 2006 by Okanohara, Nishikawa, and other University of Tokyo researchers).
- Status: Private, with a Japanese strategic-investor base.
- Funding: More than $200 million in cumulative private capital, with strategic commitments from Toyota Motor Corporation (the principal strategic investor), Mitsui & Co., Hitachi, NTT, JXTG (now ENEOS), Fanuc, and other investors.
- CEO: Toru Nishikawa, Co-Founder and Chief Executive Officer.
- Other notable leadership: Daisuke Okanohara, Co-Founder and Executive Vice President; natural-language-processing and machine-learning researcher.
- Open weights: Partial. Selected PLaMo variants and other research outputs are released open-weights through Hugging Face. The Chainer deep-learning framework was open-source; PFN transitioned its principal deep-learning framework from Chainer to PyTorch in 2019.
- Flagship products: PLaMo (Japanese-language foundation model family, including PLaMo Lite for edge deployments), MN-Core (custom AI accelerator chips), PFN-Hub (open-weights model distribution), the open-source Chainer deep-learning framework (now superseded), and industrial AI deployments with Toyota, Hitachi, Fanuc, and other Japanese industrial partners.
Origins
Preferred Networks was founded in March 2014 in Tokyo by Toru Nishikawa and Daisuke Okanohara as a spinoff from Preferred Infrastructure, the search-engine and natural-language-processing company that Okanohara, Nishikawa, and other University of Tokyo researchers had founded in 2006. The company's founding mandate was to commercialize machine-learning and deep-learning research for Japanese industrial customers, with early commercial focus on the Toyota partnership for autonomous driving and other industrial-AI applications.
The 2014 to 2018 founding period established PFN's research reputation. The Chainer deep-learning framework, released as open source in 2015, became one of the principal deep-learning frameworks globally and was widely used in research and industrial deployments through 2017 to 2019. The Toyota partnership for autonomous driving anchored PFN's commercial credibility, with Toyota investing substantially in PFN through subsequent rounds.
In 2019, PFN transitioned its principal deep-learning framework from Chainer to PyTorch, in coordination with Facebook AI Research (FAIR, now Meta AI) and the broader PyTorch maintainer community. The transition reflected industry consolidation around PyTorch and TensorFlow as the principal deep-learning frameworks.
The 2020 to 2023 period saw PFN diversify into custom AI silicon (MN-Core), continue industrial AI deployments with Toyota, Hitachi, Fanuc, and other partners, and begin Japanese-language foundation-model research. The MN-Core series of custom AI accelerator chips, developed with Japanese semiconductor partners, has been deployed primarily in PFN's internal training infrastructure.
The 2024 to 2026 period has seen PFN's most consequential public-research output. The PLaMo family of Japanese-language foundation models, with PLaMo 8B and PLaMo 100B variants released through 2024 and 2025, has been positioned as the principal commercial Japanese-language foundation model family. The PLaMo Lite variant for edge deployments and continued MN-Core silicon production have extended the industrial-AI scale-out.
Mission and strategy
Preferred Networks' stated mission is to advance the Internet of Things and machine-learning applications for Japanese industrial customers, with emphasis on autonomous-driving and personal-mobility partnerships, industrial automation, and Japanese-language AI capability. The strategic premise reflects the Japanese industrial-AI market positioning, with PFN explicitly positioning itself as the principal Japanese AI company commercializing deep-learning research for industrial customers.
The strategy combines four threads: first, industrial AI deployments with Toyota (autonomous driving, personal mobility), Hitachi (industrial automation), Fanuc (robotics), and other Japanese industrial partners; second, Japanese-language foundation-model research through the PLaMo family; third, custom AI accelerator development through MN-Core; fourth, open-research output through PFN-Hub and broader open-source contributions.
The competitive premise is that the Japanese industrial-AI market, the Toyota strategic partnership, the Japanese-language foundation model capability, and the custom AI silicon vertical integration provide PFN with a durable structural advantage as the principal Japanese commercial AI company.
Distribution channels include direct commercial relationships with Japanese industrial customers, the Toyota strategic partnership for autonomous driving and personal mobility, the open-weights distribution through Hugging Face for selected research outputs, and commercial relationships across the broader Japanese industrial sector.
Models and products
- PLaMo family. PLaMo 8B, PLaMo 100B, PLaMo Lite (edge variant), and other Japanese-language foundation model variants. Selected variants released open-weights through Hugging Face. The principal commercial Japanese-language foundation model family.
- MN-Core. Custom AI accelerator chips developed with Japanese semiconductor partners. MN-Core 2 was released in 2023; the series continues in production deployment within PFN's internal training infrastructure.
- PFN-Hub. Open-weights model distribution through Hugging Face for selected PFN research outputs.
- Chainer. Open-source deep-learning framework released by PFN in 2015; superseded by PyTorch in 2019. Significant historical contribution to the open-source deep-learning ecosystem.
- Industrial AI deployments. Commercial deployments with Toyota (autonomous driving), Hitachi (industrial automation), Fanuc (robotics), and other Japanese industrial partners.
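As an illustrative sketch, the open-weights distribution described above can be browsed programmatically with the `huggingface_hub` client library. The organization handle `pfnet` is an assumption for illustration, not something stated in this article.

```python
# Sketch: listing a Hugging Face organization's open-weights repositories.
# The handle "pfnet" below is an illustrative assumption.
from huggingface_hub import list_models


def open_weights_repos(author: str = "pfnet") -> list[str]:
    """Return the model repo ids published under the given Hugging Face account."""
    return [model.id for model in list_models(author=author)]


if __name__ == "__main__":
    for repo_id in open_weights_repos():
        print(repo_id)
```

Each returned id (e.g. an `author/model-name` string) can then be passed to the usual `transformers` loading utilities for the variants whose weights are openly licensed.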
Benchmarks and standing
Preferred Networks' evaluation framework focuses on Japanese-language AI capability, industrial-AI deployment metrics, and commercial performance metrics rather than horizontal foundation-model leaderboards. The PLaMo family has been consistently characterized in Japanese-language NLP industry coverage as the principal commercial Japanese-language foundation model family, with benchmark performance on Japanese-language tasks measured against multilingual models from frontier AI labs.
The Toyota strategic partnership for autonomous driving has continued to anchor PFN's industrial-AI credibility through 2024 to 2026, with deployment-quality and operational-reliability metrics serving as the principal evaluation framework. The MN-Core custom AI accelerator chips have been characterized in industry coverage as competitive on inference-cost metrics for selected Japanese-language workloads, although standardized comparisons against NVIDIA GPU clusters and other custom-silicon training infrastructure are limited.
PFN's industrial-AI deployments with Hitachi, Fanuc, and other Japanese industrial partners likewise anchor its commercial credibility.
Leadership
As of April 2026, Preferred Networks' senior leadership includes:
- Toru Nishikawa, Co-Founder and Chief Executive Officer. Engineering leader; former University of Tokyo researcher.
- Daisuke Okanohara, Co-Founder and Executive Vice President. Natural-language-processing and machine-learning researcher; one of the principal architects of the PLaMo family.
- Senior research-and-engineering leadership across the principal program areas (PLaMo, MN-Core, industrial AI deployments).
Leadership changes are ongoing. The 2024 to 2026 period has seen continued senior engineering recruitment and engineering scale-out across the PLaMo and MN-Core programs.
Funding and backers
PFN's funding history reflects Japanese strategic-investor backing through multiple rounds. The principal strategic investors include Toyota Motor Corporation (the principal strategic investor through multiple rounds), Mitsui & Co. (Japanese trading house), Hitachi (industrial), NTT (telecommunications), JXTG (now ENEOS, energy), and Fanuc (robotics). Cumulative private capital exceeds $200 million through 2026.
The Japanese strategic-investor base provides PFN with financial-runway certainty. Open questions on near-term financing are limited compared to standalone AI labs, given the Japanese strategic-investor backing and the commercial revenue base from Japanese industrial customers.
Industry position
Preferred Networks occupies a structurally distinctive position as the principal Japanese commercial AI company, with industrial-AI deployments across Toyota, Hitachi, Fanuc, and other Japanese industrial partners, the Japanese-language PLaMo foundation model family, the MN-Core custom AI accelerator chip development, and open-research history through the Chainer deep-learning framework.
Industry coverage has consistently characterized PFN as one of the principal Japanese AI companies, alongside Sakana AI (a founder-led, research-focused lab started by former Google Brain researchers), the NTT and Fujitsu corporate AI research operations, and other Japanese AI initiatives. Through 2024 to 2026, coverage has singled out PFN as the Japanese commercial AI company with the deepest industrial-AI deployments.
Competitive landscape
- Sakana AI. Japanese AI peer with a different research-focused founder-led architecture.
- Toyota Research Institute. Toyota industrial-research peer with overlap on autonomous-mobility and robotics research; PFN's Toyota strategic partnership creates cooperation as well as competition in industrial AI.
- NVIDIA. Silicon and platform peer with overlap on AI accelerators (NVIDIA GPUs vs. PFN's MN-Core), industrial-AI deployments, and the underlying compute platform.
- Aleph Alpha, Mistral AI, Krutrim, G42. Non-US-non-Chinese sovereign and quasi-sovereign AI peers with overlap on the sovereign-AI strategic positioning.
- Hugging Face. Open-research distribution partner; PFN distributes selected open-weights variants through PFN-Hub on Hugging Face.
- NTT, Fujitsu corporate AI research divisions. Japanese corporate AI research peers.
Outlook
- The cadence of PLaMo Japanese-language foundation model releases through 2026 and 2027.
- MN-Core custom AI accelerator production and the 2026 to 2027 capacity commitments.
- The Toyota strategic partnership trajectory and the autonomous-mobility commercial rollout.
- Continued industrial AI deployments with Toyota, Hitachi, Fanuc, and other Japanese industrial partners.
- The competitive dynamic with Sakana AI and other Japanese AI initiatives.
- Continued senior research-talent recruitment and leadership stability through the 2026 to 2027 commercial expansion.
Sources
- Preferred Networks official site. Company reference.
- PFN PLaMo announcement. PLaMo family release announcement.
- PFN MN-Core project page. MN-Core custom AI silicon reference.
- Preferred Networks on Hugging Face. Open-weights model distribution.
- Chainer deep-learning framework. PFN's earlier open-source deep-learning framework.