Liam Fedus

Liam Fedus is an American computer scientist, the co-founder and chief executive officer of Periodic Labs and a former vice president of research for post-training at OpenAI who led the team behind ChatGPT, GPT-4o, and the o1-series reasoning models.
Liam Fedus is an American computer scientist and machine-learning researcher. He is the co-founder and chief executive officer of Periodic Labs, a San Francisco startup founded in 2025 with co-founder Ekin Dogus Cubuk that pairs AI models with autonomous robotic laboratories to accelerate materials discovery. He was previously vice president of research for post-training at OpenAI from 2022 through March 2025, and before that a senior research scientist at Google Brain from 2020 through 2022. He published academic work as William Fedus and goes by Liam Fedus in industry contexts.

Origins

Fedus is American. He completed undergraduate study at the Massachusetts Institute of Technology from 2006 through 2010, earning a bachelor of science in physics. As an undergraduate at MIT he worked on the Dark Matter Time Projection Chamber project in the Laboratory for Nuclear Science, an experimental directional dark-matter detector designed to identify nuclear recoil tracks from weakly interacting massive particles. This undergraduate research shaped his later attention to physical-experiment instrumentation and his framing of scientific discovery as an experimentally grounded, rather than purely computational, activity.

He completed a master of science in physics at the University of California, San Diego from 2013 through 2016, before transitioning into computer science for his doctoral training. He completed his PhD from 2017 through 2020 at the Université de Montréal, co-supervised by Yoshua Bengio and Hugo Larochelle and in residence at Mila, the Quebec Artificial Intelligence Institute. The Mila period produced his early academic publications on generative adversarial networks, deep reinforcement learning, and the foundations of large-scale neural-network training.

Career

Fedus joined Google Brain after the Mila PhD, working on neural-network architecture and large-scale model efficiency alongside Noam Shazeer and Barret Zoph. The principal artifact of the Google Brain period is the January 2021 Switch Transformer paper with Zoph and Shazeer, published in the Journal of Machine Learning Research in 2022. The paper introduced a simplified mixture-of-experts (MoE) routing scheme and demonstrated that sparse-activation models could be scaled to a trillion parameters with up to a sevenfold pre-training speedup over dense T5 baselines. Switch Transformers became the reference architecture for the open-source MoE language models that emerged in 2023 and 2024.
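The single-expert routing scheme described above can be sketched in a few lines. The following is an illustrative NumPy toy, not the paper's implementation: the dimensions, weight initialization, and function names are invented for the example, and it shows only the core idea that each token is dispatched to exactly one expert, with the expert output scaled by the router probability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (illustrative only, far smaller than the paper's models).
n_tokens, d_model, d_ff, n_experts = 8, 16, 32, 4

x = rng.standard_normal((n_tokens, d_model))          # token activations
w_router = rng.standard_normal((d_model, n_experts))  # router weights
# Each expert is a small two-layer feed-forward network.
experts = [
    (rng.standard_normal((d_model, d_ff)), rng.standard_normal((d_ff, d_model)))
    for _ in range(n_experts)
]

def switch_layer(x):
    """Top-1 ("switch") routing: each token is sent to exactly one expert."""
    logits = x @ w_router
    probs = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)  # softmax
    chosen = probs.argmax(-1)  # one expert index per token
    out = np.zeros_like(x)
    for e, (w1, w2) in enumerate(experts):
        mask = chosen == e
        if mask.any():
            # Only tokens routed to expert e pay its compute cost; the
            # output is scaled by the router probability for that expert.
            h = np.maximum(x[mask] @ w1, 0.0) @ w2  # ReLU feed-forward
            out[mask] = h * probs[mask, e][:, None]
    return out

y = switch_layer(x)
print(y.shape)  # (8, 16)
```

Because only one expert runs per token, total parameters grow with the number of experts while per-token compute stays roughly constant, which is what makes trillion-parameter sparse models tractable.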

In 2022 Fedus joined OpenAI, initially as a senior research scientist on the reinforcement-learning team. He has described his early role as data-flywheel and model-evaluation work on the team that produced ChatGPT, launched in November 2022. The OpenAI GPT-4 contributors page of March 2023 lists him as data flywheel lead, with additional contributions to model-graded evaluation infrastructure and dataset construction.

His responsibility expanded across the post-training stack through 2023 and 2024, encompassing supervised fine-tuning, reinforcement learning from human feedback (RLHF), and reasoning-specific training pipelines. He led post-training research and development for GPT-4o, o1-mini, and o1-preview, and is listed among the principal contributors on the OpenAI o1 contributions page.

In October 2024 Fedus was promoted to vice president of research for post-training, succeeding Barret Zoph, who had departed for Thinking Machines Lab alongside Mira Murati. His VP tenure spanned a period of senior post-training departures, including Luke Metz (also to Thinking Machines Lab).

On March 17, 2025 Fedus announced his own departure from OpenAI to co-found a materials-science AI startup. The departure note posted to X stated his intention to "work closely together as a partner going forward" with OpenAI and to apply post-training methodology to the physical sciences. Within days, multiple venture-capital firms approached the new company, with Felicis Ventures partner Peter Deng cutting the first check and Andreessen Horowitz subsequently leading the seed round.

In September 2025 the new company emerged from stealth as Periodic Labs, with a $300 million seed round at a $1.3 billion post-money valuation led by Andreessen Horowitz and Felicis, plus DST Global, NVIDIA's NVentures arm, Accel, Jeff Bezos, Eric Schmidt, Elad Gil, and Jeff Dean. Fedus serves as chief executive officer alongside Ekin Dogus Cubuk, former chemistry and physics research lead at Google DeepMind and a co-author of the GNoME materials-discovery research line. The pair had met approximately seven years earlier at Google Brain. As of March 2026 the company was reported to be in deal talks at an approximately $7 billion follow-on valuation.

Affiliations

  • Google Brain: Senior research scientist, 2020 to 2022.
  • OpenAI: Senior research scientist through Vice President of Research, Post-Training, 2022 to March 2025.
  • Periodic Labs: Co-founder and chief executive officer, March 2025 to present.

Notable contributions

Fedus's body of public work centers on the Switch Transformer at Google Brain, the post-training pipeline at OpenAI, and the founding of Periodic Labs.

  • Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity (January 2021, JMLR 2022). Lead author with Barret Zoph and Noam Shazeer at Google Brain. Switch Transformer simplified MoE routing into a single-expert selection per token, scaling sparse-activation models to a trillion parameters at up to a sevenfold pre-training speedup over dense baselines. The paper has accumulated over 4,500 citations.
  • ChatGPT post-training and data flywheel (2022). Co-creator of ChatGPT as a member of the small team that converted the underlying foundation model into the deployed conversational product.
  • GPT-4 data flywheel (2023). Listed contributor as data flywheel lead, with additional contributions to evaluation infrastructure and dataset construction.
  • GPT-4o (May 2024). Post-training research and development lead on the multimodal flagship released with image, voice, and text capability in a unified model.
  • o1-preview and o1-mini (September 2024). Post-training research lead on the o-series reasoning models that introduced large-scale reinforcement learning on chain-of-thought reasoning at production scale.
  • Periodic Labs co-founding (March 2025). Co-founder and chief executive officer alongside Ekin Dogus Cubuk. The company has raised $300 million in seed financing at a $1.3 billion valuation, established its San Francisco headquarters, recruited senior post-training and materials-science researchers including Alexandre Passos and Eric Toberer, and stated its first commercial target as new superconducting materials.
  • Public-talk record. "Building an AI Physicist: ChatGPT Co-Creator's Next Venture" on the a16z Podcast with Anjney Midha and Ekin Dogus Cubuk (September 2025); "AI for Atoms" on No Priors with Elad Gil (April 2026); "How to Create an AI Scientist," GPU Mode keynote at Accel (January 2026); and a Q&A with the OpenAI ChatGPT team in MIT Technology Review with Will Douglas Heaven (March 2023), alongside Sandhini Agarwal, Jan Leike, and John Schulman.

Investments and boards

The entries below are limited to AI, semiconductors, datacenters, software, and energy.

  • Periodic Labs (AI): Co-founder and chief executive officer, March 2025 to present. San Francisco-based AI-for-science lab. Cumulative funding $300 million through April 2026 at a $1.3 billion seed-round valuation, with March 2026 reports of deal talks at an approximately $7 billion follow-on valuation.

No other public investor activity on record in AI, semiconductors, datacenters, software, or energy as of May 2026.

Network

Fedus's longest-running professional collaboration is with Ekin Dogus Cubuk, his Periodic Labs co-founder. The pair met at Google Brain approximately seven years before the Periodic Labs founding and developed the AI-scientist thesis through ongoing conversations during Fedus's OpenAI period and Cubuk's Google DeepMind period. Cubuk's GNoME materials-discovery research line at DeepMind, which produced approximately 2.2 million novel stable crystals, is the materials-science counterpart to Fedus's post-training research.

His Google Brain period from 2020 through 2022 produced the Switch Transformer collaboration with Barret Zoph and Noam Shazeer. Both co-authors followed parallel trajectories: Shazeer co-founded Character.AI in 2021 and returned to Google in 2024, and Zoph followed Fedus to OpenAI before founding Thinking Machines Lab with Mira Murati. Yoshua Bengio and Hugo Larochelle, his Mila co-advisors, anchor the Mila research network that has supplied senior researchers across the contemporary frontier-lab cohort.

The OpenAI period from 2022 through March 2025 produced his closest peer relationships with Sam Altman, Greg Brockman, Ilya Sutskever, Mira Murati, John Schulman, Lilian Weng, Andrej Karpathy, Sandhini Agarwal, and Jan Leike. The reasoning-model research line connected him to senior staff including Jakub Pachocki, Jerry Tworek, and Mark Chen on the o-series program.

Among Insurgent-lab founder peers, his Periodic Labs position runs in parallel with Mira Murati and Barret Zoph at Thinking Machines Lab, Ilya Sutskever at Safe Superintelligence, Misha Laskin at Reflection AI, and Jason Warner at poolside, with closer thematic adjacency to Isomorphic Labs's drug-discovery framing.

Position in the field

As of May 2026, Fedus occupies a structurally distinctive position among Insurgent-lab chief executives through the combination of the MIT physics undergraduate background, the Mila PhD under Bengio and Larochelle, the Switch Transformer lead authorship at Google Brain, the post-training leadership at OpenAI through the ChatGPT, GPT-4o, and o-series releases, and the AI-for-physical-sciences thesis at Periodic Labs.

Industry coverage has consistently characterized Periodic Labs under his leadership as the principal AI-for-scientific-discovery startup outside of Google DeepMind's AlphaFold and Isomorphic Labs's drug-design programs, and as the central commercial vehicle for the closed-loop autonomous-laboratory architecture that pairs frontier AI models with robotic experiment execution. The September 2025 seed round was the largest disclosed seed round in venture-capital history at the time, exceeded subsequently by Mira Murati's Thinking Machines Lab and by Periodic Labs's own reported follow-on talks.

The MIT-physics-to-frontier-AI arc has parallels in the senior frontier-research cohort. John Schulman (Caltech physics undergrad, Berkeley EECS PhD), John Jumper (Vanderbilt physics undergrad, Chicago theoretical chemistry PhD), and Misha Laskin (Yale physics undergrad, Chicago theoretical many-body physics PhD) share the trajectory of physical-sciences quantitative training preceding a transition into deep learning. Fedus's record is distinctive in the explicit return to the physical sciences as the commercial application of frontier-AI methodology.

Outlook

Open questions over the next 6 to 18 months:

  • Follow-on financing. Whether the reported deal talks at an approximately $7 billion valuation close, the lead investor, and the capital deployment plan for autonomous-laboratory infrastructure scaling.
  • First demonstrated superconductor result. Whether Periodic Labs produces a publicly disclosed superconducting-materials discovery from the autonomous-lab platforms, the timing of the first publication, and the magnitude of the capability claim.
  • Application-area expansion. Whether the company expands beyond the publicly stated superconductor focus to additional materials, energy-storage, semiconductor, or pharmaceutical applications, and the structure of any commercial partnerships in those areas.
  • OpenAI partnership. Whether the announced partnership and OpenAI investment in Periodic Labs produces specific technology transfer or joint research output, distinct from the standard ex-employee venture relationship.
  • Senior-talent recruitment cadence. Continued movement of post-training and reasoning-research staff from OpenAI, Google DeepMind, and Anthropic into Periodic Labs's research and engineering organizations.
  • Public-commentary cadence. Frequency and substance of conference appearances, technical publications, and AI-policy positions on the AI-for-science framing as Periodic Labs moves from stealth to first commercial results.

About the author
Nextomoro

nextomoro tracks progress for AI research labs, models, and what's next.

AI Research Lab Intelligence
