Guillaume Lample
Guillaume Lample is a French computer scientist, born October 8, 1990, in Brest, France. He is the co-founder and Chief Science Officer of Mistral AI, the Paris-headquartered foundation-model company he established in April 2023 with Arthur Mensch and Timothée Lacroix, and the last-listed senior author on the February 2023 LLaMA paper from Meta AI's FAIR Paris lab. As of May 2026, he leads model research at Mistral following the September 2025 €1.7 billion Series C led by ASML at a valuation of roughly €11.7 billion.
At a glance
- Education: Ingénieur degree, École Polytechnique (class of 2011); Master of Language Technologies, Carnegie Mellon University Language Technologies Institute (2017); PhD in computer science (CIFRE thesis), Sorbonne Université (UPMC) and Facebook AI Research, 2016 to 2019, advised by Ludovic Denoyer in the LIP6 MLIA team.
- Current role: Co-founder and Chief Science Officer of Mistral AI since April 2023.
- Key contributions: last-listed senior author on LLaMA: Open and Efficient Foundation Language Models (February 2023); co-author on Phrase-Based & Neural Unsupervised Machine Translation (EMNLP 2018) and Cross-lingual Language Model Pretraining (XLM, 2019); technical leadership on Mistral 7B, Mixtral 8x7B, and successor model families.
- X / Twitter: @GuillaumeLample
- LinkedIn: guillaume-lample-7821095b
- GitHub: glample
- Google Scholar: H7sVDmIAAAAJ
Origins
Lample was born on October 8, 1990, in Brest, a port city in Brittany on the western coast of France. He entered the engineer's degree program at École Polytechnique in the class of 2011, during the same period that his future Mistral co-founder Arthur Mensch was a Polytechnique student. The third co-founder, Timothée Lacroix, entered École Normale Supérieure the same year; the three came together later through the overlapping Paris machine-learning research community around FAIR Paris and Inria.
After Polytechnique, Lample completed a Master of Language Technologies at Carnegie Mellon University's School of Computer Science, advised by Alex Waibel and Chris Dyer. The most-cited project from the period was Arnold, a Doom-playing reinforcement-learning agent co-developed with fellow CMU graduate student Devendra Chaplot, which won the Track 1 prize at the 2016 ViZDoom AI Competition and produced the Playing FPS Games with Deep Reinforcement Learning paper.
Career
Lample joined Facebook AI Research (FAIR) in Paris in 2016 as a doctoral researcher under a CIFRE arrangement, the French industry-academic doctoral funding mechanism, and pursued his PhD in parallel at Sorbonne Université (then UPMC) in the LIP6 Machine Learning and Information Access (MLIA) team under Ludovic Denoyer. The thesis, "Unsupervised Machine Translation", was defended on October 17, 2019, with Marc'Aurelio Ranzato of FAIR on the jury. The thesis line generated three of his most-cited papers: Unsupervised Machine Translation Using Monolingual Corpora Only (ICLR 2018), Phrase-Based & Neural Unsupervised Machine Translation (EMNLP 2018, Best Paper Award), and Cross-lingual Language Model Pretraining (the XLM paper, 2019), each co-authored with Alexis Conneau at FAIR.
Lample stayed at FAIR Paris as a research scientist after completing the PhD and continued work across multilingual NLP, language modeling, and a side line in symbolic mathematics. The Deep Learning for Symbolic Mathematics paper at ICLR 2020 with François Charton showed that sequence-to-sequence transformers could outperform commercial computer-algebra systems including Matlab and Mathematica on symbolic integration and ODE solving.
The artifact most widely associated with Lample is the LLaMA paper, submitted to arXiv on February 27, 2023 and titled "LLaMA: Open and Efficient Foundation Language Models". The 14-author byline lists Lample last (the position conventionally reserved for the senior author setting overall research direction), with Hugo Touvron as first author. The paper introduced the foundation-model family in 7B, 13B, 33B, and 65B parameter sizes, established the public-training-data-only, open-weights, research-license posture that defined the LLaMA series, and triggered the wave of open-weights frontier-class language models that followed through 2023 and 2024. Several future Mistral arrivals appear on the byline, including Lacroix in sixth position, Baptiste Rozière, Marie-Anne Lachaux, and Thibaut Lavril.
Lample left Meta in early 2023 and co-founded Mistral AI in Paris with Arthur Mensch and Timothée Lacroix on April 28, 2023, taking the Chief Science Officer title. The €105 million seed round in June 2023, led by Lightspeed Venture Partners with participation from Eric Schmidt, Xavier Niel, and JCDecaux, was among the largest seed rounds in European technology history at the time. Mistral 7B launched on September 27, 2023 as an Apache 2.0 open-weights release, followed by Mixtral 8x7B on December 9, 2023, the first widely deployed sparse mixture-of-experts model from a frontier-class lab.
The Mistral Large flagship line launched in February 2024 alongside the Le Chat consumer assistant. The €600 million Series B closed in June 2024, followed by Mistral Large 2 in July 2024 and the Codestral code, Mathstral math, and Pixtral multimodal lines later in the year. The €1.7 billion Series C in September 2025 was led by ASML at a valuation of roughly €11.7 billion, and Mistral Large 3 and the Ministral 3 family followed in December 2025. Across this period, Lample's role has been research leadership over Mistral's training pipeline and successor model architectures.
Affiliations
- Carnegie Mellon University, Language Technologies Institute: Master of Language Technologies student, 2014 to 2017.
- Sorbonne Université (UPMC), LIP6 MLIA team: PhD candidate (CIFRE), 2016 to 2019.
- Meta AI (FAIR Paris): Doctoral researcher (CIFRE), then Research Scientist, 2016 to early 2023.
- Mistral AI: Co-founder and Chief Science Officer, April 2023 to present.
Notable contributions
Lample's published record runs from reinforcement-learning game-playing agents at CMU to the multilingual-NLP and unsupervised-machine-translation line at FAIR, to symbolic-mathematics deep learning, to the LLaMA family, to the Mistral and Mixtral models. He has approximately 64,000 Google Scholar citations and an h-index of 41 as of mid-2026.
- LLaMA: Open and Efficient Foundation Language Models (February 2023). Last-listed senior author on the 14-author paper introducing the LLaMA model family. The paper is the most-cited artifact in his record at over 27,000 citations and triggered the wave of open-weights frontier-class language-model releases.
- Mistral AI founding (April 2023). Co-founder and Chief Science Officer.
- Mistral 7B (September 2023). Foundational open-weights release under Apache 2.0; co-author on the Mistral 7B technical paper.
- Mixtral 8x7B (December 2023). First widely deployed sparse mixture-of-experts model from a frontier-class lab.
- Mistral Large family (February 2024 to present). Closed-weights flagship line, including Mistral Large 2 (July 2024) and Mistral Large 3 (December 2025).
- Cross-lingual NLP and unsupervised machine translation line (2017 to 2019). Unsupervised Machine Translation Using Monolingual Corpora Only (ICLR 2018), Phrase-Based & Neural Unsupervised Machine Translation (EMNLP 2018, Best Paper Award), and Cross-lingual Language Model Pretraining (the XLM paper, 2019). The line established methods for translating without parallel training data and seeded the multilingual-encoder direction that followed.
- Deep Learning for Symbolic Mathematics (ICLR 2020). Co-author with François Charton of the paper showing transformer models could outperform Matlab and Mathematica on symbolic integration and ODEs.
- Neural Architectures for Named Entity Recognition (NAACL 2016). The CMU-period BiLSTM-CRF paper, an early NER reference with over 6,000 citations.
- Playing FPS Games with Deep Reinforcement Learning (AAAI 2017). The Arnold Doom-playing agent paper from CMU.
Investments and boards
- Mistral AI (AI): Co-founder and Chief Science Officer, April 2023 to present. Privately held; approximately $4 billion in cumulative equity funding plus $830 million in debt. Lample crossed the billionaire threshold alongside his co-founders in 2024 per Challenges magazine.
No public personal angel-investor activity is on record in AI, semiconductors, datacenters, software, or energy outside the founder-and-operator role at Mistral AI as of May 2026.
Network
Lample's longest-running professional relationships are with his Mistral co-founders Arthur Mensch (Chief Executive Officer) and Timothée Lacroix (Chief Technology Officer). Lample and Mensch overlapped at École Polytechnique in the 2011 class; Lacroix was at École Normale Supérieure the same year. The three came together professionally through the Paris machine-learning research community at FAIR Paris, Inria, and Sorbonne / LIP6. His PhD advisor was Ludovic Denoyer at Sorbonne / LIP6, and his master's-era CMU advisors were Alex Waibel and Chris Dyer. His most consistent FAIR research collaborator was Alexis Conneau, with whom he co-authored the unsupervised-machine-translation and XLM lines.
The LLaMA-paper byline links him to first author Hugo Touvron and to several authors who later followed him to Mistral, including Baptiste Rozière, Marie-Anne Lachaux, and Thibaut Lavril. Press coverage in 2025 noted that 11 of the 14 LLaMA-paper authors had left Meta by mid-2025, with five joining Mistral. Yann LeCun was Meta's chief AI scientist throughout Lample's FAIR period.
Among broader frontier-research peers, the Mistral founding period has placed Lample alongside Sam Altman of OpenAI, Dario Amodei of Anthropic, Demis Hassabis of Google DeepMind, Aidan Gomez of Cohere, and Ilya Sutskever of Safe Superintelligence on conference and policy panels, though his public-facing presence is substantially lower-profile than Mensch's.
Position in the field
As of May 2026, Lample sits among a small group of senior research-leadership figures who moved from a frontier industrial lab into a chief-science role at an independent venture-backed company. The LLaMA senior authorship is the structurally distinguishing credential: he is the only named co-founder of a current frontier-class lab who appears as senior author on a foundational open-weights LLM paper from his prior employer.
Press coverage has consistently identified the three Mistral co-founders as filling complementary roles. Mensch is the public face on funding strategy, EU AI policy, and capability claims; Lacroix focuses on infrastructure and engineering systems; Lample carries technical leadership on model research and the training pipeline. Coverage has described Lample as research-focused and less media-facing than Mensch, with public communications channeled through product launches, X commentary, and a small number of conference and podcast appearances.
The artifact most often cited as evidence of Lample's continued technical leadership is the Mixtral 8x7B sparse mixture-of-experts release, which preceded comparable production-scale MoE releases from US-domiciled peers. Successor releases including Mistral Large 2 and Mistral Large 3 place Mistral below the latest US flagships on closed-weights benchmarks but ahead of any other European frontier-class model line.
Outlook
Open questions over the next 6 to 18 months:
- Mistral Large 3 and successor benchmark performance. Whether Mistral closes the capability gap to the latest OpenAI, Anthropic, and Google DeepMind flagships once third-party leaderboards complete their evaluations.
- Open-weights versus closed-weights research direction. Whether Lample-led model research continues to produce open-weights releases at frontier capability or shifts toward closed-weights flagship differentiation.
- Bruyères-le-Châtel and Sweden compute build-out. Training capacity available to the research team through 2026 and 2027, and the resulting cadence of model releases.
- Talent pipeline from FAIR Paris. Whether the migration of former FAIR LLaMA-team researchers to Mistral continues.
- Research-paper output. Whether Mistral publishes technical papers on its frontier training methods at the cadence of US peers, or maintains a more product-focused publication posture.
Sources
- Guillaume Lample on Wikidata. Wikidata record covering birth, education, and identifiers.
- LLaMA: Open and Efficient Foundation Language Models. The February 2023 Meta AI paper introducing the LLaMA model family, with Lample as the last-listed senior author.
- Mistral 7B. The September 2023 Mistral AI announcement of the foundational open-weights release.
- Mixtral of Experts. The December 2023 Mistral AI announcement of Mixtral 8x7B.
- Mistral 7B paper. The October 2023 Mistral 7B technical paper.
- Phrase-Based & Neural Unsupervised Machine Translation. The EMNLP 2018 Best Paper, a flagship FAIR-period unsupervised-MT result with Conneau, Ott, Denoyer, and Ranzato.
- Cross-lingual Language Model Pretraining. The 2019 XLM paper introducing cross-lingual masked-language-model pretraining.
- Deep Learning for Symbolic Mathematics. The ICLR 2020 paper with François Charton on transformer models for symbolic integration and ODE solving.
- Neural Architectures for Named Entity Recognition. The NAACL 2016 BiLSTM-CRF NER paper.
- Playing FPS Games with Deep Reinforcement Learning. The 2017 AAAI Doom-agent paper from the CMU master's period, with Devendra Chaplot.
- Guillaume Lample's CMU LTI alumni page. Carnegie Mellon Language Technologies Institute alumni record for the 2017 master's degree.
- Guillaume Lample's PhD thesis record at LIP6. Sorbonne / LIP6 record of the October 2019 PhD defense, advisor, and team affiliation.
- Guillaume Lample's Google Scholar profile. Citation metrics, h-index, and chronological publication record.
- Mistral 7B's secrets: a talk by Guillaume Lample. November 2023 Photoroom X-IA talk on the Mistral 7B training methodology.
- Mistral's 3 founders Timothée Lacroix, Arthur Mensch and Guillaume Lample become first AI billionaires in France. Crain Currency coverage of the founders' billionaire status per Challenges magazine.
- Meta's Llama AI team has been bleeding talent. May 2025 coverage of LLaMA-team departures from Meta to Mistral, Anthropic, Google DeepMind, and others.
- The AI 'genius' behind Mistral's meteoric rise. Sifted profile of Lample and his role at Mistral.
- Mistral AI. Wikipedia entry on the company, including funding rounds, model releases, and partnership history.