Timothée Lacroix

Timothée Lacroix is a French computer scientist, born in 1991. He is the co-founder and Chief Technology Officer of Mistral AI, the Paris-headquartered foundation-model company he established in April 2023 with Arthur Mensch and Guillaume Lample, and a co-author on the February 2023 LLaMA paper at Meta AI FAIR Paris. As of May 2026, he leads engineering and infrastructure at Mistral following the September 2025 €2 billion Series C at a €12 billion valuation led by ASML.

Origins

Lacroix was born in 1991 in France. He was admitted to the École Normale Supérieure in Paris (rue d'Ulm) with the 2011 class through the standard entrance examination; the admission is recorded in the JORF decree of December 13, 2011. He studied computer science at ENS through 2015 and spent an exchange period at the University of California, San Diego in 2013.

In 2014, while still a student, Lacroix joined Facebook in London as a software engineering intern on the Graph Search team, the in-product social search feature the company was developing at the time. In 2015 he moved to Facebook AI Research (FAIR) as a Research Engineer, working between the New York and Paris offices. The progression from product engineering to research engineering within the same employer foreshadowed the engineering-focused profile he later carried into Mistral.

Career

Lacroix began doctoral studies in 2016 at École des Ponts ParisTech and Université Paris-Est, in the IMAGINE team of the Laboratoire d'Informatique Gaspard-Monge (LIGM), advised by Guillaume Obozinski. His thesis work, carried out in collaboration with Nicolas Usunier at FAIR, focused on tensor-decomposition methods for link prediction and knowledge-base completion. The thesis "Décompositions tensorielles pour la complétion de bases de connaissance" was deposited in 2020 (thesis number 2020PESC1002). The doctoral work produced two of his most-cited research papers: Canonical Tensor Decomposition for Knowledge Base Completion at ICML 2018 and Tensor Decompositions for Temporal Knowledge Base Completion at ICLR 2020.
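The core idea in this research line is to treat a knowledge base as a third-order tensor of (subject, relation, object) triples and score candidate links with a low-rank factorization. The following is a minimal numpy sketch of CP (canonical polyadic) scoring in that spirit; the dimensions, initialisation, and helper names are illustrative and not taken from the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge base: entities and relations as dense rank-`rank` factors.
# All sizes here are illustrative.
n_entities, n_relations, rank = 5, 2, 4
subj_emb = rng.normal(size=(n_entities, rank))   # subject-role embeddings
rel_emb = rng.normal(size=(n_relations, rank))   # relation embeddings
obj_emb = rng.normal(size=(n_entities, rank))    # object-role embeddings

def cp_score(s: int, r: int, o: int) -> float:
    """CP score for the triple (s, r, o): a trilinear product
    of the subject, relation, and object factors."""
    return float(np.sum(subj_emb[s] * rel_emb[r] * obj_emb[o]))

def rank_objects(s: int, r: int) -> np.ndarray:
    """Rank every candidate object for a (subject, relation) query,
    highest score first -- the link-prediction step."""
    scores = (subj_emb[s] * rel_emb[r]) @ obj_emb.T
    return np.argsort(-scores)
```

In this framing, knowledge-base completion reduces to ranking all objects for a query and evaluating where the held-out true object lands.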

Through and after the PhD, Lacroix continued as a Research Engineer at FAIR Paris, working on tooling and infrastructure for large-scale machine-learning research. He was a co-author on the 2016 TorchCraft library for reinforcement-learning research on real-time strategy games, and on the 2019 PyTorch-BigGraph large-scale graph-embedding system. The graph-embedding work reflected the engineering posture he carried through his FAIR period: building the systems other research teams trained models on top of.

The artifact most widely associated with Lacroix is the LLaMA paper, submitted to arXiv on February 27, 2023 under the title "LLaMA: Open and Efficient Foundation Language Models". The 14-author byline lists Lacroix sixth, behind first author Hugo Touvron, Thibaut Lavril, Gautier Izacard, Xavier Martinet, and Marie-Anne Lachaux, and ahead of Baptiste Rozière. Guillaume Lample is the last-listed senior author. The paper introduced the foundation-model family in 7B, 13B, 33B, and 65B parameter sizes and triggered the wave of open-weights frontier-class language-model releases that followed through 2023 and 2024.

In April 2023, Lacroix left Meta to co-found Mistral AI with Arthur Mensch and Guillaume Lample, taking the Chief Technology Officer role with responsibility for engineering and infrastructure. The €105 million seed round in June 2023, led by Lightspeed Venture Partners with Eric Schmidt, Xavier Niel, and JCDecaux, was among the largest seed rounds in European technology history at the time. Mistral 7B launched on September 27, 2023 as an Apache 2.0 open-weights release, followed by Mixtral 8x7B on December 9, 2023, the first widely deployed sparse mixture-of-experts model from a frontier-class lab.
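The sparse mixture-of-experts design behind Mixtral routes each token to a small subset of expert feed-forward networks rather than running all of them, which is where the inference savings come from. The sketch below shows top-k routing for a single token in numpy; the sizes, the linear "experts", and the function names are illustrative stand-ins, far smaller and simpler than Mixtral's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

d_model, n_experts, top_k = 8, 4, 2   # toy sizes for illustration

# Each "expert" is a tiny linear map; the router is a linear layer.
experts = [rng.normal(size=(d_model, d_model)) / np.sqrt(d_model)
           for _ in range(n_experts)]
router = rng.normal(size=(d_model, n_experts))

def softmax(x: np.ndarray) -> np.ndarray:
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Sparse MoE forward pass for one token: pick the top-k experts by
    router logit, renormalise their gate weights, and mix their outputs.
    Only k of the n_experts weight matrices are ever touched."""
    logits = x @ router
    top = np.argsort(-logits)[:top_k]   # indices of the k best experts
    gates = softmax(logits[top])        # renormalise over selected experts
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))
```

With top_k fixed at 2 of 8 experts, as in Mixtral 8x7B, per-token compute stays close to that of a much smaller dense model while total parameter count grows with the expert pool.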

The Mistral Large flagship line launched in February 2024 alongside the Le Chat consumer assistant. The €600 million Series B closed in June 2024, followed by Mistral Large 2 in July 2024 and the Codestral, Mathstral, and Pixtral lines later in the year. The €2 billion Series C in September 2025 was led by ASML at a €12 billion valuation, and Mistral Large 3 and the Ministral 3 family followed in December 2025, with an $830 million debt financing closing in March 2026 for the Bruyères-le-Châtel facility near Paris and a Sweden site. Lacroix's role across the period has been engineering leadership for the training and inference stack that turns research models into production releases.

Affiliations

  • École Normale Supérieure (rue d'Ulm): Student in computer science, admitted 2011 class.
  • Facebook (London): Software engineering intern, Graph Search team, 2014.
  • Facebook AI Research and Meta AI: Research Engineer, 2015 to April 2023, including the doctoral period 2016 to 2020.
  • École des Ponts ParisTech and Université Paris-Est, IMAGINE team at LIGM: PhD candidate, 2016 to 2020.
  • Mistral AI: Co-founder and Chief Technology Officer, April 2023 to present.

Notable contributions

Lacroix's published record runs from research-infrastructure tooling at FAIR to the tensor-decomposition and knowledge-base-completion line at École des Ponts, to the LLaMA family, to the Mistral and Mixtral models. His Google Scholar profile lists approximately 34,000 citations and an h-index of 22 as of mid-2026.

Investments and boards

  • Mistral AI (AI): Co-founder and Chief Technology Officer, April 2023 to present. Privately held; approximately $4 billion in cumulative equity funding plus $830 million in debt. Per the Bloomberg Billionaires Index in September 2025, Lacroix's stake put his net worth at approximately $1.1 billion, in line with his co-founders.

No public personal angel-investor activity is on record in AI, semiconductors, datacenters, software, or energy outside the founder-and-operator role at Mistral AI as of May 2026.

Network

Lacroix's longest-running professional relationships are with his Mistral co-founders Arthur Mensch (Chief Executive Officer) and Guillaume Lample (Chief Science Officer). Lample was a FAIR Paris colleague with whom the technical collaboration developed across the LLaMA series and the theorem-proving research line. His PhD advisor was Guillaume Obozinski at École des Ponts ParisTech and Université Paris-Est, with Nicolas Usunier at FAIR as the primary co-author on his doctoral tensor-decomposition publications.

The LLaMA-paper byline links him to Hugo Touvron as first author and to several authors who later followed the founders to Mistral, including Marie-Anne Lachaux and Thibaut Lavril. Yann LeCun was the chief AI scientist of Meta during his FAIR period. Among other former FAIR Paris colleagues, Tim Rocktäschel shared the office during the 2018 to 2020 window before leaving for Google DeepMind.

Among broader frontier-research peers, the Mistral founding period has placed Lacroix alongside Sam Altman of OpenAI, Dario Amodei of Anthropic, Demis Hassabis of Google DeepMind, and Aidan Gomez of Cohere on conference panels and partner-event circuits, though he maintains the lowest public profile of the three Mistral co-founders.

Position in the field

As of May 2026, Lacroix is the chief technology officer of the highest-valued European frontier AI lab and a co-author on the foundational open-weights large-language-model paper from his prior employer. The engineering-focused trajectory through FAIR Paris distinguishes his profile from research-leadership cofounders elsewhere in the frontier set.

Press coverage has consistently identified the three Mistral co-founders as filling complementary roles. Mensch is the public face on funding, EU AI policy, and capability claims; Lample carries technical leadership on model research and training methodology; Lacroix focuses on engineering systems and the inference stack that turns research artifacts into production releases. The conference circuit reflects the split: Lacroix's public appearances cluster on engineering-track keynotes and infrastructure-focused podcasts including the October 2023 MLOps Community talk on LLM inference cost and latency, the December 2023 Tech.Rocks Summit interview, and the February 2026 MAD Podcast appearance on Mistral's full-stack sovereign-AI strategy.

The artifact most often cited as evidence of his operational leadership is the cadence of Mistral's open-weights releases through 2023 and 2024. The Mixtral 8x7B launch preceded comparable production-scale MoE releases from US peers, and the Le Chat rollout reached approximately 450,000 customers and over 1,000 enterprise accounts by mid-2025.

Outlook

Open questions over the next 6 to 18 months:

  • Bruyères-le-Châtel and Sweden datacenter execution. The build-out funded by the March 2026 debt financing and the resulting training capacity for 2026 and 2027 model releases.
  • Inference-stack differentiation. Whether Mistral's production inference performance and cost continue to track or beat US peers as Lacroix-led engineering scales the platform.
  • Open-weights versus closed-weights release cadence. Whether Mistral maintains the bifurcated strategy through Mistral Large 4 and the next Ministral generation.
  • Talent pipeline. Continued migration of FAIR Paris and Polytechnique-network engineers to Mistral as the Bruyères-le-Châtel team scales.
  • Founder-team continuity. Whether the three-co-founder C-suite remains intact through the 2026 fundraising and product cycle.

About the author
Nextomoro

AI Research Lab Intelligence

nextomoro tracks progress for AI research labs, models, and what's next.
