Timothée Lacroix
Timothée Lacroix is a French computer scientist, born in 1991. He is the co-founder and Chief Technology Officer of Mistral AI, the Paris-headquartered foundation-model company he established in April 2023 with Arthur Mensch and Guillaume Lample, and a co-author on the February 2023 LLaMA paper at Meta AI FAIR Paris. As of May 2026, he leads engineering and infrastructure at Mistral following the September 2025 €2 billion Series C at a €12 billion valuation led by ASML.
At a glance
- Education: École Normale Supérieure (rue d'Ulm), admitted in the 2011 class in computer science; exchange period at the University of California, San Diego in 2013; PhD in computer science at École des Ponts ParisTech and Université Paris-Est, IMAGINE team at LIGM, 2016 to 2020, advised by Guillaume Obozinski.
- Current role: Co-founder and Chief Technology Officer of Mistral AI since April 2023.
- Key contributions: co-author of the LLaMA paper (February 2023); first author of Canonical Tensor Decomposition for Knowledge Base Completion (ICML 2018); engineering and infrastructure leadership on Mistral 7B, Mixtral 8x7B, and the production training pipeline.
- X / Twitter: @tlacroix6
- LinkedIn: timothee-lacroix-59517977
- Google Scholar: tZGS6dIAAAAJ
- OpenReview: Timothee_Lacroix1
Origins
Lacroix was born in 1991 in France. He was admitted to École Normale Supérieure in Paris (the rue d'Ulm campus) in the 2011 class through the standard entrance examination, with the admission recorded in the December 13, 2011 JORF decree. He studied computer science at ENS through 2015 and spent an exchange period at the University of California, San Diego in 2013.
In 2014, while still a student, Lacroix joined Facebook in London as a software engineering intern on the Graph Search team, the in-product social search feature the company was developing at the time. He continued in 2015 as a Research Engineer at Facebook AI Research (FAIR), between the New York and Paris offices. The progression from product engineering to research engineering inside the same employer foreshadowed the engineering-focused profile he carried into Mistral.
Career
Lacroix began doctoral studies in 2016 at École des Ponts ParisTech and Université Paris-Est, in the IMAGINE team of the Laboratoire d'Informatique Gaspard-Monge (LIGM), advised by Guillaume Obozinski. The thesis line, in collaboration with Nicolas Usunier at FAIR, focused on tensor-decomposition methods for link prediction and knowledge-base completion. The thesis "Décompositions tensorielles pour la complétion de bases de connaissance" was deposited in 2020 (thesis number 2020PESC1002). The doctoral work produced two of his most-cited research papers: Canonical Tensor Decomposition for Knowledge Base Completion at ICML 2018 and Tensor Decompositions for Temporal Knowledge Base Completion at ICLR 2020.
During and after the PhD, Lacroix continued as a Research Engineer at FAIR Paris, working on tooling and infrastructure for large-scale machine-learning research. He was a co-author on the 2016 TorchCraft library for reinforcement-learning research on real-time strategy games, and on the 2019 PyTorch-BigGraph large-scale graph-embedding system. The graph-embedding work reflected the engineering posture he carried through his FAIR period: building the systems other research teams trained models on top of.
The artifact most widely associated with Lacroix is the LLaMA paper, submitted to arXiv on February 27, 2023 under the title "LLaMA: Open and Efficient Foundation Language Models". The 14-author byline lists Lacroix sixth, behind first author Hugo Touvron, Thibaut Lavril, Gautier Izacard, Xavier Martinet, and Marie-Anne Lachaux, and ahead of Baptiste Rozière. Guillaume Lample is the last-listed senior author. The paper introduced the foundation-model family in 7B, 13B, 33B, and 65B parameter sizes and triggered the wave of open-weights frontier-class language-model releases that followed through 2023 and 2024.
In April 2023, Lacroix left Meta to co-found Mistral AI with Arthur Mensch and Guillaume Lample, taking the Chief Technology Officer title and engineering-and-infrastructure leadership. The €105 million seed round in June 2023, led by Lightspeed Venture Partners with Eric Schmidt, Xavier Niel, and JCDecaux, was among the largest seed rounds in European technology history at the time. Mistral 7B launched on September 27, 2023 as an Apache 2.0 open-weights release, followed by Mixtral 8x7B on December 9, 2023, the first widely deployed sparse mixture-of-experts model from a frontier-class lab.
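The "sparse mixture-of-experts" design behind Mixtral 8x7B can be illustrated with a toy routing step: a router scores every expert for each token, but only the top two actually run, and their outputs are mixed with softmax-normalized weights. The sketch below is purely illustrative (scalar inputs and made-up helper names, not Mistral's implementation):

```python
# Toy sketch of top-2 sparse mixture-of-experts routing (illustrative
# only; real MoE layers operate on hidden vectors with learned routers).
import math

def softmax(xs):
    # Numerically stable softmax over a list of logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(token, router_scores, experts, k=2):
    """Route one token to the top-k experts and mix their outputs.

    token         -- scalar input (stand-in for a hidden vector)
    router_scores -- one gating logit per expert for this token
    experts       -- list of callables, one per expert
    """
    # Indices of the k largest router logits.
    topk = sorted(range(len(router_scores)),
                  key=lambda i: router_scores[i], reverse=True)[:k]
    # Softmax over only the selected logits, as in sparse MoE layers.
    weights = softmax([router_scores[i] for i in topk])
    # Only k experts execute; the rest are skipped (the "sparse" part).
    return sum(w * experts[i](token) for w, i in zip(weights, topk))

# Eight toy experts, each scaling its input by a different factor.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
out = moe_forward(2.0, [0.1, 3.0, 0.2, 3.0, -1.0, 0.0, 0.5, 0.3], experts)
```

With the logits above, experts 1 and 3 tie for the top two slots and are mixed with equal weight, so only two of the eight experts ever run; this is why an 8x7B model can have far lower inference cost than a dense model of the same total parameter count.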
The Mistral Large flagship line launched in February 2024 alongside the Le Chat consumer assistant. The €600 million Series B closed in June 2024, followed by Mistral Large 2 in July 2024 and the Codestral, Mathstral, and Pixtral lines later in the year. The €2 billion Series C in September 2025 was led by ASML at a €12 billion valuation. Mistral Large 3 and the Ministral 3 family followed in December 2025, and an $830 million debt financing closed in March 2026 for the Bruyères-le-Châtel facility near Paris and a Sweden site. Lacroix's role across the period has been engineering leadership for the training and inference stack that turns research models into production releases.
Affiliations
- École Normale Supérieure (rue d'Ulm): Student in computer science, admitted 2011 class.
- Facebook (London): Software engineering intern, Graph Search team, 2014.
- Facebook AI Research and Meta AI: Research Engineer, 2015 to April 2023, including the doctoral period 2016 to 2020.
- École des Ponts ParisTech and Université Paris-Est, IMAGINE team at LIGM: PhD candidate, 2016 to 2020.
- Mistral AI: Co-founder and Chief Technology Officer, April 2023 to present.
Notable contributions
Lacroix's published record runs from research-infrastructure tooling at FAIR to the tensor-decomposition and knowledge-base-completion line at École des Ponts, to the LLaMA family, to the Mistral and Mixtral models. His Google Scholar profile lists approximately 34,000 citations and an h-index of 22 as of mid-2026.
- LLaMA: Open and Efficient Foundation Language Models (February 2023). Sixth-listed author on the 14-author paper introducing the LLaMA model family. The paper is the most-cited artifact in his record at over 27,000 citations and triggered the wave of open-weights frontier-class language-model releases.
- Mistral AI founding (April 2023). Co-founder and Chief Technology Officer.
- Mistral 7B (September 2023). Foundational open-weights release under Apache 2.0; co-author on the Mistral 7B technical paper.
- Mixtral 8x7B (December 2023). First widely deployed sparse mixture-of-experts model from a frontier-class lab.
- Mistral Large family (February 2024 to present). Closed-weights flagship line, including Mistral Large 2 (July 2024) and Mistral Large 3 (December 2025).
- Canonical Tensor Decomposition for Knowledge Base Completion (ICML 2018). First-authored doctoral paper introducing a regularizer based on tensor nuclear p-norms and a problem reformulation that achieved state-of-the-art results on knowledge-base-completion benchmarks.
- Tensor Decompositions for Temporal Knowledge Base Completion (ICLR 2020). Extension of the canonical-decomposition line to time-varying knowledge graphs; a primary chapter of his doctoral thesis.
- PyTorch-BigGraph: A Large-scale Graph Embedding System (SysML 2019). FAIR system paper on distributed graph-embedding training over large knowledge bases, with Adam Lerer and others.
- TorchCraft: a Library for Machine Learning Research on Real-Time Strategy Games (2016). FAIR research-tooling paper on a StarCraft research environment.
- Theorem-proving line (2022 to 2023). Co-author with Guillaume Lample on HyperTree Proof Search for Neural Theorem Proving (NeurIPS 2022) and Draft, Sketch, and Prove (ICLR 2023, top-5% notable).
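The canonical-decomposition line in the bullets above admits a compact summary (notation simplified here; the paper additionally weights the regularizer by the empirical frequencies of the training triples). A knowledge base of (subject, predicate, object) triples is viewed as a binary third-order tensor, a triple's score is a rank-D CP (canonical polyadic) product of entity and predicate factors, and the ICML 2018 regularizer is a tractable weighted surrogate of the tensor nuclear 3-norm:

```latex
% Knowledge base as a third-order tensor X, scored by rank-D CP factors
X_{ijk} \;\approx\; \sum_{d=1}^{D} u_{id}\, v_{jd}\, w_{kd}

% Nuclear 3-norm (N3) surrogate over the factor columns
\Omega(U, V, W) \;=\; \frac{\lambda}{3} \sum_{d=1}^{D}
  \left( \lVert u_{\cdot d} \rVert_3^3
       + \lVert v_{\cdot d} \rVert_3^3
       + \lVert w_{\cdot d} \rVert_3^3 \right)
```

The "canonical" in the title refers to this CP form; the accompanying reformulation treats each (subject, predicate, ?) query as a full multiclass prediction over all entities rather than a binary classification over sampled negatives.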
Investments and boards
- Mistral AI (AI): Co-founder and Chief Technology Officer, April 2023 to present. Privately held; approximately $4 billion in cumulative equity funding plus $830 million in debt. Per the Bloomberg Billionaires Index in September 2025, Lacroix's stake puts his net worth at approximately $1.1 billion, alongside his co-founders.
As of May 2026, no personal angel investments in AI, semiconductors, datacenters, software, or energy are on public record outside his founder-and-operator role at Mistral AI.
Network
Lacroix's longest-running professional relationships are with his Mistral co-founders Arthur Mensch (Chief Executive Officer) and Guillaume Lample (Chief Science Officer). Lample was the FAIR Paris colleague with whom his technical collaboration developed, spanning the LLaMA series and the theorem-proving research line. His PhD advisor was Guillaume Obozinski at École des Ponts ParisTech and Université Paris-Est, with Nicolas Usunier at FAIR as the primary co-author on his doctoral tensor-decomposition publications.
The LLaMA-paper byline links him to Hugo Touvron as first author and to several authors who later followed the founders to Mistral, including Marie-Anne Lachaux and Thibaut Lavril. Yann LeCun was the chief AI scientist of Meta during his FAIR period. Among other former FAIR Paris colleagues, Tim Rocktäschel shared the office during the 2018 to 2020 window before leaving for Google DeepMind.
Among broader frontier-research peers, the Mistral founding period has placed Lacroix alongside Sam Altman of OpenAI, Dario Amodei of Anthropic, Demis Hassabis of Google DeepMind, and Aidan Gomez of Cohere on conference panels and partner-event circuits, though he keeps the lowest public profile of the three Mistral co-founders.
Position in the field
As of May 2026, Lacroix is the chief technology officer of the highest-valued European frontier AI lab and a co-author on the foundational open-weights large-language-model paper from his prior employer. The engineering-focused trajectory through FAIR Paris distinguishes his profile from the research-leadership co-founders elsewhere in the frontier set.
Press coverage has consistently identified the three Mistral co-founders as filling complementary roles. Mensch is the public face on funding, EU AI policy, and capability claims; Lample carries technical leadership on model research and training methodology; Lacroix focuses on engineering systems and the inference stack that turns research artifacts into production releases. The conference circuit reflects the split: Lacroix's public appearances cluster on engineering-track keynotes and infrastructure-focused podcasts including the October 2023 MLOps Community talk on LLM inference cost and latency, the December 2023 Tech.Rocks Summit interview, and the February 2026 MAD Podcast appearance on Mistral's full-stack sovereign-AI strategy.
The artifact most often cited as evidence of his operational leadership is the cadence of Mistral's open-weights releases through 2023 and 2024. The Mixtral 8x7B launch preceded comparable production-scale MoE releases from US peers, and the Le Chat rollout reached approximately 450,000 customers and over 1,000 enterprise accounts by mid-2025.
Outlook
Open questions over the next 6 to 18 months:
- Bruyères-le-Châtel and Sweden datacenter execution. The build-out funded by the March 2026 debt financing and the resulting training capacity for 2026 and 2027 model releases.
- Inference-stack differentiation. Whether Mistral's production inference performance and cost continue to track or beat US peers as Lacroix-led engineering scales the platform.
- Open-weights versus closed-weights release cadence. Whether Mistral maintains the bifurcated strategy through Mistral Large 4 and the next Ministral generation.
- Talent pipeline. Continued migration of FAIR Paris and Polytechnique-network engineers to Mistral as the Bruyères-le-Châtel team scales.
- Founder-team continuity. Whether the three-co-founder C-suite remains intact through the 2026 fundraising and product cycle.
Sources
- Timothée Lacroix's Google Scholar profile. Citation metrics, h-index, and chronological publication record.
- Timothée Lacroix on OpenReview. Career and education record covering the FAIR PhD-student and Research Engineer roles, with Guillaume Obozinski listed as PhD advisor.
- LLaMA: Open and Efficient Foundation Language Models. The February 2023 Meta AI paper introducing the LLaMA model family, with Lacroix as the sixth-listed author.
- Décompositions tensorielles pour la complétion de bases de connaissance. Lacroix's 2020 PhD thesis at École des Ponts ParisTech and Université Paris-Est, IMAGINE team at LIGM.
- Canonical Tensor Decomposition for Knowledge Base Completion. The first-authored ICML 2018 paper from the doctoral period.
- Tensor Decompositions for Temporal Knowledge Base Completion. The 2020 ICLR paper extending the canonical-decomposition work to time-varying graphs.
- PyTorch-BigGraph: A Large-scale Graph Embedding System. The 2019 SysML paper on distributed graph-embedding training, a primary FAIR-period research-engineering artifact.
- Mistral 7B. The September 2023 Mistral AI announcement of the foundational open-weights release.
- Mixtral of Experts. The December 2023 Mistral AI announcement of Mixtral 8x7B.
- Mistral 7B paper. The October 2023 Mistral 7B technical paper.
- Mistral AI, start-up cofondée par deux X2011 et un normalien, lève 385 millions d'euros. École Polytechnique alumni publication identifying Lacroix as the normalien co-founder, distinct from the two X2011 alumni.
- Arrêté du 13 décembre 2011 portant nomination d'élèves à l'Ecole normale supérieure (session 2011). The Légifrance government decree recording Lacroix's admission to ENS Paris in the 2011 class.
- Exploring the Latency/Throughput & Cost Space for LLM Inference. October 2023 MLOps Community talk by Lacroix on LLM inference performance and economics.
- Exceptional interview with Timothée Lacroix, CTO of Mistral by Charles Gorintin. December 2023 Tech.Rocks Summit interview.
- Mistral AI vs. Silicon Valley: The Rise of Sovereign AI. February 2026 MAD Podcast with Matt Turck appearance, Lacroix's first US podcast appearance.
- Mistral's 3 founders Timothée Lacroix, Arthur Mensch and Guillaume Lample become first AI billionaires in France. Crain Currency coverage of the founders' billionaire status per the Bloomberg Billionaires Index.
- Mistral AI. Wikipedia entry on the company, including funding rounds, model releases, and partnership history.