Mathias Lechner

Mathias Lechner is an Austrian computer scientist and AI researcher, co-founder and chief technology officer of Liquid AI, the 2023 MIT CSAIL spinout building Liquid Foundation Models on the liquid-neural-network architecture. He is a co-author on the foundational Liquid Time-Constant Networks paper that anchors the company's research thesis, and a research affiliate at MIT CSAIL collaborating with Daniela Rus on robust and verifiable machine learning.

At a glance

  • Education: BSc in computer science, Vienna University of Technology (TU Wien), 2016; MSc in computer science, TU Wien, 2017, master's thesis "Brain-inspired Neural Control" advised by Radu Grosu; PhD in computer science, Institute of Science and Technology Austria (ISTA), May 2022. Doctoral dissertation: "Learning Verifiable Representations," advised by Thomas A. Henzinger.
  • Current roles: Co-founder and chief technology officer at Liquid AI (since 2023); research affiliate at MIT CSAIL.
  • Key contributions: Liquid time-constant networks; closed-form continuous-time neural networks; neural circuit policies; verifiable and certifiable neural network research; Liquid AI's Liquid Foundation Models.
  • Awards: TU Wien Distinguished Young Alumnus Award (2018); F1-Tenth Autonomous Racing Grand Prix, IFAC 2020 (team co-lead); Outstanding Reviewer, ICRA (2021); Hyperion Research HPC Innovation Excellence Award (2022, with Ramin Hasani); ISTA Outstanding Scientific Achievement Award (2023, with Djordje Zikelic).
  • X: @mlech26l
  • LinkedIn: Mathias Lechner
  • GitHub: mlech26l
  • Personal site: mlech26l.github.io
  • Substack: mlechner.substack.com

Origins

Lechner is Austrian and studied at the Vienna University of Technology, completing a Bachelor of Science in computer science in 2016 and a Master of Science in 2017. The master's research, titled "Brain-inspired Neural Control," was conducted at the Cyber-Physical Systems Group within the Institute of Computer Engineering (E191) under Radu Grosu, Professor of Computer Engineering at TU Wien. The thesis opened the line of biologically inspired continuous-time neural-network architectures that became the basis for the broader liquid-network program, and was recognized with TU Wien's Distinguished Young Alumnus Award in January 2018.

The TU Wien years also produced Lechner's first peer-reviewed publication, "Worm-level Control through Search-based Reinforcement Learning," presented at the Deep Reinforcement Learning Symposium at NeurIPS 2017 with co-authors Ramin Hasani and Radu Grosu. The Grosu group drew direct inspiration from the C. elegans nematode nervous system, a 302-neuron biological reference that became a recurring motif of the liquid-network research line.

Career

In 2018 Lechner moved from TU Wien to the Institute of Science and Technology Austria (ISTA) for doctoral study, joining the research group of Thomas A. Henzinger, the Austrian-Swiss computer scientist and former founding president of ISTA. The Henzinger group works at the intersection of formal methods, computer-aided verification, and embedded systems, and Lechner's doctoral research extended these methods into the verification of neural-network controllers and learned representations. The doctoral period produced publications on quantized neural networks, neural ODE verification, scalable verification methods, and the broader research portfolio that anchored the verifiable-machine-learning thesis.

The ISTA years overlapped with the formation of the broader liquid-network research collaboration. The June 2020 Liquid Time-Constant Networks paper, written with Ramin Hasani, Alexander Amini, Daniela Rus, and Radu Grosu, established the LTC neural-network family that became the architectural basis for both the academic research line and the subsequent commercial spinout. The paper was accepted at AAAI-21. Lechner also co-led the F1-Tenth Autonomous Racing Grand Prix team that won the IFAC 2020 competition, an early application of liquid-network architectures to closed-loop robotic control.

He defended his doctoral dissertation, "Learning Verifiable Representations," at ISTA in May 2022. The thesis was recognized in 2023 by the ISTA Outstanding Scientific Achievement Award, jointly with Djordje Zikelic, for research on safety-proof methods for stochastic machine learning systems.

Following the May 2022 defense Lechner moved to Cambridge, Massachusetts, for a postdoctoral position at MIT CSAIL under Daniela Rus. The MIT postdoc consolidated the research collaboration that had begun during the TU Wien and ISTA years through the LTC paper and continued through the November 2022 Nature Machine Intelligence paper "Closed-form continuous-time neural networks" (Hasani, Lechner, Amini, Rus). The CfC paper demonstrated a closed-form approximation of the LTC integral that produced models running between one and five orders of magnitude faster than equivalent differential-equation-based architectures.

Liquid AI was incorporated in 2023 with Hasani as chief executive, Lechner as chief technology officer, Amini as co-founder, and Rus as senior co-founder and advisor. The company emerged from stealth in December 2023 and raised a $250 million Series A in December 2024 at a $2.35 billion valuation, led by AMD, with cumulative funding through April 2026 of approximately $297 million. In April 2026 Liquid AI announced a multi-year strategic partnership with Mercedes-Benz to embed LFMs into the MBUX infotainment system across third- and fourth-generation deployments, with first North American rollout scheduled for the second half of 2026.

Throughout the Liquid AI period Lechner has retained the MIT CSAIL research-affiliate appointment and has continued to publish on liquid networks, neural-network verification, dataset distillation, and stochastic-control methods at ICLR, NeurIPS, ICML, ICRA, AAAI, and Science Robotics.

Notable contributions

  • Liquid Time-Constant Networks. "Liquid Time-Constant Networks" (Hasani, Lechner, Amini, Rus, Grosu), first posted to arXiv in June 2020 and accepted at AAAI-21, introduced the LTC neural-network family. The architecture constructs networks of linear first-order dynamical systems modulated through nonlinear interlinked gates, with continuous-time dynamics that adapt internal time constants as a function of input. LTCs became the architectural basis for the broader liquid-network research line and for Liquid AI's commercial Liquid Foundation Models.
  • Closed-form continuous-time neural networks. The November 2022 Nature Machine Intelligence paper "Closed-form continuous-time neural networks" (Hasani, Lechner, Amini, Rus; Nature Machine Intelligence 4, 992–1003) demonstrated a closed-form approximation of the LTC integral, producing models that train and run between one and five orders of magnitude faster than equivalent differential-equation-based architectures.
  • Neural circuit policies. "Neural circuit policies enabling auditable autonomy" (Lechner, Hasani, Amini, Henzinger, Grosu, Rus; Nature Machine Intelligence 2, 642–652), published in October 2020, introduced the worm-inspired NCP architecture, which demonstrated end-to-end autonomous-driving control with orders of magnitude fewer neurons than comparable CNN-based approaches. NCPs became one of the most-cited applications of the liquid-network line.
  • Verifiable machine learning. Doctoral and postdoctoral research on neural-network verification, including "Scalable Verification of Quantized Neural Networks" (AAAI 2021), "How Many Bits Does it Take to Quantize Your Neural Network?" (TACAS 2020), and "On the Verification of Neural ODEs with Stochastic Guarantees" (AAAI 2021). The portfolio anchored the May 2022 doctoral dissertation "Learning Verifiable Representations."
  • Robust flight navigation with liquid neural networks. "Robust flight navigation out of distribution with liquid neural networks" (Vorbach, Hasani, Amini, Lechner, Rus, Science Robotics, April 2023), demonstrating that liquid neural networks could pilot a quadrotor through unfamiliar visual environments, including extreme weather and out-of-distribution scenes, with substantially fewer parameters than transformer-family alternatives.
  • TEDxMIT Salon talk. "Attacking Artificial Intelligence" (TEDxMIT Salon, May 2023), Lechner's public account of adversarial attacks on neural networks and the case for verifiable representations as a defense.
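The LTC dynamics described above can be sketched numerically. The following is an illustrative simplification, not the paper's implementation: it integrates the published LTC ordinary differential equation, dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, with a single shared tanh gate and an explicit Euler step. The function name `ltc_step` and all parameter shapes are assumptions chosen for this sketch; the original architecture uses a more structured synapse model and a fused ODE solver.

```python
import numpy as np

def ltc_step(x, I, W, b, tau, A, dt=0.01):
    """One explicit-Euler step of a simplified liquid time-constant (LTC) cell."""
    # Shared nonlinear gate f(x, I): it scales both the leak term and the
    # drive term, so each unit's effective time constant depends on the input
    # (the "liquid" property that gives the architecture its name).
    f = np.tanh(W @ np.concatenate([x, I]) + b)
    # LTC ODE: dx/dt = -(1/tau + f) * x + f * A
    dx = -(1.0 / tau + f) * x + f * A
    return x + dt * dx

rng = np.random.default_rng(0)
n_hidden, n_in = 4, 2
W = rng.standard_normal((n_hidden, n_hidden + n_in)) * 0.1  # gate weights
b = np.zeros(n_hidden)
tau = np.ones(n_hidden)  # base time constants
A = np.ones(n_hidden)    # bias (reversal) parameter

# Drive the cell with a simple sinusoidal input stream.
x = np.zeros(n_hidden)
for t in range(200):
    I = np.array([np.sin(0.1 * t), np.cos(0.1 * t)])
    x = ltc_step(x, I, W, b, tau, A)
```

Because the gate output is bounded by tanh, the decay rate 1/tau + f stays non-negative here, which keeps the hidden state bounded; in the trained models the adaptive decay is what lets the same cell respond on fast or slow timescales depending on the input.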

Investments and boards

The entries below are limited to AI, semiconductors, datacenters, software, and energy.

  • Liquid AI (AI): Co-founder and chief technology officer, 2023 to present. MIT CSAIL spinout developing Liquid Foundation Models. $250 million Series A in December 2024 at a $2.35 billion valuation led by AMD. Cumulative funding approximately $297 million through April 2026.

No other public investor activity on record in AI, semiconductors, datacenters, software, or energy as of May 2026.

Network

Lechner's longest-running professional relationships are with Radu Grosu, his TU Wien master's-thesis advisor and a co-author on the original LTC paper, and Thomas A. Henzinger, his ISTA doctoral advisor and the principal collaborator on the verifiable-machine-learning research line. The MIT-era collaboration with Daniela Rus, his postdoctoral supervisor and the senior co-founder of Liquid AI, bridges the academic and commercial periods. Ramin Hasani, a fellow Grosu-trained TU Wien researcher and now Liquid AI's chief executive officer, is the longest-running research collaborator: the Lechner-Hasani co-authorship dates to the 2017 NeurIPS workshop paper and runs through every major liquid-network publication. Alexander Amini, Rus's MIT doctoral student and a Liquid AI co-founder, is the third member of the founding team. AMD, the lead Series A investor, is the principal commercial relationship.

Position in the field

Lechner occupies a structurally distinctive position among insurgent AI lab co-founders through the combination of a focused architectural research thesis (liquid neural networks), the formal-verification credentials from the Henzinger Group at ISTA, the postdoctoral collaboration with Daniela Rus at MIT CSAIL, and the commercial trajectory of Liquid AI from 2023 spinout to a $2.35 billion Series A valuation in eighteen months. The combination of the architectural-research line and the verifiable-machine-learning research line is unusual: most insurgent-AI co-founders have either a model-architecture or a safety-and-verification background, and the dual profile distinguishes Lechner from the broader cohort.

Industry coverage has frequently characterized Lechner as the technical co-architect of the liquid-network thesis alongside Hasani, with the CTO role at Liquid AI consolidating research-and-engineering responsibility for the LFM model line. The founding team follows a conventional split: Hasani as CEO, Lechner as CTO, Amini as the autonomous-systems and product co-founder, and Rus as the senior academic advisor.

The architectural thesis remains contested in the broader research community on the question of whether liquid-network architectures scale to frontier-tier capability or remain confined to smaller-parameter and edge-deployment market segments. The Liquid AI commercial trajectory through 2026 and 2027, the Mercedes-Benz rollout in the second half of 2026, and the LFM successor model releases will provide the principal evidence on the architectural-scaling question, and the engineering execution falls primarily to the CTO role.

Outlook

Open questions over the next 6 to 18 months:

  • Liquid AI engineering execution. Whether the Mercedes-Benz MBUX rollout in the second half of 2026 and any subsequent automotive or industrial partnerships produce the engineering validation that converts the architectural thesis into a durable edge-AI business under Lechner's CTO leadership.
  • LFM successor models. Whether the next-generation Liquid Foundation Model line scales the architectural-efficiency advantage toward larger parameter classes that compete more directly with transformer-family flagship models.
  • Verifiable AI integration. Whether the verifiable-machine-learning research line that anchored the ISTA dissertation produces commercial features in the LFM line, particularly for safety-critical edge-AI deployment segments where formal-verification methods have direct product value.
  • Public-research cadence. Whether Lechner continues the conference and publication cadence of the postdoctoral and Liquid AI period, or whether the CTO responsibilities reduce the public-research bandwidth.
  • Architectural-thesis validation. Whether the broader research community converges on the liquid-network efficiency thesis or whether competitive transformer-family efficient-inference research closes the architectural gap.


About the author

Nextomoro tracks progress for AI research labs, models, and what's next.
