Ramin Hasani
Ramin Hasani is a Persian-Austrian computer scientist and AI researcher. He is the co-founder and chief executive officer of Liquid AI, the 2023 MIT CSAIL spinout building Liquid Foundation Models on the liquid-neural-network architecture developed in his research line. He is the lead author of the Liquid Time-Constant Networks paper, the principal architect of the broader liquid-neural-network research thesis, and a research affiliate at MIT CSAIL.
At a glance
- Education: BSc in electrical engineering (electronics), Ferdowsi University of Mashhad, 2007 to 2012; MSc in electronic engineering, Politecnico di Milano, 2012 to 2015; PhD in computer science (with distinction), Vienna University of Technology (TU Wien), May 2020. Doctoral dissertation: "Interpretable recurrent neural networks in continuous-time control environments," advised by Radu Grosu.
- Current roles: Co-founder and chief executive officer at Liquid AI (since March 2023); machine-learning research affiliate at MIT CSAIL.
- Key contributions: Liquid time-constant networks; closed-form continuous-time neural networks; the broader liquid-neural-network research line; Liquid AI's commercial Liquid Foundation Models.
- Awards: TÜV Austria Dissertation Award nomination (2020); HPC Innovation Excellence Award (2022); TED Fellow.
- X: @ramin_m_h
- LinkedIn: Ramin Hasani
- Personal site: raminhasani.com
Origins
Hasani was born in Iran and completed his secondary education there. In 2007 he enrolled at Ferdowsi University of Mashhad, one of the older Iranian research universities, where he completed a Bachelor of Science in electrical engineering with a focus on electronics in 2012. The undergraduate program established the foundation in circuit theory, control systems, and signal processing that subsequently informed his graduate research on continuous-time neural-network architectures.
In 2012 he moved to Italy for graduate study, completing a Master of Science in electronic engineering at Politecnico di Milano in December 2015. The Milan years marked his transition from electronics-and-control engineering toward computational modeling of biological neural systems, the research line that became the basis for his subsequent doctoral work.
In early 2016 Hasani moved to Vienna for doctoral study at the Vienna University of Technology under Radu Grosu, Professor of Computer Engineering at TU Wien. The Grosu research group at the Institute of Computer Engineering (E191) focuses on cyber-physical systems, model checking, and computational modeling of neural systems. The doctoral research line that Hasani pursued during the TU Wien years drew direct inspiration from the C. elegans nematode nervous system, a 302-neuron biological reference that became a central motif of the liquid-network research thesis. He defended his doctoral dissertation, "Interpretable recurrent neural networks in continuous-time control environments," at TU Wien on May 5, 2020; the thesis was nominated for a TÜV Austria Dissertation Award later that year.
Career
Following the TU Wien defense in May 2020, Hasani moved to Cambridge, Massachusetts, to take up a postdoctoral position at MIT CSAIL under Daniela Rus, the lab's director and the senior collaborator on the liquid-network research line. The MIT postdoc consolidated a collaboration that had begun during the TU Wien years with co-authorship on the original Liquid Time-Constant Networks paper, alongside Mathias Lechner (a fellow Grosu-trained TU Wien researcher who later completed his PhD at ISTA) and Alexander Amini (a CSAIL doctoral student under Rus).
During the postdoctoral period from 2020 to 2023, Hasani held a concurrent appointment as Principal AI and Machine Learning Scientist at the Vanguard Group, the asset-management firm headquartered in Pennsylvania. The joint MIT-Vanguard appointment gave him commercial-research exposure to time-series and decision-making applications of liquid neural networks in financial markets, an early validation of the architectural thesis on continuous-input domains.
The MIT and Vanguard period produced the principal academic publications on the liquid-network research line, including the AAAI-21 Liquid Time-Constant Networks paper, the November 2022 Nature Machine Intelligence paper "Closed-form continuous-time neural networks," and the broader research portfolio that covers neural ordinary differential equations, continuous-time dynamics, and interpretable neural network architectures.
Liquid AI was incorporated on March 30, 2023, with Hasani as chief executive, Mathias Lechner and Alexander Amini as co-founders, and Daniela Rus as the senior co-founder and advisor. The company emerged from stealth in December 2023 with seed funding and the public introduction of the Liquid Foundation Model line. It raised a $250 million Series A in December 2024 at a $2.35 billion valuation, led by AMD, bringing cumulative funding to approximately $297 million through April 2026. In April 2026 the company announced a multi-year strategic partnership with Mercedes-Benz to embed LFMs into the MBUX infotainment system across third- and fourth-generation deployments, with the first North American rollout scheduled for the second half of 2026.
Throughout the Liquid AI period Hasani has retained the MIT CSAIL research-affiliate appointment and has continued public engagement on the liquid-network research thesis through TEDx talks, MIT lectures, podcast appearances, and conference keynotes.
Affiliations
- Vienna University of Technology (TU Wien): Doctoral researcher, Institute of Computer Engineering (E191), 2016 to 2020
- MIT CSAIL: Postdoctoral associate, 2020 to 2023; Research affiliate, 2023 to present
- Vanguard Group: Principal AI and Machine Learning Scientist (concurrent), 2020 to 2023
- Liquid AI: Co-founder and chief executive officer, March 2023 to present
Notable contributions
- Liquid Time-Constant Networks. The foundational paper "Liquid Time-Constant Networks" (Hasani, Lechner, Amini, Rus, Grosu), first posted to arXiv in June 2020 and accepted at AAAI-21, introduced the LTC neural-network family. The architecture constructs networks of linear first-order dynamical systems modulated through nonlinear interlinked gates, with continuous-time dynamics whose internal time constants adapt as a function of the input. LTCs became the architectural basis for the broader liquid-network research line and for Liquid AI's commercial Liquid Foundation Models. A minimal sketch of the LTC dynamics appears after this list.
- Closed-form continuous-time neural networks. The 2022 Nature Machine Intelligence paper "Closed-form continuous-time neural networks" (Hasani et al., Nature Machine Intelligence 4, 992 to 1003) demonstrates a closed-form approximation of the LTC integral, producing models that train and run between one and five orders of magnitude faster than equivalent differential-equation-based architectures. The CfC paper extended the practical reach of the liquid-network thesis from research demonstrations toward production-scale deployments. A sketch of the closed-form update also appears after this list.
- Liquid neural networks research line. The broader research portfolio including neural circuit policies, ordinary neural circuits, Liquid Structural State-Space Models, and the C. elegans-inspired computational-neuroscience research that motivated the architectural family. The portfolio anchors Hasani's reputation as the principal architect of the liquid-network research thesis.
- Liquid Foundation Models. The September 2024 LFM model family with 1.3 billion, 3.1 billion, and 40.3 billion parameter variants, the first commercial-scale liquid-network models, distributed in part as open weights on Hugging Face. The April 2026 Mercedes-Benz partnership is the most prominent commercial reference for LFM deployment.
- TEDx talks. Hasani has delivered multiple TEDx talks on the liquid-network thesis, including "A Journey inside a Neural Network" at TEDxCluj (2019), "Liquid Neural Networks" at TEDxMIT (2023), "Generalist Artificial Intelligence Not Yet AGI" at TEDxBoston (2023), and "Adaptable Aviators" at TEDxMIT Salon (2023). The TEDxMIT 2023 talk is the canonical public account of the liquid-network architectural thesis.
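The two sketches below illustrate the mechanics behind the first two contributions. First, a minimal NumPy sketch of a single LTC update, assuming the tanh gate and the fused semi-implicit Euler step described in the AAAI-21 paper; the parameter names, shapes, and toy input signal are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ltc_step(x, I, dt, tau, A, Wx, WI, b):
    """One fused semi-implicit Euler step of a liquid time-constant (LTC) cell.

    Integrates the LTC state equation
        dx/dt = -[1/tau + f(x, I)] * x + f(x, I) * A
    where f is a small tanh head. Because f depends on the input, the
    effective time constant tau_sys = tau / (1 + tau * f(x, I)) varies
    with the data -- the "liquid" behavior the paper describes.
    """
    f = np.tanh(Wx @ x + WI @ I + b)          # input-dependent gate f(x, I)
    # Fused step: the decay term is treated implicitly, the drive explicitly.
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))

# Toy usage: 8 hidden units driven by a 3-channel sinusoidal input.
rng = np.random.default_rng(0)
x = np.zeros(8)
tau, A = np.ones(8), rng.normal(size=8)
Wx = 0.1 * rng.normal(size=(8, 8))
WI = 0.1 * rng.normal(size=(8, 3))
b = np.zeros(8)
for t in range(100):
    I_t = np.sin(np.array([0.1, 0.2, 0.3]) * t)
    x = ltc_step(x, I_t, dt=0.1, tau=tau, A=A, Wx=Wx, WI=WI, b=b)
```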
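Second, a hedged sketch of the CfC idea from the Nature Machine Intelligence paper: the numerical ODE solver is replaced by a gated closed-form expression, so the state at any elapsed time is computed in a single evaluation. The head functions below are illustrative stand-ins for the learned networks in the paper.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cfc_state(x, I, t_elapsed, head_f, head_g, head_h):
    """Closed-form continuous-time (CfC) state, sketched.

    Computes the gated closed-form approximation of the LTC solution,
        x(t) = sigma(-f * t) * g + (1 - sigma(-f * t)) * h,
    with f, g, h small learned heads of the state and input. No ODE
    solver is stepped: the state at elapsed time t comes out in one
    evaluation, which is the source of the reported speedups.
    """
    z = np.concatenate([x, I])
    f, g, h = head_f(z), head_g(z), head_h(z)
    gate = sigmoid(-f * t_elapsed)            # time-dependent mixing gate
    return gate * g + (1.0 - gate) * h

# Toy heads: random linear/tanh layers standing in for the learned MLPs.
rng = np.random.default_rng(1)
Wf = 0.1 * rng.normal(size=(8, 11))
Wg = 0.1 * rng.normal(size=(8, 11))
Wh = 0.1 * rng.normal(size=(8, 11))
x, I = np.zeros(8), np.ones(3)
x = cfc_state(x, I, t_elapsed=0.5,
              head_f=lambda z: Wf @ z,
              head_g=lambda z: np.tanh(Wg @ z),
              head_h=lambda z: np.tanh(Wh @ z))
```

Because no solver is iterated between observations, irregularly sampled sequences can be handled by passing the actual time gap as t_elapsed, one of the practical properties the continuous-time literature emphasizes.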
Investments and boards
The entries below are limited to AI, semiconductors, datacenters, software, and energy.
- Liquid AI (AI): Co-founder and chief executive officer, March 2023 to present. MIT CSAIL spinout developing Liquid Foundation Models. $250 million Series A in December 2024 at a $2.35 billion valuation led by AMD. Cumulative funding approximately $297 million through April 2026.
No other public investor activity on record in AI, semiconductors, datacenters, software, or energy as of May 2026.
Network
Hasani's principal long-running professional relationship is with Radu Grosu, his TU Wien doctoral advisor and a co-author on the original Liquid Time-Constant Networks paper and subsequent liquid-network research. The MIT-era collaboration with Daniela Rus, his postdoctoral supervisor and the senior co-founder of Liquid AI, is the second principal relationship and the academic-and-commercial throughline of his career. Mathias Lechner, a fellow Grosu-trained TU Wien researcher who completed his PhD at ISTA and held a postdoctoral position at MIT CSAIL before joining Liquid AI as chief technology officer, and Alexander Amini, Rus's MIT doctoral student, are his Liquid AI co-founders and longest-running research collaborators. The broader Liquid AI research-and-engineering team, recruited primarily from MIT CSAIL, the wider Boston-area academic network, and AI-research labs more generally, forms the immediate professional network. AMD, the lead Series A investor, anchors the principal commercial-investor relationship through its senior leadership.
Position in the field
Hasani occupies a structurally distinctive position among insurgent AI lab founders through the combination of a well-defined architectural research thesis (liquid neural networks), academic credentials from TU Wien and MIT CSAIL, the senior co-founder relationship with Daniela Rus, and the commercial trajectory of Liquid AI from 2023 spinout to a $2.35 billion Series A valuation in under two years. The architectural thesis, that liquid-network architectures with continuous-time dynamics produce a parameter-efficiency advantage over transformer-family models on continuous-input and time-series domains, is a research bet that distinguishes Liquid AI from the parameter-scaling consensus that has dominated frontier AI through 2025.
Industry coverage has frequently characterized Hasani as the public face of the liquid-network architectural thesis, through the IEEE Spectrum coverage "'Liquid' Neural Network Adapts on the Go," the November 2022 MIT News coverage "Solving brain dynamics gives rise to flexible machine-learning models," the Mercedes-Benz commercial validation of the on-device deployment thesis, and the TEDx talk circuit. Among the broader cohort of insurgent-AI founders who paired an academic research thesis with a commercial spinout, Hasani's closest comparators are Aidan Gomez at Cohere, the transformer co-author who founded the enterprise AI lab on the basis of the original attention research, and Arthur Mensch at Mistral AI, the former DeepMind researcher whose academic credentials anchored the European AI champion. Among MIT-spinout founders specifically, the Liquid AI cohort sits alongside Joshua Tenenbaum's research group and other CSAIL spinout activity, with Daniela Rus's combined role of academic leader and commercial co-founder as the most prominent operating template.
The architectural thesis remains contested in the broader research community on the question of whether liquid-network architectures scale to frontier-tier capability or remain confined to smaller-parameter and edge-deployment market segments. The Liquid AI commercial trajectory through 2026 and 2027, the Mercedes-Benz rollout in the second half of 2026, and the LFM successor model releases will provide the principal evidence on the architectural-scaling question.
Outlook
Open questions over the next 6 to 18 months:
- Liquid AI commercial scaling. Whether the Mercedes-Benz MBUX rollout in the second half of 2026 and any subsequent automotive, industrial, or consumer-electronics partnerships produce the commercial validation that converts the architectural thesis into a durable edge-AI commercial business.
- LFM successor models. Whether the next-generation Liquid Foundation Model line scales the architectural-efficiency advantage toward larger parameter classes that compete more directly with transformer-family flagship models from frontier labs.
- Series B financing. Whether Liquid AI raises a follow-on round in 2026 or 2027 and at what valuation, given the December 2024 Series A at $2.35 billion and the cumulative-funding total of approximately $297 million.
- Public-engagement role. Whether Hasani continues the TEDx talks, podcast circuit, and conference keynote cadence that has anchored the public-facing role through 2024 and 2025, or whether the chief-executive responsibilities reduce the public-engagement bandwidth.
- Architectural-thesis validation. Whether the broader research community converges on the liquid-network architectural-efficiency thesis or whether competitive transformer-family efficient-inference research (including the Phi family from Microsoft, the Gemma family from Google DeepMind, and the Ministral series from Mistral AI) closes the architectural gap.
Sources
- Ramin Hasani official website. Personal site with research-area summaries, publication list, and TEDx-talk archive.
- Ramin Hasani | Liquid AI. Liquid AI team page with current title and biographical summary.
- Ramin Hasani | MIT CSAIL. MIT CSAIL profile page.
- Interpretable recurrent neural networks in continuous-time control environments. TU Wien doctoral-dissertation record (May 2020), advised by Radu Grosu.
- Liquid Time-Constant Networks. The foundational LTC paper (Hasani, Lechner, Amini, Rus, Grosu), arXiv 2006.04439, accepted to AAAI-21.
- Closed-form continuous-time neural networks. 2022 Nature Machine Intelligence paper on the closed-form approximation of LTCs.
- "Liquid" machine-learning system adapts to changing conditions. January 2021 MIT News coverage of the liquid-network research line.
- Solving brain dynamics gives rise to flexible machine-learning models. November 2022 MIT News coverage of the closed-form continuous-time research.
- "Liquid" Neural Network Adapts on the Go. IEEE Spectrum coverage of the liquid-network architectural thesis.
- Liquid Neural Networks | Ramin Hasani | TEDxMIT. January 2023 TEDxMIT talk; the canonical public account of the liquid-network thesis.
- Liquid AI: We raised $250M to scale capable and efficient general-purpose AI. December 2024 Liquid AI Series A announcement.
- Mercedes-Benz and Liquid AI Partner to Scale Embedded In-Car Intelligence in North America. April 2026 partnership announcement.
- Photo: Liquid AI team page, Liquid AI press portrait.