CAMBRIDGE, Mass., Nov. 13, 2025 (GLOBE NEWSWIRE) — Liquid AI today announced a multi-faceted partnership with Shopify to license and deploy Liquid AI's flagship Liquid Foundation Models (LFMs) across quality-sensitive workflows on Shopify's platform, including search and other multimodal use cases where quality and latency matter. The first production deployment is a sub-20ms text model that enhances search. The agreement follows Shopify's participation in Liquid AI's $250 million Series A round in December 2024 and formalizes deep co-development already underway between the companies.
As part of the partnership, Shopify and Liquid AI have co-developed a generative recommender system built on a novel HSTU architecture. In controlled testing, the model outperformed the prior recommendation stack, producing higher conversion rates from recommendations.
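For context, a generative recommender reframes recommendation as next-item prediction over a shopper's interaction history. The sketch below is a minimal, hypothetical illustration of that framing in Python/PyTorch, not the co-developed model: a standard Transformer encoder stands in for the HSTU blocks, and every dimension, layer count, and catalog size is an assumption chosen for brevity.

```python
# Minimal sketch of a generative (next-item) recommender.
# A standard Transformer encoder stands in for the HSTU blocks mentioned
# above; all sizes are illustrative, not production values.
import torch
import torch.nn as nn

class NextItemRecommender(nn.Module):
    def __init__(self, num_items: int, d_model: int = 64, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, d_model)
        self.pos_emb = nn.Embedding(512, d_model)          # max session length assumed
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, num_items)          # scores every catalog item

    def forward(self, item_ids: torch.Tensor) -> torch.Tensor:
        seq_len = item_ids.size(1)
        pos = torch.arange(seq_len, device=item_ids.device)
        x = self.item_emb(item_ids) + self.pos_emb(pos)
        mask = nn.Transformer.generate_square_subsequent_mask(seq_len).to(item_ids.device)
        h = self.encoder(x, mask=mask)                      # causal: each step sees only the past
        return self.head(h[:, -1])                          # logits for the next item

# Toy usage: score the next item for 8 shopper sessions of length 20.
model = NextItemRecommender(num_items=10_000)
sessions = torch.randint(0, 10_000, (8, 20))
next_item_logits = model(sessions)                          # shape: (8, 10000)
```

In production, a system like this would be trained on real session data and served under the latency budgets described below.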
Ramin Hasani, Liquid AI CEO:
“Recommendation is the backbone of decision-making in finance, healthcare, and e-commerce. To be useful in the real world, models must be reliable, efficient, and fast. Shopify has been an ideal partner to validate that at scale. We're excited to bring Liquid Foundation Models to millions of shoppers and merchants and to show how efficient ML translates into measurable value in everyday experiences.”
Liquid's LFMs are designed for sub-20 millisecond, multimodal, quality-preserving inference. On specific production-like tasks, LFMs with ~50% fewer parameters have outperformed popular open-source models such as Qwen3, Gemma 3, and Llama 3, while delivering 2-10× faster inference, enabling real-time shopping experiences at platform scale.
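As a rough illustration of how a sub-20 ms budget might be verified in practice, the snippet below times an arbitrary inference callable and reports median and tail latency. The stand-in predictor, request count, and 20 ms threshold are assumptions tied to the figures above, not a published benchmark harness.

```python
# Minimal latency check, assuming a Python inference callable `predict`.
# The 20 ms budget mirrors the figure cited above; the dummy predictor is a
# stand-in, not an LFM.
import time
import statistics

def measure_latency_ms(predict, requests, warmup: int = 10) -> dict:
    for r in requests[:warmup]:                 # warm caches before timing
        predict(r)
    samples = []
    for r in requests:
        start = time.perf_counter()
        predict(r)
        samples.append((time.perf_counter() - start) * 1000.0)
    return {
        "p50_ms": statistics.median(samples),
        "p95_ms": statistics.quantiles(samples, n=100)[94],
        "p99_ms": statistics.quantiles(samples, n=100)[98],
    }

# Toy usage with a stand-in predictor and 1,000 identical requests.
stats = measure_latency_ms(lambda q: sum(ord(c) for c in q), ["query"] * 1000)
print(stats, "within budget:", stats["p99_ms"] < 20.0)
```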
Mikhail Parakhin, Shopify CTO:
“I've seen a lot of models. No one else is delivering sub-20ms inference on real workloads like this. Liquid's architecture is efficient without sacrificing quality; in some use cases, a model with ~50% fewer parameters beats Alibaba's Qwen and Google's Gemma and still runs 2-10× faster. That's what it takes to power interactive commerce at scale.”
Mathias Lechner, Liquid AI CTO:
“We design Liquid Foundation Models with an intertwined objective function that maximizes quality while making the system the fastest on the market on the hardware of choice. This makes them a natural fit for e-commerce applications such as personalized ranking, retrieval-augmented generation, and session-aware recommendations, all under the tight latency and cost budgets required to deliver the best user experience. In Shopify's environment, we've focused on production robustness, from low-variance tail latency to safety and drift monitoring.”
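The quote above mentions drift monitoring. As one common, generic approach (not a description of Liquid AI's or Shopify's tooling), the sketch below computes a population stability index (PSI) between a reference score distribution and live traffic; the bin count and the 0.2 alert threshold are rule-of-thumb assumptions.

```python
# Illustrative drift check using the population stability index (PSI).
# Generic technique, not Liquid AI's or Shopify's monitoring stack.
import numpy as np

def population_stability_index(reference, live, bins: int = 10) -> float:
    lo = min(reference.min(), live.min())
    hi = max(reference.max(), live.max())
    edges = np.linspace(lo, hi, bins + 1)                      # shared bin edges
    ref_frac = np.histogram(reference, edges)[0] / len(reference)
    live_frac = np.histogram(live, edges)[0] / len(live)
    ref_frac = np.clip(ref_frac, 1e-6, None)                   # avoid log(0)
    live_frac = np.clip(live_frac, 1e-6, None)
    return float(np.sum((live_frac - ref_frac) * np.log(live_frac / ref_frac)))

# Toy usage: compare yesterday's relevance scores with today's.
rng = np.random.default_rng(0)
yesterday = rng.normal(0.60, 0.10, 50_000)   # reference distribution
today = rng.normal(0.55, 0.12, 50_000)       # live distribution
psi = population_stability_index(yesterday, today)
print(f"PSI={psi:.3f}", "drift alert" if psi > 0.2 else "stable")
```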
The partnership includes a multi-purpose license for LFMs across low-latency, quality-sensitive Shopify workloads, ongoing R&D collaboration, and a shared roadmap. While today's deployment is a sub-20ms text model for search, the companies are evaluating multimodal models for additional products and use cases, including customer profiles, agents, and product classification. Financial terms were not disclosed.
About Liquid AI
Liquid AI builds Liquid Foundation Models (LFMs): multimodal, efficient models engineered for real-time, reliability-critical applications. Founded by researchers behind liquid neural networks, Liquid AI focuses on low-latency inference, resource efficiency, and production-grade safety.
Press Contact
Rachel Gordon
rg@liquid.ai
A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/1a6bcd9c-934a-4db5-9dcd-f64d86f82fe6
