The Technology

The Large Interaction Model (LIM) was born from $35 million in DARPA-funded research led by Ben Vigoda — the scientist who invented the tensor processing unit (TPU) before it became synonymous with Google's AI infrastructure. The question behind the research: what if AI could learn from what people _do_, not just what they type?

How It Works

Language models read the internet and try to predict the next word. The LIM watches behavior — scrolls, taps, skips, linger time — and learns what each person actually wants.
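To make the idea concrete, here is a minimal sketch of folding behavioral signals into per-item preference scores. The event names and weights are invented for illustration; the LIM's actual event taxonomy and learning algorithm are not described here.

```python
from dataclasses import dataclass

# Hypothetical signal weights -- illustrative only, not the LIM's real model.
SIGNAL_WEIGHTS = {
    "tap": 1.0,          # explicit interest
    "linger": 0.5,       # dwell time above some threshold
    "scroll_past": -0.2, # weak negative signal
    "skip": -1.0,        # explicit disinterest
}

@dataclass
class Interaction:
    item_id: str
    signal: str

def update_preferences(prefs: dict, events: list) -> dict:
    """Fold a stream of interactions into running per-item scores."""
    for e in events:
        prefs[e.item_id] = prefs.get(e.item_id, 0.0) + SIGNAL_WEIGHTS[e.signal]
    return prefs

prefs = update_preferences({}, [
    Interaction("sneaker-42", "tap"),
    Interaction("sneaker-42", "linger"),
    Interaction("jacket-7", "skip"),
])
# sneaker-42 accumulates 1.5, jacket-7 accumulates -1.0
```

The point of the sketch is the input, not the math: every scroll and skip is a training signal, with no typed query required.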

It requires far less data than an LLM, improves with every interaction without a full retraining pass, and runs at roughly one five-hundredth of an LLM's energy cost at inference. It never hallucinates, because it only surfaces answers it actually has.

This is not an incremental improvement on existing AI. It is a fundamentally different approach to machine learning — one built for a world where personalization, efficiency, and privacy actually matter.

The Evidence

This isn't a whiteboard concept. The LIM has been in production for years.

Product Genius — founded by Ben Vigoda and built on LIM technology — serves over 15 million monthly shoppers across hundreds of ecommerce businesses, with more than 470 million shopping interactions processed to date. In controlled A/B tests, stores using the LIM see measurable revenue lift from personalization that adapts to each individual shopper in real time.

> Don't take our word for it.

Beyond Ecommerce

Ecommerce was the first vertical. The same engine now powers movie discovery, cybersecurity intelligence, game recommendations, and sports highlights — all live on this site. The LIM is domain-agnostic. That's what makes it infrastructure, not just a product.

Try it yourself →

API Reference

Ready to build? Browse the interactive Gray Whale API reference for LIM deployment, feed management, and WebSocket integration.
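As a rough sketch of what pushing a feed event might look like, here is a hypothetical interaction payload. Every field name below is invented for illustration; the real schema, authentication, and WebSocket endpoints are defined in the Gray Whale API reference.

```python
import json

# Hypothetical event payload -- field names are placeholders, not the real schema.
event = {
    "type": "interaction",
    "item_id": "movie-1138",
    "signal": "tap",
    "timestamp": 1717171717,
}
message = json.dumps(event)

# A WebSocket client would send `message` over an open connection, e.g.:
#   ws.send(message)  # pseudocode; see the API docs for the actual client flow
```

The docs cover the pieces this sketch glosses over: opening the socket, authenticating, and managing feeds.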

View API Docs →