For two decades, the dominant strategy of the internet was simple: collect more behavior than your competitors, correlate it harder, and turn those correlations into recommendations.
Amazon learned that if you bought X, you might buy Y, because it had seen millions of people buy those pairings. Google refined search results by watching what people typed, clicked, and typed next. TikTok mastered attention by observing what held your eyes for more than a second. These systems grew powerful because correlation scaled with user count.
But as Benedict Evans recently argued, these systems never understood why any of this happened. They behaved like dogs who know the jingle of keys means “walk,” without understanding what a key is. Understanding wasn’t part of the stack. It was an emergent property of enormous scale. And it came with a brutal cold-start problem: how do you recommend anything before you have users, and how do you get users before you can recommend anything?
LLMs break this open. They introduce something qualitatively different: the capacity to interpret content, context, and latent intent. Amazon’s correlation engine knows packing tape correlates with bubble wrap. An LLM might infer that you are probably moving and surface home insurance or broadband. More importantly, none of this requires your own massive user base.
Evans puts it plainly. You might not need to build your own Mechanical Turk anymore. If understanding generalizes enough, it becomes something you can call through an API. You can rent the cold start.
Evans frames this as a new turn on discovery and filtering. I think the implications run deeper. This shift challenges the foundations of product strategy, organizational design, and the governance assumptions underlying digital life. What we are seeing is the Digestion Gap at a different altitude: not too much information but too much understanding, arriving faster than institutions can absorb.

When Understanding Becomes a Commodity
If anyone can rent advanced understanding, being “smarter” stops being a moat. The competitive battlefield moves somewhere else. Execution speed becomes strategy. Customer intimacy becomes differentiation. Unique assets such as proprietary relationships, earned trust, and domain-specific data become leverage. Value creation shifts from insight to application.
A simple example makes this concrete. Compare a hedge fund in 2010 with one in 2030. In 2010, alpha came from better models, better data, faster signals. In 2030, the models are commodities, the data is abundant, and the signals are accessible to anyone with an API key. The differentiator is not who sees the world first but who acts on insight fastest and with the least friction.
This is what I mean by organizational metabolism. It is not a fuzzy metaphor for adaptability. It is a measurable property: how quickly an institution can take external understanding and convert it into changed behavior. How many layers a decision passes through. How long it takes for insight to become action. How much fidelity is lost along the way.
When intelligence becomes a utility, metabolism becomes the constraint.
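To make the metric concrete, here is a toy sketch, in Python, of how the three components named above (layers, time-to-action, fidelity) might combine into a single score. Everything here is hypothetical: the `Decision` record, the `metabolism_score` formula, and the numbers are illustrations of the idea, not a standard measure.

```python
from dataclasses import dataclass

@dataclass
class Decision:
    """One insight traced from arrival to changed behavior (hypothetical record)."""
    layers_traversed: int   # approval layers the decision passed through
    hours_to_action: float  # elapsed time from insight to action
    fidelity: float         # fraction of the original insight preserved, 0..1

def metabolism_score(decisions: list[Decision]) -> float:
    """Toy metric: average fidelity delivered per unit of organizational friction.

    Friction grows with both layers and elapsed time, so insights that
    arrive intact, fast, and through few layers score high. Illustrative
    formula only.
    """
    if not decisions:
        return 0.0
    total = 0.0
    for d in decisions:
        friction = max(1.0, d.layers_traversed * d.hours_to_action)
        total += d.fidelity / friction
    return total / len(decisions)

# Same insight, two organizations: a lean team acts in hours through one
# layer; a layered one takes a week through five, losing half the signal.
lean = [Decision(layers_traversed=1, hours_to_action=2.0, fidelity=0.9)]
layered = [Decision(layers_traversed=5, hours_to_action=40.0, fidelity=0.5)]
```

The point of the sketch is only that metabolism is auditable: each component is something an organization could actually instrument, and the gap between the two scores is the gap the essay is describing.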

The Rise of Personal Agents
Evans makes another crucial observation. Every major platform sees only a slice of you. Amazon sees purchases. TikTok sees attention. Google sees questions. Your phone sees more but reasons less.
Personal agents will see everything you permit and start stitching it together. That isn’t just more context. It is a different type of context altogether: continuous, cross-platform, and memory-rich. An agent that books your travel, manages your calendar, and reads your messages begins to understand not only your preferences but your patterns. Your rhythms, your relationships, your evolving priorities.
We are early in this shift. Today’s agents are uneven and prone to error. But the direction is clear. Agents that maintain long-term memory, gather context autonomously, and coordinate across specialized sub-agents will change how people interact with digital systems. For users who opt in, the cold start problem shrinks dramatically and eventually disappears.
This creates its own Digestion Gap. Employees augmented by agents will generate more context, more insight, and more potential actions than leaders or systems can realistically absorb. The constraint moves from knowing to deciding.

Governance When Models Infer What You Never Disclosed
This is where the real tension emerges.
When a system recommends products based on what you bought, there is a kind of honesty to it. You gave it the inputs. It gave you the outputs. But when models can infer what you never disclosed, the social contract breaks. Health conditions predicted from purchases. Financial stress inferred from browsing habits. Political leanings derived from reading patterns.
Users lose control because they never intentionally shared the information being acted upon. Inferences feel like surveillance even when they are just machine pattern recognition. Organizations can no longer credibly claim they only use data customers provided. Regulators increasingly treat inferences as sensitive data in their own right. And the entire notice-and-consent paradigm collapses because there is no way to consent to inferences you cannot anticipate.
This is not a technical issue. It is a legitimacy issue. And it will demand new governance frameworks that answer questions most organizations are not yet prepared to confront.
- What are organizations permitted to infer?
- What must they never infer, even if the system technically could?
- What rights do people have over insights derived from their behavior?
- What is the appeal process when an inference is wrong and consequential?
Inference governance is on track to become the defining ethical challenge of the next decade. We are behind the curve.

The Question of Our Time
Evans ends his piece with a sense of humility. We know this is big, he writes, but we do not know how any of it will work. I would add one thing. We know where the real pressure will land.
Understanding is no longer scarce. Absorption is. The organizations that will win in this era are not the ones with the best models or the most data but the ones designed to metabolize intelligence quickly, cleanly, and responsibly. Culturally, architecturally, strategically, and ethically.
We are no longer competing on knowledge.
We are competing on uptake.
That is the game now.
Madam I’m Adam