AI search is already doing real work.
People are using ChatGPT, Google Gemini, Claude, and Perplexity to decide what to buy, who to hire, and which products are worth their time. Not occasionally, but repeatedly. ChatGPT alone is approaching a billion weekly users, and AI tools broadly are already used by well over a billion people. The exact numbers will move, but the direction is settled. This is already a primary layer of discovery.
What’s missing is visibility. Brands are being described, compared, and recommended inside systems they cannot inspect. There is no native reporting layer, no reliable attribution, and no clean way to trace how a given input produces a given output. From the outside, the system looks coherent. From the inside, it is still opaque.
That should feel familiar.
In the early 2000s, search behaved the same way. Rankings mattered, but they were unstable and difficult to explain. Operators worked from fragments, testing changes, watching what moved, and building models of the system that were only partially correct. It worked well enough to build businesses, but not well enough to scale with confidence. Search only stabilized once Google exposed part of its internal view. Not everything, but enough: indexing, queries, and technical issues. Cause and effect became visible, even if the system itself remained complex. That visibility created alignment.
AI search does not have that native layer yet.
What exists instead is a growing set of tools attempting to reconstruct it from the outside. Ahrefs, SEMrush, and Profound all provide useful signals. They allow you to run prompts, observe outputs, and identify patterns. But they are still observational. You are measuring the surface and inferring the system beneath it.
We took a different approach.
oakpool.ai was built to make this environment operable. Not perfectly transparent, but sufficiently legible to act with confidence. We have built a visibility layer that tracks how brands appear across AI platforms, how they are described, which sources are consistently pulled into those answers, and how that representation changes over time.
The distinction is important. Most teams are still asking whether they show up. We are looking at how they are framed, why that framing occurs, and what needs to change to influence it.
That work does not live in a dashboard alone. It connects directly to execution: content structure, authority signals, PR, technical foundations. The system may be partially opaque, but the levers are not.
This is where the opportunity sits.
The companies that win this phase will not be the ones waiting for a clean, first-party analytics layer to arrive. They will be the ones who learn how to operate before it does. This is how every major channel has developed. Search did not wait for Search Console. Social did not wait for attribution models. Programmatic did not wait for perfect tracking.
AI search will follow the same path.
A first-party visibility layer will emerge. It always does. When it does, the system will become easier to measure and easier to explain. By then, the advantage will be harder to capture.
Right now, it is still available.