The visitor who arrived pre-sold
Two visitors land on your site this week. One found you through organic search. The other typed their frustration into ChatGPT, got enough of an answer to know what they needed next, and followed a recommendation to you. By the time they clicked through, the decision was mostly made.
That gap, between being found and being chosen, is what generative engine optimisation is really about. AI tools like ChatGPT and Perplexity work differently from Google. They send you people who've already done the groundwork. The visit comes after the consideration, not before it.
The conversion data backs this up. AI referral volume is still small. What matters is what those visitors do when they get to you.
What the conversion data actually shows
The most conservative figure comes from Visibility Labs, who tracked 94 ecommerce brands through GA4 over 2025. Visitors arriving from ChatGPT converted at 1.3x the rate of non-branded organic traffic.
Other studies push the numbers much higher. Semrush found a 4.4x conversion premium across a broader industry sample. Rocket Agency's 18-month study came back with 5.1x. Similarweb put AI conversion at 2.2x organic across global ecommerce. Amsive found 56% of sites in their study showing higher conversion from AI traffic than organic. Ahrefs published their own site data showing 23x, with 0.5% of visitors from AI driving 12.1% of signups.
The figures vary widely but all point the same way. AI visitors convert better, across every study that has measured it.
[Chart: Conversion Rate by Platform — B2B Software. Source: Seer Interactive, single B2B software client, 2025]
The platforms also behave differently. ChatGPT traffic has a broad reach and converts across solution and contact pages. Gemini visitors tend to focus on tools and calculators. Perplexity sends smaller volumes but concentrated ones, people already close to a decision. For a crypto protocol deciding where to focus first, ChatGPT has the widest reach.
AI visitors also spend 68% more time on site than organic visitors. Microsoft's data on Copilot users shows user journeys are 33% shorter and 76% more likely to take action. AI visitors arrive knowing what they want and move through the site to get it.
Why it happens: the mechanism
When someone opens ChatGPT and types a question, the AI does not return a list of links. It synthesises an answer drawn from multiple sources, comparing options and filtering out the noise. By the time a recommendation appears and the user clicks through, the comparison work is already done. They were not browsing a list, they were directed to you after the model had already weighed the alternatives. The same applies to Google AI Overviews. An answer first, links second.
This is mid-funnel displacement. The consideration phase that traditionally happens across multiple site visits gets compressed inside the AI conversation before the first click, which researchers call intent compression. A conventional Google search sends visitors at the start of that process. A ChatGPT referral sends someone nearer the end of it.
Most content teams measure success by session volume and time on site, with longer visits traditionally seen as more desirable. Those metrics only make sense when the research phase happens on your pages. When it happens inside the model, a short focused visit that converts is worth more than a long exploratory one that does not. Teams optimising for the wrong signals tend to undervalue AI traffic precisely when it is performing best.
The Microsoft Copilot data is the clearest illustration of this. Journeys 33% shorter, 76% more likely to reach a lower-funnel action. With AI-referred visitors, a short session is a good sign.
The volume trajectory
AI referral traffic is still small. Conductor and BrightEdge both put it at around 1% of total website traffic across major domains. What makes it worth paying attention to now is the direction.
- ChatGPT sessions grew 1,079% across 94 ecommerce brands in 2025 (Visibility Labs).
- AI referral traffic is growing at 527% year-over-year, 165 times faster than organic (Growth Marshal).
- The organic-to-ChatGPT traffic ratio shifted from 70:1 in Q1 2025 to 47:1 by Q4. The gap is closing fast.
ChatGPT-attributed revenue across tracked brands averaged 1.48% for the full year 2025. In the second half of the year alone it reached 2.2%. This figure only counts the slice of revenue traceable back to a ChatGPT referral session in GA4. It excludes Perplexity, Gemini, Claude, and any AI-influenced visits that got misattributed to direct or branded organic. The real combined AI revenue contribution is therefore already higher, and growing fast.
Rank Science projects AI-driven conversions at comparable levels to Google by late 2027, assuming the current trajectory holds.
The compounding argument matters. AI citation authority builds over time through structured data, content architecture, and brand mentions that accumulate. Brands building that infrastructure now will have an 18-month head start on the brands that wait until the volume becomes impossible to ignore. That gap will not close quickly once it opens.
The attribution problem (why the real number is probably higher)
The conversion data covered above is almost certainly an undercount.
When someone clicks through to a site from ChatGPT, GA4 does not log that visit as "from ChatGPT." It logs it as direct traffic. Or as branded organic, if the visitor searches your name off the back of an AI answer before clicking. Google AI Overview clicks on the search results page are likewise indistinguishable from regular organic visits. That is three misattribution paths already, all inflating direct or branded organic numbers.
The fourth type of misattribution is less obvious. ChatGPT has historically failed to pass referral data correctly at the platform level. Patrick Stox noted in his Ahrefs piece that this is not just a GA4 parsing problem. ChatGPT itself has not always sent the referral signal. So even with correctly configured analytics, some portion of AI-referred traffic is counted as direct traffic.
This misattribution would matter less if brands were at least trying to measure AI traffic. McKinsey found only 16% are. The other 84% are working with data that structurally undercounts AI's contribution even before any analysis.
With four misattribution paths affecting the dataset, the numbers almost certainly understate the case.
How to set up AI traffic tracking in GA4
Basic AI traffic visibility is straightforward to set up. Three steps:
1. Create a custom channel grouping in GA4. Go to Admin, then Data Display, then Channel Groups, and add a new channel. Name it something clear; 'AI Referral' works.
2. Apply a regex filter to the Source dimension so the new channel matches referrals from the main AI platforms. Seer Interactive publish a regex that covers them.
3. Add a post-purchase or post-action survey question. Analytics only catches what the referral header passes. A simple "how did you hear about us?" captures AI-influenced visits that will never register as AI visits, appearing instead as branded organic or direct traffic that started with an AI recommendation.
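For step 2, a filter along these lines catches the main AI platform referrers. This is an illustrative sketch, not Seer Interactive's published regex, and the domain list is an assumption that should be checked against your own referral data:

```python
import re

# Illustrative only: referrer domains for the main AI platforms.
# This is NOT Seer Interactive's published regex -- verify the domain
# list against your own GA4 referral data before relying on it.
AI_REFERRER_PATTERN = re.compile(
    r"(chatgpt\.com|chat\.openai\.com|perplexity\.ai|"
    r"gemini\.google\.com|copilot\.microsoft\.com|claude\.ai)",
    re.IGNORECASE,
)

def is_ai_referral(source: str) -> bool:
    """Classify a GA4 Source value as AI referral traffic."""
    return bool(AI_REFERRER_PATTERN.search(source))
```

In GA4 itself you would paste only the pattern string into the channel condition; the function is just a convenient way to test it against exported Source values.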
Most teams that set up this filter find AI's contribution larger than their dashboard suggested. Given the four misattribution paths above, that difference is the expected outcome.
The honest counterpoint
Not every AI interaction sends a visitor at all. A significant share of AI answers provide enough information to satisfy the user's intent without any click through to the source.
Pew Research puts the zero-click rate at 60% across AI search interactions. Bain found that number rises to 83% when a Google AI Overview is present on the search results page.
The most detailed counterpoint comes from a working paper by researchers at the University of Hamburg and Frankfurt School of Finance & Management, covering 973 ecommerce sites and $20 billion in annual revenue, which found organic search outperforming ChatGPT by 13% for transactional purchases. However, the paper has not been peer-reviewed yet and suffers from one of the same misattribution paths mentioned above: it measures last-click referrals only, meaning any AI influence earlier in the research phase goes uncounted. That second limitation actually cuts against the paper's own conclusion.
Context matters too. DeFi investment decisions, protocol comparisons, developer tool evaluations: these are not the impulse 'add-to-cart' purchases studied in the University of Hamburg paper. For typical crypto queries the research phase is longer, and the visitor arriving via AI recommendation has done considerably more work before clicking. The conversion premium holds precisely because the decision type is so different.
One further caveat: AI citation behaviour changes with every model update. Conversion rates measured in 2025 may look different by the end of 2026. The direction of the data is consistent across enough sources to be credible, but model behaviour comes with no guarantees.
What this means specifically for crypto protocols
Most of the conversion data in this article comes from ecommerce and broad B2B studies. Crypto protocols sit in a different category, with technically literate audiences doing detailed research before any commitment is made.
That makes the AI conversion premium more valuable for crypto brands, not less. The mid-funnel displacement effect is more pronounced for crypto audiences, who do far more research before acting than impulse-driven ecommerce shoppers. By the time an AI generates a recommendation link in response to a specific crypto query, the user has usually done considerable work inside the conversation already.
There is a useful parallel in professional services research: 81% of buyers choose a vendor before talking to sales. For crypto protocols the equivalent is that most developers and investors have formed a view on a protocol before any direct contact happens. This view is formed through research, discussions, and AI-synthesised comparisons. The AI recommendation is often where the decision is made. The site visit is confirmation.
Crypto can be a noisy space where many projects look identical at first glance. An AI recommendation carries implicit endorsement for a protocol trying to stand out in a crowded category. The model has evaluated alternatives and surfaced this one.
Getting into those recommendations is a content and structured data problem. The entry mechanism is comprehensive schema markup and content architecture built for machine readability, structured so AI models know not just what a protocol does, but what category it belongs to, what problems it solves, and how it differs from alternatives. How AI Models Decide What to Cite covers the full mechanism in detail — entity salience, topical authority, answer-first formatting, and a platform-by-platform breakdown of what drives citation on ChatGPT, Perplexity, and Google AI Overviews.
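As a sketch of what that entry mechanism looks like on the page: schema.org markup is commonly emitted as a JSON-LD block in the page head. The protocol name, category, description, and URL below are entirely hypothetical, shown only to illustrate the shape of machine-readable markup; real markup should use the schema.org types and values that actually fit your project.

```python
import json

# Hypothetical example protocol -- every value here is an assumption,
# used only to show the shape of a JSON-LD block, not a recommendation
# of specific schema.org properties for any real project.
protocol_jsonld = {
    "@context": "https://schema.org",
    "@type": "SoftwareApplication",
    "name": "ExampleSwap",                  # hypothetical protocol name
    "applicationCategory": "DeFi",          # category the model can key on
    "description": (
        "Decentralised exchange aggregator that routes trades "
        "across liquidity pools for best execution."
    ),
    "featureList": "DEX aggregation, gas optimisation",
    "sameAs": ["https://example.com/docs"],  # placeholder URL
}

# Serialised for a <script type="application/ld+json"> tag in the page head:
print(json.dumps(protocol_jsonld, indent=2))
```

The value of the markup is disambiguation: it states the category, the problem solved, and the related entities explicitly, rather than leaving the model to infer them from prose.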
It makes sense to start planning a content strategy to maximise high-converting AI visitors right now rather than waiting for the AI referral volume to become impossible to ignore. The protocols building that infrastructure now will be competing on very different terms to the ones who start in 2027.
Organic visitors find you. AI visitors have already chosen you.
The question is whether you have given the AI enough to recommend you with. That is a content problem. It is also a structured data problem. If you need help with either, that is what this site is for.
Ready to get cited by AI search?
Find out how content built for machine readability can put your crypto project in front of pre-sold visitors.
Get in touch