Information & Trust
For 20 years, Google kept the question from forming. AI changed that. What's replacing it could be even less transparent.
Finder, Canstar and Compare the Market built businesses worth hundreds of millions on a single competitive advantage. Not independent research. Not editorial judgment. The specific way Google's algorithm rewards content at scale: a site publishing thousands of financial product pages outranks any individual bank's product page for the same searches almost by default. The comparison site industry figured that out early. It worked for about 20 years.
The pitch was always clean: they've done the research so you don't have to. A ranked table of financial products, free to use, independent.
But the revenue flows the other way. When you click through and apply, the provider pays a referral fee. Products at the top of the table tagged "Promoted" or "Top Pick" got there through commercial arrangements. The disclosure is technically on the site.
That model worked because almost nobody read the disclosure. It didn't need to hide. Google's algorithm produced a consumer journey smooth enough that scrutiny never formed. You were already three steps through the process by the time you might have thought to ask who built this table and why.
AI has now disrupted that traffic. But what changed wasn't AI getting smarter at detecting commercial arrangements. The disclosure was always there. What changed was the architecture of the journey. The confidence trick needed a specific sequence. AI stopped that sequence. How the industry has responded makes clear this was always a model built around an algorithm, not around research.
Google's algorithm runs on authority signals: links from other sites, traffic volume, how long people spend on a page. A comparison site publishing 10,000 financial product pages generates those signals at a scale that any individual bank's product page can't match. Search "best credit cards Australia" and the results aren't dominated by ANZ or CommBank. They're dominated by comparison sites.
That authority wasn't accumulated passively through publishing volume. The deliberate play was commissioned research: consumer sentiment surveys, annual spending reports, cost-of-living trackers. These got picked up by journalists needing data to illustrate stories, published with attribution and a link back to whichever comparison site ran the study. Those links fed directly into the authority signals Google was measuring.
The research was all real. But the purpose was purely PageRank.
I've worked inside this model. The commercial relationships aren't hidden from the people building the product. They just don't tend to be front of mind for the people using it.
So when a comparison site appeared at the top of a financial search, it was there because it had been engineered to be, through authority manufacturing, not because its recommendations were the most independent. The commercial arrangements sat in the fine print, technically disclosed. The journey kept the question from forming.
In 2019, the ACCC actually took iSelect to court. iSelect, a comparison site for energy, health insurance and telecoms, told consumers they would be shown all available plans in their area. The ACCC found they were instead being shown plans from retailers on something called the Preferred Partner Program, the ones who paid more for better placement. The most competitive option wasn't always what consumers saw.
The regulator didn't need specialist tools. It just checked the claim against the product. The structure was exactly what it looked like.
iSelect ended up paying an $8.5 million penalty and kept running. The ACCC's broader review of the industry found patterns causing "consumer and business harm" across multiple comparison categories. The model the case exposed kept operating; nothing about the underlying mechanism changed.
The subindustry that's grown up around the disruption is called Generative Engine Optimisation. The pitch: instead of ranking at the top of Google results, get cited by ChatGPT or Google's AI Overviews when someone asks which credit card to get. Specialist agencies, workshops, newsletters, a growing circuit of consultants whose job is getting comparison content into AI answers.
The way GEO makes money is different from the model it's trying to preserve. GEO charges retainers for optimisation, fees for getting content cited by AI tools regardless of what consumers do next. The comparison site commission model only activates when someone clicks through and applies for a product. AI answers are specifically the thing stopping that click from happening. Getting cited solves the visibility problem. It doesn't solve the conversion problem, and the commission model runs on conversion.
For the comparison sites, and for any site built on affiliate-link revenue, this is an existential problem. The old model needed Google to route consumers through a comparison table. That table was where the commercial architecture lived: the ranking, the "Promoted" tag, the referral fee activated by the click. AI answers directly, without routing anyone through a table. The GEO play is an attempt to recreate the conditions that made the old model work, inside a medium built specifically to skip those conditions. And this time, the comparison sites aren't the only ones running that play.
Banks, insurers and energy retailers are GEO-optimising their own product pages for the same AI answers. A provider that gets cited directly doesn't need a comparison site in the chain at all. There's a structural reason that race is hard to win: a bank's mortgage explainer just needs to be cited. It doesn't need a commission click to make commercial sense. The comparison site's content does.
Google search results came with labels. "Promoted." "Ad." Easy to miss, often ignored, but present and required. A faint signal that commercial arrangements were in play.
AI answers don't have an equivalent. When a Google AI Overview recommends a financial product or cites a comparison result, there's no label inside the answer. No disclosure mechanism. If comparison content successfully colonises AI answers the way it colonised search results, the commission structures and preferred arrangements will still be there. The same three-party transaction (comparison site, product provider, consumer), with the same information gap between the third party and the other two.
The comparison table was already the least transparent product recommendation most consumers encountered without realising it. The AI answer is a step further. At least the label existed: small, easy to miss, widely ignored, but there. The consumer receiving an AI-mediated answer won't have even that.
The comparison site model didn't require anyone to be dishonest. It required the right algorithm, producing the right consumer journey, for long enough that the question of how it worked didn't come up often. For about 20 years, Google provided that.
The disruption everyone is writing about is real. What it isn't is AI suddenly seeing through arrangements that were previously hidden. The disclosure was always there. The consumer journey was designed to keep the question from forming. That design is what changed.
And so the GEO consultants are busy, the workshops are booked, and the specialist agencies are pitching. If they succeed, if the AI-mediated journey ends up as frictionless as the Google one was, the recommended credit card in a ChatGPT answer will have arrived there by largely the same means as the one that topped a comparison table in 2019.
The only thing missing will be the "Promoted" label.
For financial decisions that actually matter (mortgages, insurance policies, super fund choices), MoneySmart, run by ASIC, and the AER's energy comparison tool both operate without commercial relationships with providers. Neither is as polished as the commercial options. Both are worth knowing about. The ACCC has a complaints process if a comparison site has misled you about its independence. It acted on iSelect. It can act again.
Part of a series on how the information ecosystem actually works: The conversation you're reading isn't real and The newsroom is everywhere now.
· ◆ ·