AI Search Shows 404 Pages More Frequently
A new study has shown that AI-powered search assistants are far more likely than Google to send visitors to web pages that don’t work. The data show that AI-generated search results lead to 404 error pages almost three times as often as regular Google Search results. As AI search tools grow in popularity, this raises real concerns about their accuracy and reliability.
ChatGPT Generates the Most Broken URLs
AI Tools Have More Errors
ChatGPT produced the most broken or fabricated links of all the AI tools tested. The Ahrefs study showed that around 1% of the URLs users clicked from ChatGPT led to error pages, compared with just 0.15% for Google.
The gap widened when all cited URLs were counted, clicked or not. ChatGPT’s recommended links led to broken pages 2.38% of the time, while Google’s top search results did so only 0.84% of the time.
How Other AI Tools Performed
- Claude: 0.58% broken links
- Copilot: 0.34%
- Perplexity: 0.31%
- Gemini: 0.21%
- Mistral: 0.12% (lowest error rate, but not much web traffic)
Why Do AI Searches Bring Up 404 Pages?
Reliance on Outdated or Nonexistent Links
The study identifies two primary causes for AI search displaying 404 pages:
- Outdated References: AI models sometimes recommend links that once worked but have since been removed or moved.
- Fake URLs: AI often makes up web addresses that look real but don’t actually exist.
For instance, Ahrefs found that AI tools produced URLs like “/blog/internal-links/” and “/blog/newsletter/” that look plausible but do not exist on its website.
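A quick way to spot such fabricated links is to request each cited URL and inspect the status code. The sketch below is illustrative only: the helper names and the use of example.com are assumptions, not part of the study.

```python
# Sketch: check whether cited URLs actually resolve.
# Helper names are hypothetical; example.com stands in for a real site.
from urllib import error, request

def is_broken(status: int) -> bool:
    """Treat any 4xx/5xx status code as a broken link."""
    return status >= 400

def url_status(url: str, timeout: float = 10.0) -> int:
    """Return the HTTP status code for a HEAD request to url."""
    req = request.Request(url, method="HEAD")
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except error.HTTPError as exc:
        return exc.code  # e.g. 404 for a fabricated path

# Usage (would perform a live request):
#   url_status("https://example.com/blog/internal-links/")
```

A HEAD request avoids downloading the page body, which keeps bulk checks of AI-cited links cheap.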
Traffic Not Affected Much
AI vs. Google Traffic Share
Broken links are a significant problem, but for now they have little effect on traffic. The study found that AI search tools drive only 0.25% of all page visits, while Google drives over 39.35%.
This means that even though AI mistakes happen, they affect only a tiny share of total web traffic for now.
AI-Generated Content Raises the Stakes
But things could get worse. Almost 74% of new websites now include some AI-generated content. If those pages contain fabricated links, crawlers may pick them up and spread the problem across the web.
Advice and predictions from experts
John Mueller’s Early Warning
In March, Google’s John Mueller warned that hallucinated links would increase over the next 6 to 12 months. He advised against chasing every instance of phantom traffic, recommending instead:
- Only use redirects for URLs that get a lot of traffic.
This new study backs up his warning and shows that his forecast was right.
What This Means for Companies
Action Points for Website Owners
To prepare for this issue, businesses should:
- Ensure their 404 error pages are clear, helpful, and navigable.
- Monitor incoming traffic and identify broken URLs that are requested frequently.
- Use redirects wisely, and only on links that genuine users click on.
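The monitoring step above can be sketched in a few lines of Python: scan the server access log for 404 responses and rank the missing paths by request count, so only genuinely popular URLs earn a redirect. The log lines below are made-up samples in the common Apache/Nginx combined format.

```python
# Sketch: find the most frequently requested 404 paths in an access log,
# assuming the common Apache/Nginx log format (sample lines are invented).
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3})')

def frequent_404s(lines, top=5):
    """Return the most-requested paths that returned a 404 status."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1
    return hits.most_common(top)

sample = [
    '1.2.3.4 - - [01/Jan/2025] "GET /blog/internal-links/ HTTP/1.1" 404 152',
    '1.2.3.4 - - [01/Jan/2025] "GET /blog/newsletter/ HTTP/1.1" 404 152',
    '5.6.7.8 - - [01/Jan/2025] "GET /blog/internal-links/ HTTP/1.1" 404 152',
    '5.6.7.8 - - [01/Jan/2025] "GET /pricing/ HTTP/1.1" 200 9001',
]
print(frequent_404s(sample))
# → [('/blog/internal-links/', 2), ('/blog/newsletter/', 1)]
```

Paths that appear repeatedly at the top of this list are the ones worth redirecting; one-off hits can safely be left to a helpful 404 page.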
Conclusion
The new study confirms what earlier warnings suggested: AI search surfaces 404 pages significantly more often than Google, and ChatGPT is the most error-prone. The problem has little effect on traffic today, but it could worsen as more people adopt AI search tools.
For now, the best strategy for businesses is to focus on improving 404 experiences, monitoring traffic sources, and preparing for long-term SEO adjustments as AI search evolves.