Google Updates Deep Link Rules, Robots.txt Docs & EU Pushes for Search Data Sharing
This week brought some meaningful developments from Google that touch core SEO practices — from how your snippets display deep links to how Google parses your robots.txt file. On top of that, regulators in Europe are pushing for something that could reshape the search landscape for AI tools. Here’s a breakdown of what happened and why it matters for your site.
Google Publishes Its First Guidance on “Read More” Deep Links
Google has quietly added a dedicated section to its snippet documentation covering “Read More” deep links — those additional jump-links that sometimes appear below a snippet in search results, pointing users directly to specific sections of a page.
This is the first time Google has spelled out what it expects from content for these links to show up. Three conditions stand out:
1. Content Must Be Visible on Page Load
If a section of your page sits behind an accordion, a tab, or a click-to-expand toggle, Google is less likely to pick it up as a deep link candidate. The content has to be visible the moment the page renders — no interaction required.
2. Headings Matter
Sections should use H2 or H3 tags. This signals structure to Google’s systems and helps it identify discrete, linkable parts of your content.
3. Snippet Text Must Match Page Content
If your meta description or snippet preview doesn’t align with what’s actually on the page, or if content only loads after user scrolling, your chances of getting deep links drop significantly.
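The three conditions above can be illustrated with a minimal page structure. The section names and copy below are hypothetical, not taken from Google's documentation:

```html
<!-- Content sits in the initial HTML: no accordion, tab, or
     click-to-expand wrapper around it. -->
<article>
  <h1>Standing Desk Buying Guide</h1>

  <!-- Each discrete section gets an H2/H3 so Google's systems can
       identify it as a linkable part of the page. -->
  <h2 id="height-range">Choosing the Right Height Range</h2>
  <p>Visible the moment the page renders; no interaction required.</p>

  <h2 id="assembly">Assembly and Setup</h2>
  <p>Also rendered immediately, not behind a toggle.</p>
</article>

<!-- Anti-pattern: this section is hidden until a click, making it
     a weaker deep-link candidate. -->
<details>
  <summary>Warranty details</summary>
  <p>Only revealed after user interaction.</p>
</details>
```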
What makes this notable is the pattern it reinforces. Google has consistently favoured content that crawlers and bots can read without simulating user behaviour. The guidance here mirrors what the search community has seen applied to featured snippets, FAQ rich results, and AI Mode in Google Search — content that’s immediately accessible gets the most visibility.
For sites leaning heavily on JavaScript-rendered content, expandable FAQs, or tab-based product detail sections, this is a practical audit trigger. If core information lives inside a collapsible element, it may not qualify for deep links, and it may perform worse in other Google Search result formats too.
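One quick way to audit this is to check whether key phrases appear in the raw HTML the server returns, before any JavaScript runs. A minimal sketch (the URL and phrases are placeholders, and the string match is deliberately naive):

```python
# Audit sketch: if a phrase only appears after client-side rendering,
# it will NOT be found in the static HTML source fetched here.
from urllib.request import urlopen

def in_initial_html(html: str, phrase: str) -> bool:
    """True if the phrase is present in the static HTML source."""
    return phrase.lower() in html.lower()

def audit_page(url: str, phrases: list[str]) -> dict[str, bool]:
    """Fetch the raw server HTML (no JS execution) and check each phrase."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    return {p: in_initial_html(html, p) for p in phrases}

# Example against a static snippet instead of a live fetch:
static_html = "<h2>Assembly and Setup</h2><p>Use the included hex key.</p>"
print(in_initial_html(static_html, "Assembly and Setup"))  # True
```

A phrase that returns False here but is visible in the browser is a strong hint the content depends on client-side rendering or user interaction.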
Google May Expand What It Documents in Robots.txt
Google’s Search Off the Record podcast recently featured Gary Illyes and Martin Splitt discussing a project that involved analysing real-world robots.txt files at scale using HTTP Archive data. The findings are shaping plans to expand Google’s official robots.txt documentation.
Currently, Google formally supports only four directives: user-agent, allow, disallow, and sitemap. Everything else is either ignored or handled inconsistently. What Illyes described is a move to document the most commonly used unsupported directives (somewhere in the range of 10 to 15 rules) so site owners have clearer guidance on what Google actually does with those lines.
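In practice, a robots.txt file often mixes both kinds of lines. The paths and values below are illustrative:

```
# Lines Google's parser formally supports:
User-agent: *
Disallow: /checkout/
Allow: /checkout/help
Sitemap: https://example.com/sitemap.xml

# Commonly seen but NOT among Google's four supported rules,
# so Google ignores them (other crawlers may not):
Crawl-delay: 10
Host: example.com
Noindex: /drafts/
```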
There was also mention of potential expanded tolerance for typos in the disallow directive. Illyes indicated the parser already handles some misspellings and that more may be accepted over time.
Why does this matter? For anyone managing a robots.txt file that includes custom directives, third-party tool recommendations, or rules copied from templates, this is a prompt to audit. This connects directly to the broader principle we’ve been tracking: how Google actually crawls your website in 2026 is increasingly transparent, and documentation is catching up to practice.
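A first-pass audit can be scripted. This is a simplified sketch that flags any directive outside the four rules Google formally supports; real robots.txt parsing has more edge cases than one "field: value" pair per line:

```python
# Flag robots.txt directives that Google does not formally support.
SUPPORTED = {"user-agent", "allow", "disallow", "sitemap"}

def unsupported_directives(robots_txt: str) -> list[str]:
    """Return the lowercased field names of unsupported directives."""
    flagged = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line or ":" not in line:
            continue
        field = line.split(":", 1)[0].strip().lower()
        if field not in SUPPORTED:
            flagged.append(field)
    return flagged

sample = """
User-agent: *
Disallow: /private/
Crawl-delay: 5
Noindex: /drafts/
Sitemap: https://example.com/sitemap.xml
"""
print(unsupported_directives(sample))  # ['crawl-delay', 'noindex']
```

Anything this flags is worth checking against Google's documentation once the expanded guidance lands, since "ignored by Google" and "ignored by every crawler" are not the same thing.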
The HTTP Archive data Illyes referenced is publicly available via Google BigQuery, meaning any SEO can run the same kind of analysis to understand how common specific directives are across the web.
EU Proposes Google Share Search Data With Competitors — Including AI Chatbots
The European Commission has sent preliminary findings to Google proposing that it share search data with rival search engines operating in the EU and EEA. The proposed data categories are ranking signals, query data, click data, and view data — all to be shared on fair and non-discriminatory terms.
What makes this proposal particularly significant is who it includes: AI chatbot providers that meet the Digital Markets Act’s (DMA) definition of an online search engine. If the proposal holds through its final decision on July 27, qualifying AI chatbot products — not just traditional search engines — could gain access to anonymised Google Search data.
A public consultation period is open until May 1, so nothing is binding yet. But the regulatory framing matters. The EU has officially blurred the line between “AI assistant” and “search engine,” and that definition has downstream consequences for how the entire category is regulated.
For SEOs and content publishers focused on EU/EEA audiences, this opens a real question: if AI chatbots gain access to Google’s anonymised query and ranking data, how might that change the retrieval and citation patterns of those tools?
The broader regulatory picture is worth watching. The Google March 2026 Core Update already rattled rankings significantly — adding regulatory-driven changes to search data distribution could introduce another variable for publishers operating in European markets.
The Week’s Bigger Theme: Clarity Is Replacing Ambiguity
Each of these stories shares a common thread. Google is putting things in writing — what it supports in robots.txt, what structures help deep links appear, and what the EU expects it to share and with whom. For years, a lot of SEO decision-making relied on inference and pattern recognition because official documentation was thin.
That’s changing. And it cuts both ways. Clearer guidance makes audits easier and gives teams stronger internal justification for technical changes. But it also removes the wiggle room that came with uncertainty. “We weren’t sure” stops being a reasonable explanation when the documentation exists.
For now, the practical to-do list coming out of this week is short and concrete:
- Check whether important content loads without user interaction
- Audit your robots.txt for directives Google doesn’t support
- Watch the EU consultation deadline of May 1 if you serve European audiences
Also worth noting — Google search ranking volatility has been picking up again in late April 2026. If your traffic has shifted this week, that context is worth factoring in alongside any technical changes you’re planning.
Stay updated with the latest Google and SEO news at news.opositive.io
