Google Search Central Blog
Find the latest Google Search core algorithm updates, new features, and SEO best practices on the Google Search Central Blog.
In previous posts about the Robots Exclusion Protocol (REP), we explored what's already possible with its various components — namely robots.txt and the URI-level controls. In this post, we explore how the REP can play a supporting role in the ever-evolving relationship between automated clients and the human web.
Hello 2025! (Yeah, we know, time flies!) We've had some exciting plans in the works for Search Central Live (SCL) Asia Pacific this year, and we're super excited to let you in on what we've been up to. We've been listening closely to your feedback, and we're cooking up something different from what we usually do: something bigger, deeper, and more tailored to you!
With the robots.txt file, site owners have a simple way to control which parts of a website are accessible by crawlers. To help site owners further express how search engines and web crawlers can use their pages, the web standards group came up with robots meta tags in 1996, just a few months after meta tags were proposed for HTML (and anecdotally, also before Google was founded). Later, X-Robots-Tag HTTP response headers were added. These instructions are sent together with a URL, so crawlers can only take them into account if they're not disallowed from crawling the URL through the robots.txt file. Together, they form the Robots Exclusion Protocol (REP).
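To make the two page-level mechanisms concrete, here is a minimal sketch of the same rule expressed both ways. The specific directives shown (`noindex, nofollow`) are illustrative choices, not a recommendation:

```html
<!-- Option 1: a robots meta tag in the page's HTML <head> -->
<meta name="robots" content="noindex, nofollow">
```

```http
# Option 2: the equivalent X-Robots-Tag HTTP response header,
# useful for non-HTML resources such as PDFs or images
X-Robots-Tag: noindex, nofollow
```

Either form only takes effect if the crawler is allowed to fetch the URL in the first place; a URL blocked by robots.txt never has its headers or markup read.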
A long-standing tool for website owners, robots.txt has been in active use for over 30 years and is broadly supported by crawler operators (such as tools for site owners, services, and search engines). In this edition of the robots refresher series, we'll take a closer look at robots.txt as a flexible way to tell robots what you want them to do (or not do) on your website.
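As a brief illustration of that flexibility, here is a minimal, hypothetical robots.txt file; the bot name and paths are made up for the example:

```
# Rules for all crawlers
User-agent: *
Disallow: /private/

# Stricter rules for one hypothetical crawler
User-agent: ExampleBot
Disallow: /

# Optional pointer to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

More specific `User-agent` groups take precedence over the wildcard group for the crawlers they name.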
We're very excited to announce that Search Central Live is going to Madrid for the first time on April 9! The event will have a mix of presenters from the Google Search, News, and Partnerships teams, and the content will be delivered in English and Spanish, with live translation available.
On April 2, 2025 we'll be in Johannesburg, South Africa for the very first Search Central Live event in Africa! We're excited to welcome you at Search Central Live South Africa and talk about all things Google Search!
Every now and then we get questions about robots.txt, robots meta tags, and the control functionality that they offer. Following our December series on crawling, we thought this would be the perfect time to put together a light refresher. So, if you're curious about these controls, follow along in this new blog post series!
Mobile searchers will soon see a cleaner, more streamlined look for how URLs appear in search results. Initially introduced as part of the "site hierarchy" feature, we've found that the breadcrumb element isn't as useful to people who are searching on mobile devices, as it gets cut off on smaller screens. Starting today, we'll no longer show breadcrumbs on mobile search results in all languages and regions where Google Search is available (they continue to appear on desktop search results).
By the end of this post, you might find yourself trying to decide who wrote it: a large language model (LLM) or Gary. And you'd be right to ponder that and delve into the intricacies of the language that gives LLMs away, for this is the time of year when we can get away with publishing a blog post with barely any review (future Gary will deal with the potential, nay, likely fallout, I guess). As we often do in the last post of the year, we're looking at what happened on Google Search Central in 2024 according to an LLM (or Gary), and maybe hinting at what might be coming in 2025 (but maybe this is just a hook to keep you reading...).
Content delivery networks (CDNs) are particularly well suited to decreasing your website's latency and, in general, keeping web traffic-related headaches away. This is their primary purpose, after all: speedy delivery of your content even if your site is getting loads of traffic. The "D" in CDN stands for delivery (or distribution) of the content across the world, so transfer times to your users are also lower than if you hosted in just one data center somewhere. In this post we're going to explore how to make use of CDNs in a way that improves crawling and users' experience on your site, and we'll also look at some nuances of crawling CDN-backed sites.
Faceted navigation is a great way to help users find what they need on your site, but it can create an SEO nightmare if not implemented carefully. Why? Because it can generate a near-infinite number of URLs, which causes all sorts of crawling problems.
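One common mitigation is to block crawling of filter-parameter URLs in robots.txt. A minimal sketch, where the parameter names (`color`, `size`, `sort`) are hypothetical stand-ins for whatever your faceted navigation generates:

```
# Hypothetical example: keep crawlers out of faceted/filter URL variants
User-agent: *
Disallow: /*?color=
Disallow: /*?size=
Disallow: /*&sort=
```

This keeps crawlers from spending time on near-duplicate combinations while the canonical category pages remain crawlable.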
The Search Central Live events in Kuala Lumpur and Taipei were nothing short of amazing, in large part thanks to the over 600 people who attended! We were thrilled to see the level of enthusiasm and engagement from attendees, even though, on the day prior to the Taipei event, we collectively had to deal with typhoon Kong Rey, the first supertyphoon in Taiwan's history to make landfall after mid-October. Here's a deeper dive into what made these events so special and what's next.