Google’s John Mueller confirmed that Googlebot will crawl and pick up on URL patterns that simply do not work on your site. We have all seen this happen time and time again with sites we manage. But John added that crawling of those broken patterns should slow over time as Google picks up on them.
John said on Mastodon, “Usually our systems pick up on URL patterns that don’t work and slow down crawling for them.” He then added, “so it’ll likely get a bit better (but you’ll still see these warnings).”
This is in regard to sites getting this Search Console notice:
Here is a screenshot of the conversation if you don’t want to click through.
Forum discussion at Mastodon.