This guide describes how to optimize Google’s crawling of very large and frequently updated sites.
If your site does not have a large number of pages that change rapidly, or if your pages seem to be crawled the same day they are published, you do not need to read this guide. Merely keeping your sitemap updated and checking your index coverage regularly is adequate.
Who this guide is for
This is an advanced guide and is intended for:
- Large sites (1 million+ unique pages) whose content changes moderately often (for example, weekly)
- Medium sites (10,000+ unique pages) whose content changes very rapidly (daily)
Introduction to Crawling:
The web is an enormous space that Google cannot fully explore or index. Because of this, there are limits to how much time and resources Googlebot can spend crawling any single site. The amount of time and resources Google devotes to crawling a site is commonly called the site’s crawl budget.
Note that not everything crawled on your site will necessarily be indexed; each page must be evaluated, consolidated, and assessed to determine whether it will be indexed after it has been crawled.
The crawl budget is determined by two main elements: crawl capacity limit and crawl demand.
Crawl Capacity Limit
Googlebot is designed to crawl your site as efficiently as possible without overwhelming your servers. Google calculates a crawl capacity limit, which is the maximum number of simultaneous parallel connections that Googlebot can use to crawl a site, as well as the time delay between fetches. This prevents overload on your server.
The crawl capacity limit can go up and down based on a few factors:
- Crawl Health: If the site responds quickly for a while, the limit goes up and Googlebot uses more connections to crawl. If the site slows down or responds with server errors, the limit goes down and Googlebot crawls less (a quick self-check sketch follows this list).
- Limit set by the site owner in Search Console: Website owners can choose to reduce Googlebot’s crawling of their site. Note that lowering this limit reduces crawling, but raising it does not automatically increase how much Google crawls. Submitting a sitemap through Search Console helps Google discover your URLs, but it does not raise the crawl capacity limit.
- Google’s Crawling Limits: Google has a lot of machines, but not an unlimited number, so it still has to make choices about where to spend its crawling resources.
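Because the crawl health factor depends on how quickly and reliably your own server responds, it can be useful to spot-check response times and status codes yourself. The sketch below is a minimal illustration using only Python’s standard library; the example.com URLs are placeholders, the two-second threshold is an arbitrary choice, and this is only a rough self-check, not how Google measures crawl health.

```python
# Minimal sketch: sample a few of your own URLs and report HTTP status and
# response time, as a rough proxy for the "crawl health" factor above.
# The URLs listed here are placeholders -- substitute pages from your site.
import time
import urllib.error
import urllib.request

SAMPLE_URLS = [
    "https://www.example.com/",
    "https://www.example.com/category/widgets",
    "https://www.example.com/blog/latest-post",
]

def check_url(url: str, timeout: float = 10.0) -> tuple[int, float]:
    """Fetch a URL and return (HTTP status code, elapsed seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            status = response.status
    except urllib.error.HTTPError as err:
        status = err.code  # e.g. 404 or 503
    elapsed = time.monotonic() - start
    return status, elapsed

if __name__ == "__main__":
    for url in SAMPLE_URLS:
        status, elapsed = check_url(url)
        flag = "SLOW/ERROR" if status >= 500 or elapsed > 2.0 else "ok"
        print(f"{status}  {elapsed:5.2f}s  {flag}  {url}")
```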
Crawl Demand
Google typically spends as much time as necessary crawling a site, given its size, update frequency, page quality, and relevance, compared to other sites.
The factors that play a significant role in determining crawl demand are:
- Perceived Inventory: Without guidance from you, Googlebot will try to crawl all or most of the URLs it knows about on your site. If many of those URLs are duplicates or otherwise not worth crawling, Google wastes crawling time on them. Telling Googlebot which URLs matter lets it spend its crawl budget on the pages that count.
- Popularity: URLs that are more popular on the Internet tend to be crawled more often to keep them fresher in Google’s index.
- Staleness: Google’s systems want to recrawl documents often enough to pick up any changes; keeping accurate lastmod dates in your sitemap helps signal those changes (see the sitemap sketch below).
Additionally, site-wide events like site moves may trigger an increase in crawl demand in order to reindex the content under the new URLs.
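One concrete way to address the staleness factor above is to keep accurate <lastmod> dates in your sitemap, so Google can tell which URLs have actually changed since its last crawl. Below is a minimal sketch that builds such a sitemap with Python’s standard library; the URLs and dates are invented placeholders, and in practice the lastmod values would come from your CMS or database.

```python
# Minimal sketch: build a sitemap whose <lastmod> values reflect when each
# page really changed, so recrawling can focus on updated URLs.
# The URLs and timestamps below are invented placeholders.
import xml.etree.ElementTree as ET

PAGES = {
    "https://www.example.com/": "2024-05-01",
    "https://www.example.com/products/widget-42": "2024-05-14",
    "https://www.example.com/blog/crawl-budget": "2024-04-28",
}

def build_sitemap(pages: dict[str, str]) -> bytes:
    """Return a sitemap.xml document as bytes."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
    )
    for loc, lastmod in pages.items():
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="utf-8", xml_declaration=True)

if __name__ == "__main__":
    print(build_sitemap(PAGES).decode("utf-8"))
```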
Google defines a site’s crawl budget as the set of URLs that Googlebot can and wants to crawl. Even if the crawl capacity limit isn’t reached, if crawl demand is low, Googlebot will crawl your site less.
Best Practices:
Follow these best practices to maximize your crawling efficiency:
- Manage URL Inventory:
It’s important to use the right tool for the job. The robots.txt file at your domain’s root tells Google which URLs on your site it should not crawl, while a sitemap tells Google which URLs you do want crawled. If you don’t want Googlebot to crawl certain URLs, add Disallow rules for them to your robots.txt file (see the robots.txt sketch after this list).
- Consolidate Duplicate Content: Eliminate duplicate content to focus crawling on unique content rather than on unique URLs. Signaling a preferred (canonical) URL for near-duplicate pages helps Google spend less time on redundant copies (the signal-checking sketch after this list shows how to inspect these tags).
- Block crawling of URLs you don’t want crawled, and noindex pages you don’t want indexed: Robots.txt lets you tell Googlebot (and other crawlers) which URLs on your site it should crawl and which it shouldn’t. You can also specify whether a page may be cached or indexed by using the appropriate robots meta tags or X-Robots-Tag HTTP headers (for example, noarchive or noindex). Keep in mind that Googlebot can only see a noindex directive on a page it is allowed to crawl, so don’t block a page in robots.txt if you are relying on noindex to keep it out of the index (the signal-checking sketch after this list shows how to verify these directives).
- Eliminate Soft 404s: Soft 404 pages (error pages that return a 200 status) will continue to be crawled and waste your crawl budget. Check the Index Coverage report in Search Console for soft 404 errors, and make sure genuinely missing pages return a real 404 or 410 status (see the soft 404 sketch after this list).
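To illustrate the robots.txt rules mentioned in the first item above, the sketch below uses Python’s standard urllib.robotparser to check which URLs Googlebot would be allowed to fetch under a hypothetical set of rules. The rules, paths, and example.com URLs are all invented for illustration, and Google’s own robots.txt handling may differ in edge cases.

```python
# Minimal sketch: test hypothetical robots.txt rules before deploying them,
# using Python's standard robots.txt parser. Rules and URLs are invented.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /search
Disallow: /cart/
Allow: /

Sitemap: https://www.example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

TEST_URLS = [
    "https://www.example.com/products/widget-42",  # should be crawlable
    "https://www.example.com/search?q=widgets",    # faceted search: blocked
    "https://www.example.com/cart/item/123",       # checkout flow: blocked
]

for url in TEST_URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOW' if allowed else 'BLOCK'}  {url}")
```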
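For the duplicate-content and blocking items above, it also helps to verify what signals a page actually sends: the X-Robots-Tag response header, the robots meta tag, and the rel="canonical" link. The sketch below is a rough, standard-library-only check; the URL is a placeholder and the regex-based HTML parsing is deliberately simplistic, so treat it as a starting point rather than a definitive audit.

```python
# Minimal sketch: fetch a page and report the signals that affect indexing and
# canonicalization -- the X-Robots-Tag response header, any <meta name="robots">
# tag, and any <link rel="canonical"> tag. The URL is a placeholder and the
# HTML checks use simple regexes rather than a full parser.
import re
import urllib.request

def page_signals(url: str) -> dict:
    """Return the robots header, robots meta values, and canonical links for a URL."""
    with urllib.request.urlopen(url, timeout=10) as response:
        x_robots = response.headers.get("X-Robots-Tag", "")
        html = response.read().decode("utf-8", errors="replace")
    meta_robots = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\']([^"\']+)["\']',
        html, flags=re.IGNORECASE)
    canonical = re.findall(
        r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
        html, flags=re.IGNORECASE)
    return {"x_robots_tag": x_robots, "meta_robots": meta_robots, "canonical": canonical}

if __name__ == "__main__":
    # Placeholder URL: substitute a page you expect to be noindexed or canonicalized.
    print(page_signals("https://www.example.com/"))
```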
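Finally, a quick way to catch one common cause of soft 404s is to request a path that should not exist and confirm that the server answers with a real 404 (or 410) status rather than a 200 page. The sketch below assumes a made-up path on a placeholder domain; substitute your own site.

```python
# Minimal sketch: request a deliberately nonexistent path and check that the
# server returns a genuine 404/410 status instead of a "200 OK" error page
# (a classic soft 404). The domain and path are placeholders.
import urllib.error
import urllib.request

def status_for(url: str) -> int:
    """Return the HTTP status code for a URL."""
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

if __name__ == "__main__":
    probe = "https://www.example.com/this-page-should-not-exist-12345"
    status = status_for(probe)
    if status in (404, 410):
        print(f"OK: server returns {status} for missing pages")
    else:
        print(f"Possible soft 404: missing page answered with HTTP {status}")
```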
Conclusion:
Crawl budget optimization matters for large and frequently updated sites, yet many site owners overlook the factors that go into it. The best practices in this guide help Googlebot spend its limited crawling time on the pages that matter most, so your new and updated content gets discovered and indexed sooner.
If you need help optimizing your website to meet today’s modern standards, please don’t hesitate to contact us at AI Advertisment for further assistance.