Google’s URL Inspection Tool: Replacement for Fetch and Render (How-To Guide)

If you are serious about your business’s marketing efforts, you are consistently posting new, high-quality content. You should also optimize that content so it helps users solve a problem, learn more about one of your products or services, or otherwise serves a purpose that furthers your business. Similarly, you should consistently monitor your web pages and their existing content, ensuring all of their components run as smoothly as possible and updating those that do not.

However, what happens if your changes contain code or other elements that negatively alter the way Google reads your page? After all, things look fine on the surface and your web page appears better than ever. Could the changes you can’t see with the naked eye be affecting your SEO? As it turns out, the answer is yes.

Googlebot Indexing

In order for potential customers to find your web page via a Google search, it must be visible in the Google index. All this means is that Google knows your page is out there, functional, and ready to be returned as a result on a search engine results page. However, to make this determination, Google needs your page to be fully visible and accessible to it.

To determine this, Google sends Googlebot – a web crawler, also known as a bot or spider – to scan the contents of your page. Googlebot and other web crawlers exist to constantly gather information about web pages and relay what they find back to their operators; Googlebot, of course, reports back to Google, which decides whether to index a page based on Googlebot’s findings.

Googlebot receives the URL for your page from a list, then scans and retrieves its contents, including all text, code, plug-ins, and other elements that make up your page. Googlebot also analyzes any links from your page to other pages. Once it finishes the analysis, the bot sends all the information back to Google, which indexes all pages Googlebot can “see.”
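
To make this concrete, here is a minimal Python sketch of what any crawler does with a single URL: retrieve the raw HTML with one HTTP request, then collect the links it finds. This is an illustration only; the URL and user-agent string are placeholders, not Googlebot’s actual configuration.

```python
# A minimal sketch of what a crawler does with one URL: fetch the raw HTML
# with a single HTTP request, then collect every link the page points to.
# The URL and user-agent string are illustrative placeholders.
from html.parser import HTMLParser
from urllib.request import Request, urlopen


class LinkCollector(HTMLParser):
    """Collects the href value of every <a> tag in the document."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def fetch_and_extract(url):
    # Fetch the page the way a bot does at first: one request, no rendering.
    request = Request(url, headers={"User-Agent": "example-crawler/1.0"})
    with urlopen(request, timeout=10) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = LinkCollector()
    collector.feed(html)
    return html, collector.links


html, links = fetch_and_extract("https://example.com/")
print(f"Fetched {len(html)} characters of HTML; found {len(links)} link(s):")
for link in links:
    print(" ", link)
```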

After indexing, Google proceeds to rank your page based on the more familiar factors of speed, quality, and relevance. However, if Googlebot cannot see your page, Google cannot index it properly and may not rank it at all. If Google does not index and rank your page, potential customers will not see it as a suggestion in their Google search results, damaging your traffic and, eventually, your conversions.

Why Can’t Googlebot Access Your Page?

Although users see your web page as a collection of text, photos, videos, and other content they can read and interact with, Googlebot only sees your page as a set of individual components of code. Instead of a photo, Googlebot sees the HTML tag that references the image file. Instead of a Flash component, Googlebot sees the code that embeds it. If your page has components that Googlebot cannot see or access, it reports that information back to Google.
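
You can approximate this view yourself: fetch your page’s raw HTML and check whether a key phrase actually appears in the markup, because text injected only at runtime by JavaScript will not be there. A minimal sketch, with a placeholder URL and phrase:

```python
# Check whether a phrase you expect crawlers to read is present in the raw,
# unrendered HTML. The URL and phrase are placeholders for your own page.
from urllib.request import Request, urlopen

url = "https://example.com/"
phrase = "Example Domain"  # content you expect Googlebot to see

request = Request(url, headers={"User-Agent": "example-crawler/1.0"})
with urlopen(request, timeout=10) as response:
    raw_html = response.read().decode("utf-8", errors="replace")

if phrase in raw_html:
    print("The phrase is in the raw HTML, so a crawler can see it directly.")
else:
    print("The phrase is missing from the raw HTML; it may only appear "
          "after rendering, which can hide it from crawlers.")
```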

Multiple situations can make page components unreadable. For example, components that rely on CSS, JavaScript, or Flash may be affected by faulty code. Links may be broken or blocked by robots.txt. Alternatively, your site may rely too heavily on Flash or on overcomplicated dynamic links, preventing Googlebot from viewing the page as it actually is.
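
For the robots.txt case in particular, you can test whether a given user agent is allowed to fetch a URL before involving Google at all. A small sketch using Python’s standard-library urllib.robotparser, with a placeholder domain and asset paths:

```python
# Ask a site's robots.txt whether Googlebot may fetch a page and its assets.
# The domain and paths are placeholders; substitute your own URLs.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # download and parse the robots.txt file

for url in (
    "https://example.com/page.html",
    "https://example.com/assets/app.js",
    "https://example.com/assets/styles.css",
):
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url}: {'allowed' if allowed else 'BLOCKED for Googlebot'}")
```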

URL Inspection Tool (Previously Fetch and Render) Can Help

As a site owner, and a human, you can see the contents of the page as you meant them. However, you cannot see the behind-the-scenes code or the errors that may prevent Googlebot from seeing your page. In fact, you may not even know there is an issue until you begin experiencing a sharp, unexplained drop in your click-throughs.

Fortunately, Google provides a tool that helps you determine how Googlebot sees your page, so you can identify and solve any issues that arise and ensure Google indexes your page. You can request what is essentially a practice crawl of your site and ask Google to render it, viewing your page as Google views it. Google used to call these tools “Fetch as Google” and “Fetch and Render”; Fetch and Render is now the URL Inspection Tool, and it provides the same information about Google’s indexed version of a webpage. Here’s how to use the tool:

  1. Verify. First, sign in to your Google account and verify that you have authority over the website; only then can you access the URL Inspection Tool. This step records your ownership information for the site in Google’s systems. Download the HTML verification file Search Console provides, upload it to your site, and then confirm the upload by clicking the supplied link and selecting “Verify.” Leave the HTML file in place even after you verify ownership, to ensure that your site remains verified (a quick check for this appears after the list below).
  2. Access. Second, you’ll need to access Google Search Console. Under “Domain,” the Console will prompt you to enter a URL for analysis into its search bar. Input your site’s URL and click “Continue.” You can also paste your URL under “URL prefix” if you only want to inspect URLs under the entered address rather than your entire domain. Like Fetch and Render, the URL Inspection Tool analyzes the URL based on Google’s last bot crawl.
  3. Crawl. Google will crawl the provided URL and tell you either that your URL is on Google and visible in search results or that it is not on Google. If your page is not indexed, you will receive a reason why, along with recommended solutions. Following these tips can help you fix common issues that may be making your web page uncrawlable.
  4. Inspect. When you run the URL Inspection Tool, you will see an inspection report showing how the page renders to Google. If the Googlebot-rendered page appears blank or broken, you likely have issues within your page’s code that prevent Googlebot from reading it fully. You then have an opportunity to address any crawl errors from the Search Console home screen.
  5. Repair. Select “Crawl Errors” to see the errors returned from previous crawls of your site. As mentioned, multiple types of errors of varying severity may affect how Googlebot views your site. Note that if your page does not conform to Google’s quality and security guidelines, it might not be visible even if you receive the “URL is on Google” message. The tool also does not reflect content removals or temporarily blocked URLs.
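
As a follow-up to step 1, here is a small sketch for confirming that your HTML verification file is still being served, so that your site stays verified. The domain and file name are placeholders; Search Console assigns the real file name when you verify.

```python
# Confirm the Search Console HTML verification file is still reachable.
# The URL below is a placeholder; use the file name Search Console gave you.
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

verification_url = "https://example.com/google1234567890abcdef.html"

try:
    request = Request(verification_url,
                      headers={"User-Agent": "verify-check/1.0"})
    with urlopen(request, timeout=10) as response:
        print(f"HTTP {response.status}: the verification file is in place.")
except (HTTPError, URLError) as error:
    print(f"Could not fetch {verification_url}: {error}")
    print("If the file was removed, your site may lose its verified status.")
```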

The URL Inspection Tool has other features as well, such as inspecting the indexed (not live) version of a URL, viewing a rendered version of the page, and accessing lists of loaded resources. Once you’ve diagnosed your errors, you can fix them to ensure that Googlebot is able to access your site. If you need assistance with this, your SEO provider can help you continue to optimize and ensure proper interaction between your site and Google. Then, Google will have all the information it needs to index and rank your site so you can get back to business.
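
If you would rather script these checks, the Search Console API also exposes the URL Inspection Tool programmatically. The sketch below is one possible approach using the google-api-python-client and google-auth libraries; it assumes a service account that has been added as a user on your Search Console property, and the credentials path, site URL, and page URL are all placeholders.

```python
# A sketch of the Search Console URL Inspection API (pip install
# google-api-python-client google-auth). The service-account file, site URL,
# and page URL are placeholders; the service account must be added as a user
# on the Search Console property for the call to succeed.
from google.oauth2 import service_account
from googleapiclient.discovery import build

credentials = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path to your credentials file
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)

service = build("searchconsole", "v1", credentials=credentials)
response = service.urlInspection().index().inspect(
    body={
        # For a domain property, siteUrl looks like "sc-domain:example.com";
        # for a URL-prefix property, it is the prefix itself.
        "inspectionUrl": "https://example.com/some-page/",
        "siteUrl": "https://example.com/",
    }
).execute()

# The index status mirrors, roughly, what the web UI reports: whether the
# URL is on Google and how the last crawl went.
status = response["inspectionResult"]["indexStatusResult"]
print("Verdict:   ", status.get("verdict"))  # e.g., PASS, NEUTRAL, FAIL
print("Coverage:  ", status.get("coverageState"))
print("Last crawl:", status.get("lastCrawlTime"))
```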

At Vizion Interactive, we have the expertise, experience, and enthusiasm to get results and keep clients happy! Learn more about how our SEO Audits, Local Listing Management, Website Redesign Consulting, and B2B digital marketing services can increase sales and boost your ROI. But don’t just take our word for it; check out what our clients have to say, along with our case studies.