The latest announcement from Google is an exciting one for website owners who want to know what Googlebot sees when it crawls a specific URL. This is important information if you want to correct problems with any of your website's pages, so that you rank highly in search engine results and attract more traffic to your site.
It’s only a few months since Google released the beta version of its new Search Console, and the URL inspection tool is a stand-out feature of it. The tool has only just been announced by Google and is being rolled out to Search Console users over the coming months. What can you expect?
What does the URL inspection tool do?
Google has introduced the new tool to help you see how the pages of your website are indexed, and to enable you to identify any issues that need to be addressed. All you have to do is enter a URL that you own into the tool. You will then be able to see specific information about the indexing of that URL, and how this indexing has been achieved:
- Last date on which the URL was crawled.
- The status of the crawl.
- Identified crawling or indexing errors.
- Details of the canonical URL for the page.
The information you can see is detailed. You will be able to see any enhancements that Google identified on your page, such as an AMP (Accelerated Mobile Pages) version being in place. You will also be able to see whether your page has not been indexed, and why. This information includes details of all URLs that have not been indexed for the same reason, which means you can identify, and resolve, issues with several different pages of your website at the same time.
Why is it so important that your web pages are indexed?
You will probably be aware that a good web presence is an essential tool for any business. This is why you need to invest time and money in an effective content marketing strategy. The problem is that there is little point in making this investment if your website is not included in search results, so that it can attract traffic. This is what happens if your web pages are not indexed as they should be. There are several reasons for pages not being indexed.
Google cannot see the page
Google does not automatically see every new web page that is created. If you create a new page, make sure that other pages on your site link to it, so that Google is more likely to discover it. You can also submit a new sitemap to Google using the Search Console, or simply tell Google that you have created a new page by requesting indexing of the URL.
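To illustrate, a sitemap is just an XML file listing the URLs you want Google to discover; the sitemap protocol defines the format. The domain, path and date below are placeholders, not values from your site:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to find -->
  <url>
    <loc>https://www.example.com/new-page/</loc>
    <!-- Optional: when the page was last modified -->
    <lastmod>2018-07-01</lastmod>
  </url>
</urlset>
```

Once the file is uploaded to your site (typically the root directory), you can submit its URL via the Sitemaps section of Search Console.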
The noindex directive is in place
This directive is normally used when you do not want Google to index a particular page. It will block Google from indexing the page if it is still in place by mistake. The noindex directive appears in the head of a page (or in an X-Robots-Tag HTTP header); a related cause is a Disallow rule in the robots.txt file in your root directory, which blocks Google from crawling the page at all. If you have any problems identifying the directive, talk to your web designer.
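As a sketch, this is what the noindex directive looks like in a page's head; whether it belongs on any given page depends on your site:

```html
<!-- Placed in the <head> of a page that should NOT appear in search results -->
<meta name="robots" content="noindex">
```

The same directive can also be sent as an X-Robots-Tag HTTP response header, which is useful for non-HTML resources such as PDFs.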
You have been penalised by Google
There are certain black hat techniques that result in pages being de-indexed by Google. You may need to check that you are not using any of these techniques on any of your pages.
Your website has been compromised
If Google spots that your website has been hacked, or that issues such as malware are present, it will block your pages from being shown to web users in order to protect them. You can use the Security Issues report in Search Console to check for any problems.
These are some of the most common reasons why web pages are not indexed.
You can see why it’s so important that each of your web pages is indexed. Using the URL inspection tool enables you to see if this is the case, and to identify problems. The tool has been released at the beta stage and will be rolled out to Search Console users over the coming months. As soon as you are able to access the tool, it’s a good idea to do so. You can keep track of the crawling and indexing of all of your web pages and help to ensure that you feature as highly as possible in search engine results.