A new study has found that the structure of many blogs could be having a detrimental effect on their search engine rankings. The main issue is link depth: the number of clicks needed to get from the homepage of the blog to the relevant content. The study, by Perficient Digital, found that two-thirds of websites have a link depth higher than five. This is likely to hurt PageRank and the ability of search engine bots to crawl the content. Shockingly, the study found that ‘31.5% of the posts were 21 clicks or more from their respective home pages’, ‘9.5% of the posts were 50 or more clicks away’ and some posts were even more than 1,000 clicks away.
The problem with high link depth
The main problem with a high link depth is that your content is essentially buried beneath multiple layers of links. This, in turn, sends a strong signal to search engines that your content may not be relevant to the user’s query and, as a result, eats into the website’s crawl budget. What’s more, if one page in the chain of links returns an error, the crawler will be unable to reach any of the pages further down the chain, so your content may not be crawled effectively for long periods of time. Keep in mind that your link depth should ideally be between 3 and 5, depending on your website’s size and structure. You can easily check your average link depth with the help of online SEO analytics tools.
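To make the idea concrete, link depth can be computed as a shortest-path (breadth-first) search from the homepage over the site’s internal links. The sketch below uses a small, hypothetical site graph; real tools crawl the live site to build this map.

```python
from collections import deque

def link_depths(links, home):
    """Breadth-first search from the homepage: each page's depth is the
    minimum number of clicks needed to reach it. `links` maps a page URL
    to the URLs it links out to (a simplified, hypothetical site graph)."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical blog structure: the post ends up 3 clicks from the homepage.
site = {
    "/": ["/blog"],
    "/blog": ["/blog/page-2"],
    "/blog/page-2": ["/blog/my-post"],
}
print(link_depths(site, "/"))
# {'/': 0, '/blog': 1, '/blog/page-2': 2, '/blog/my-post': 3}
```

Note that if any page in that chain 404s, every page below it becomes unreachable in the graph, which is exactly the crawl problem described above.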
How to identify your link depth culprits
A number of factors may be affecting your link depth. Here are four of the most common culprits to be aware of:
1. Pagination
While pagination can be an essential tool for improving user experience and reducing server load, lengthy paginated lists can cause significant SEO problems. For example, if a list runs to 40 pages but the pagination bar only links as far ahead as page 6, reaching the later pages takes many successive clicks. Pagination can also be a catalyst for duplicate content issues, causing search engine crawlers to waste precious resources indexing low-quality content.
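The cost of that 40-page example can be estimated with a little arithmetic: if the pagination bar only links up to 5 pages ahead, each click advances at most 5 pages. This is a simplified model of the scenario above, not a universal rule.

```python
import math

def clicks_to_page(target, window=6):
    """With a pagination bar that links at most up to page (current + window - 1),
    reaching `target` from page 1 takes repeated jumps of at most (window - 1)
    pages. A simplified model; real pagination bars vary."""
    return math.ceil((target - 1) / (window - 1))

print(clicks_to_page(40))  # 8 clicks of pagination depth before the post itself
```

Eight extra clicks of depth from pagination alone is enough to push a post well past the recommended 3–5 range.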
2. Faceted Navigation
Filters on large sites help users narrow down search results to find a desired product or piece of content. However, faceted navigation is often construed as an SEO nightmare: a strategy with poorly chosen filter combinations can create new pages at volume, quickly increasing link depth, swallowing crawl budget and diluting link equity. Website developers are advised to avoid combining multiple filters simultaneously (two at most) to maximise traffic potential and limit the number of URL variations generated. Be aware that there is no ‘one-size-fits-all’ strategy and you may have to manually create indexable landing pages for some key filter combinations.
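The scale of the problem is easy to underestimate. Under the simplifying assumption that every combination of one value per selected facet gets its own URL, the counts below (facet sizes are made up for illustration) show why capping simultaneous filters at two matters:

```python
from itertools import combinations
from math import prod

def facet_urls(value_counts, max_filters=None):
    """Counts the distinct filter-combination URLs a faceted navigation can
    generate, optionally capped at `max_filters` simultaneous facets.
    Assumes one value per selected facet and a unique URL per combination
    (a simplified model of faceted navigation)."""
    cap = max_filters if max_filters is not None else len(value_counts)
    total = 0
    for k in range(1, cap + 1):
        for combo in combinations(value_counts, k):
            total += prod(combo)
    return total

# Hypothetical shop: size (5 values), colour (4), brand (3), price band (6).
print(facet_urls([5, 4, 3, 6]))                 # 839 URLs with no cap
print(facet_urls([5, 4, 3, 6], max_filters=2))  # 137 URLs capped at two filters
```

Capping filters at two cuts the generated URL space by over 80% in this toy example, which is crawl budget the search engine can spend on real content instead.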
3. Tracking parameters in URLs
Tracking parameters deliver useful information about the source of a click through the site’s URL. Because parameters can be combined in a huge number of ways, they can quickly generate URL variations en masse. Many low-value URLs displaying similar, if not identical, content can lead to keyword cannibalisation and harm site quality. Parameter-laden URLs also tend to look less clickable and trustworthy, which can hurt click-through performance.
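One common remedy is to collapse these variants back to a canonical URL by stripping known tracking parameters. The sketch below uses Python’s standard library; the parameter list is illustrative, not exhaustive.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Illustrative set of common tracking parameters; extend for your own analytics.
TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign",
                   "utm_term", "utm_content", "gclid", "fbclid"}

def canonicalise(url):
    """Removes known tracking parameters so URL variants collapse back to
    one canonical address, keeping any functional query parameters."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(query) if k not in TRACKING_PARAMS]
    return urlunsplit((scheme, netloc, path, urlencode(kept), fragment))

print(canonicalise("https://example.com/post?utm_source=news&page=2"))
# https://example.com/post?page=2
```

In practice you would pair this with `rel="canonical"` tags so search engines consolidate signals onto the clean URL.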
4. Broken URLs
Broken links on a website can be internal or external. Internal links often return a 404 response because the page the link points to no longer exists; broken external links take users from your website to a non-existent page elsewhere. Both types of broken link damage user experience and are likely to diminish your click-through rates. If these issues are not resolved quickly, Google will notice the deterioration and reflect it in your search rankings.
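Finding these links is straightforward once you have crawl data. A minimal sketch, assuming you already have (URL, HTTP status) pairs from a crawl log; a real checker would fetch each URL, follow redirects and retry transient failures:

```python
def broken_links(link_statuses):
    """Given (url, http_status) pairs from a crawl, returns the links that
    came back with a 4xx client error or 5xx server error."""
    return [url for url, status in link_statuses if status >= 400]

# Hypothetical crawl results.
crawl = [("/about", 200), ("/old-post", 404), ("/partner-site", 500)]
print(broken_links(crawl))  # ['/old-post', '/partner-site']
```

Each broken internal link found this way should be fixed or redirected, since it may be severing the crawl path to everything linked beneath it.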
Ways to amend your blog content structure
Fortunately, there are some simple ways to amend your blog content structure and reduce your link depth. Here are some key actions to consider:
Add internal links to valuable content
Determine what content on your site is most valuable to you and most importantly to your audience. Adding plenty of internal links – such as navigational links and contextual links – will help demonstrate to search engines that this content has value and should be seen by users. As a general rule of thumb, the optimal internal linking structure follows a pyramid hierarchy. See here for a trusted guide to internal linking.
Systemise your site structure
Ease of use and the display of original, relevant content should be at the core of your website’s ambitions. Cleaning up your blog’s structure doesn’t have to be an overly technical process and can be achieved through numerous strategies: succinct site menus, functional categories and tags, and keeping content updated and in line with Google algorithm updates are a few examples.
Rethink your content strategy
If you are new to the SEO scene, prioritise quality content over quantity. Make sure you check your old content and either revive it or remove anything that is outdated, low quality, or no longer relevant. However, before deleting any content, be sure to perform rigorous audits so as not to upset your rankings in the SERP. Auditing your content will help you evaluate what adds value to your site and what is harming it. When refining your website, consider whether it meets Google’s E-A-T requirements and aligns with new algorithm updates before making any changes.
Final thought
Businesses should be aware that their blog structure can have a significant impact on link depth and overall SEO ranking. It is important to take action to improve your website so that your pages are effectively crawled by search engine bots and remain accessible to users. Use the tips above to help identify and fix any issues with your link depth, or alternatively speak to us for specialist advice.