27 of the Biggest SEO Mistakes Damaging Websites

The fight to keep pace with continual updates and avoid technical site issues is part of daily life for webmasters. 

Even if you are already aware of a number of problems with your website, it can be a struggle to maintain its health in the ever-changing world of SEO.

With a firm grasp of the most common (and potentially harmful) mistakes, though, you can give yourself a fighting chance of keeping technical issues to a minimum and website performance to a maximum. And knowing some of the best SEO practices certainly helps, too. 

This guide gives you a comprehensive site audit checklist that will help you do exactly that as a webmaster, no matter how large or small your site might be.

How We Gathered the Data

We ran 250,000 websites from a range of niches, including health, travel, sports and science, through the Semrush Site Audit tool to find the most prevalent SEO mistakes holding them back. 

In total, we analyzed:

310,161,067 webpages

28,561,137,301 links

6,910,489,415 images

This breadth of analysis gave us enough insight to create a comprehensive site audit template that webmasters can use to avoid the mistakes themselves.

Creating a Site Audit Template Backed by Research

There is no escaping the fact that a properly conducted site audit is a time-consuming task.

Our study revealed 27 common mistakes, which obviously can’t all be tackled at once, so we have broken the list down into digestible chunks you can use as an actionable template. 

Ignoring HTTP Status and Server Issues

The most critical technical issues with a website are often related to its HTTP status. 

These include status codes like 404 (Not Found), which indicate the server’s response to a request from a client, such as a browser or a search engine crawler.

When the dialogue between a client and a server — or, in simpler terms, a user and your website — gets interrupted and breaks down, so too does the trust the user has in the site. 

Serious server issues may not only lead to lost traffic because of inaccessible content, but they may also damage your rankings in the long run if they leave Google unable to find any suitable results on your site for the searcher. 

Mistakes Affecting Your HTTP Status:

1. 4xx errors: 4xx codes mean that a page is broken and cannot be reached. They can also apply to working pages when something is blocking them from being crawled.

2. Pages not crawled: This occurs when a page cannot be reached for one of two reasons: 1) the response time of your website is over five seconds; or 2) your server denied access to the page. 

3. Broken internal links: These are links that lead users to a non-functioning page on your own site, which can damage both UX and SEO.

4. Broken external links: These are links that lead users to pages that don’t exist on another site, which sends negative signals to search engines.

5. Broken internal images: This is flagged when an image file no longer exists, or its URL is misspelled. 

Other common HTTP status mistakes include:

Permanent redirects
Temporary redirects
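Status-code issues like these can be bucketed programmatically when you work through a crawl export. Here is a minimal sketch in Python; the function name and category labels are illustrative, not part of any audit tool’s API:

```python
# Hypothetical helper: bucket HTTP status codes the way an audit report might.
def classify_status(code: int) -> str:
    """Return a rough audit category for an HTTP status code."""
    if 200 <= code < 300:
        return "ok"
    if code in (301, 308):
        return "permanent redirect"
    if code in (302, 303, 307):
        return "temporary redirect"
    if 400 <= code < 500:
        return "broken (4xx client error)"
    if 500 <= code < 600:
        return "server error"
    return "other"

for code in (200, 301, 302, 404, 503):
    print(code, "->", classify_status(code))
```

Running every crawled URL’s response code through a bucketer like this makes it easy to count 4xx pages and separate permanent from temporary redirects before fixing them in priority order.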

Under-optimizing Meta Tags

Your meta tags help search engines identify the subject matter of your pages so they can connect them with the keywords and phrases used by searchers. 

Creating the right title tags means choosing the relevant keywords to form a unique and click-worthy link for users in the search engine results pages (SERPs). 

The meta descriptions give you additional opportunities to include keywords and related phrases. 

They should be as unique and tailored as possible — if you don’t create your own, Google will automatically generate them based on the keywords in users’ queries, which can sometimes lead to mismatched search terms and associated results. 

Optimized title tags and meta descriptions need to include the most appropriate keywords, be the correct length and avoid duplication as much as possible.

Some industries, such as ecommerce fashion, are unable to create unique descriptions for every single product, so they need to offer unique value in other areas of their landing pages’ body copy. 

Where unique metadata is possible, though, you should head in that direction to give your site the best chance of maximizing its impact in the SERPs.

The Most Common Meta Tag Mistakes that May Hurt Your Rankings:

6. Duplicate title tags and meta descriptions: Two or more pages with the same titles and descriptions make it difficult for search engines to properly determine relevance and, in turn, rankings.

7. Missing H1 tags: H1 tags help search engines determine the topic of your content. If they are missing, there will be gaps in Google’s understanding of your website. 

8. Missing meta descriptions: Well-written meta descriptions help Google understand relevance and encourage users to click on your result. If they are missing, click-through rates can fall.

9. Missing ALT attributes: ALT attributes provide search engines and visually impaired people with descriptions of the images in your content. Without them, relevance is lost and engagement can suffer. 

10. Duplicate H1 tags and title tags: When the H1 tag and title tag on a page are identical, the page can look over-optimized, and it can mean missed opportunities to rank for other relevant keywords.

Other common meta tag mistakes include:

Short / long title elements
Multiple H1 tags
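To see how checks like these might look in practice, here is a minimal sketch using Python’s built-in html.parser. It flags the missing and duplicate tag issues above for a single page; the rules and thresholds are simplified illustrations, not any tool’s actual logic:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the tags relevant to mistakes 6-10: title, meta description, H1s."""
    def __init__(self):
        super().__init__()
        self.title = None
        self.description = None
        self.h1s = []
        self._in = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in = "title"
        elif tag == "h1":
            self._in = "h1"
            self.h1s.append("")
        elif tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag in ("title", "h1"):
            self._in = None

    def handle_data(self, data):
        if self._in == "title":
            self.title = (self.title or "") + data
        elif self._in == "h1":
            self.h1s[-1] += data

def audit(html: str) -> list:
    """Return a list of simplified meta tag issues for one page."""
    p = MetaAudit()
    p.feed(html)
    issues = []
    if not p.title:
        issues.append("missing title")
    elif len(p.title) > 60:
        issues.append("title longer than 60 characters")
    if not p.description:
        issues.append("missing meta description")
    if not p.h1s:
        issues.append("missing H1")
    elif len(p.h1s) > 1:
        issues.append("multiple H1 tags")
    if p.title and p.h1s and p.title == p.h1s[0]:
        issues.append("H1 duplicates the title tag")
    return issues

page = "<html><head><title>Widgets</title></head><body><h1>Widgets</h1></body></html>"
print(audit(page))  # ['missing meta description', 'H1 duplicates the title tag']
```

Run across a full crawl, collecting titles and descriptions per URL would also let you group duplicates site-wide, which is where mistake 6 actually shows up.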

Creating Duplicate Content

Duplicate content can damage your rankings, potentially for a long time.

You should steer clear of duplicating content from any other site, whether it is a direct competitor or not. 

Look out for duplicate descriptions, paragraphs and entire sections of copy, duplicate H1 tags across multiple pages and URL issues, such as www and non-www versions of the same page.

Pay attention to the uniqueness of every detail to make sure a page is not only rankable in Google’s eyes, but also clickable in users’ eyes. 

The Most Common Duplication Issues that Hold Sites Back:

11. Duplicate content: The Site Audit tool flags duplicate content when pages on your website have the same URL or copy, for instance. It can be resolved by adding a rel="canonical" link on the duplicate pointing to the preferred page, or by using a 301 redirect.

Other common duplication mistakes include:

Duplicate H1 tags and title tags
Duplicate meta descriptions
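One way to surface www/non-www and trailing-slash duplicates before deciding where to point a canonical tag or 301 redirect is to normalize every crawled URL to a shared key. A rough sketch, with illustrative normalization rules (requires Python 3.9+ for removeprefix):

```python
from urllib.parse import urlsplit

def normalize(url: str) -> str:
    """Collapse scheme, www/non-www, and trailing-slash variants into one key,
    so pages that are really the same URL group together in an audit."""
    parts = urlsplit(url.lower())
    host = parts.netloc.removeprefix("www.")   # treat www and non-www alike
    path = parts.path.rstrip("/") or "/"       # treat /page and /page/ alike
    return host + path

urls = [
    "https://example.com/page/",
    "http://www.example.com/page",
    "https://example.com/other",
]
groups = {}
for u in urls:
    groups.setdefault(normalize(u), []).append(u)

# Any key with more than one URL is a likely duplicate-content candidate.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}
print(duplicates)  # {'example.com/page': ['https://example.com/page/', 'http://www.example.com/page']}
```

Each duplicate group then gets one preferred URL; the rest should canonicalize or redirect to it.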

Neglecting Internal and External Link Optimization

The links that guide your visitors through your customer journeys can damage your overall user experience and, in turn, your search performance if they are broken or poorly optimized. Google is unlikely to reward sites that deliver a poor user experience.

This study revealed that close to half of the sites we ran through the Site Audit tool have problems with both internal and external links, which would suggest that their individual link architectures are not optimized. 

Some of the links themselves have underscores in the URLs, contain nofollow attributes, or point to HTTP pages instead of HTTPS, all of which can impact rankings. 

You can find broken links on your site with the Site Audit tool; the next step would be for you to identify which ones are having the biggest effect on your user engagement levels and to fix them in order of priority.

The Most Common Linking Issues that May Impact Your Rankings:

12. Links that lead to HTTP pages on an HTTPS site: Links to old HTTP pages may trigger an insecure dialogue between users and the server, so be sure to check that all of your links are up to date.

13. URLs containing underscores: Search engines may misinterpret underscores and incorrectly document your site index. Stick to using hyphens instead.

Other common linking mistakes include:

Broken internal links
Broken external links
Nofollow attributes in external links
Pages with only one internal link
Page crawl depths of more than 3 clicks
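Mistakes 12 and 13 are easy to check mechanically across a list of crawled links. A small illustrative Python check (the function name and messages are hypothetical, not any tool’s output):

```python
from urllib.parse import urlsplit

def link_issues(url: str, site_is_https: bool = True) -> list:
    """Flag mistakes 12-13: HTTP links on an HTTPS site, underscores in URLs."""
    issues = []
    parts = urlsplit(url)
    if site_is_https and parts.scheme == "http":
        issues.append("links to an HTTP page from an HTTPS site")
    if "_" in parts.path:
        issues.append("URL path contains underscores (prefer hyphens)")
    return issues

print(link_issues("http://example.com/my_page"))
# ['links to an HTTP page from an HTTPS site', 'URL path contains underscores (prefer hyphens)']
print(link_issues("https://example.com/my-page"))  # []
```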

Making Things Difficult for Crawlers

Crawlability sits alongside indexation issues as one of the crucial health indicators of a website.

There is ground to be both lost and gained in the SERPs when it comes to the crawlability of your site. 

If you ignore any crawling issues from a technical SEO perspective, some of the pages on your site might not be as visible as they should be to Google. 

If you fix any crawling issues, however, Google will be more likely to identify the right links for the right users in the SERPs. 

You can avoid technical issues by assessing your site for broken or blocked elements that restrict its crawlability. 

Kevin Indig, VP SEO & Content at G2.com, emphasizes the importance of synergy between sitemaps and robots here:

What surprised me is that many XML sitemaps are not referenced in the robots.txt. That seems like a standard to me. What’s not surprising is the high degree of sites with only one internal link to pages or even orphaned pages. That’s a classic site structure issue that only SEOs have the awareness for.

The absence of a sitemap.xml reference in your robots.txt file, for example, can lead to search engine crawlers misinterpreting your site architecture, as Matt Jones, SEO and CRO Manager at Rise at Seven, says: 

As sitemap.xml files can help search engine crawlers identify and find the URLs that exist across your website, allowing them to crawl these [is] definitely a fantastic way to help search engines gain an in-depth understanding of your website and, in turn, gain higher rankings for more relevant terms.

The Most Common Problems Encountered by Website Crawlers:

14. Nofollow attributes in outgoing internal links: Internal links that contain the nofollow attribute block any potential link equity from flowing through your site.

15. Incorrect pages found in sitemap.xml: Your sitemap.xml should contain no broken pages. Check it for redirect chains and non-canonical pages, and make sure the listed pages return a 200 status code.

16. Sitemap.xml not found: Missing sitemaps make it more difficult for search engines to explore, crawl and index the pages of your site. 

17. Sitemap.xml not specified in robots.txt: Without a link to your sitemap.xml in your robots.txt file, search engines will not be able to fully understand the structure of your site.

Other common crawlability mistakes include:

Pages not crawled
Broken internal images
Broken internal links
URLs containing underscores
4xx errors
Resources formatted as page links
Blocked external resources in robots.txt
Nofollow attributes in outgoing external links
Blocked from crawling
Pages with only one internal link
Orphaned sitemap pages
Page crawl depths of more than 3 clicks
Temporary redirects
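Mistake 17 above can be verified by scanning robots.txt for Sitemap: directives, which the robots.txt format allows anywhere in the file. A minimal sketch (the function name is illustrative):

```python
def sitemap_urls(robots_txt: str) -> list:
    """Return the Sitemap: entries declared in a robots.txt body.
    Mistake 17 is when this list comes back empty."""
    urls = []
    for line in robots_txt.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls

robots = """User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml"""
print(sitemap_urls(robots))  # ['https://example.com/sitemap.xml']
```

An empty result for a site that does have a sitemap.xml means the reference simply needs to be added to robots.txt.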

Ignoring Indexability

Good indexability indicators are vital for SEO. Put simply, if a page is not indexed, it cannot appear in search results, so it will not be seen by users. 

There are many factors that can prevent your website from being indexed, even if you seem to have no issues with crawlability. 

Duplicate meta data and content, for instance, can make it difficult for search engines to identify which pages to rank for certain similar search terms. 

You can see from our research above that almost half of the sites we audited are suffering from indexing issues caused by duplicate title tags, descriptions and body content. 

This may mean that Google is being forced into making decisions about which pages to rank, despite the fact that webmasters can preempt problems like these and tell Google what to do. 

A range of different issues can affect the indexability of your site, from low word count to hreflang gaps or conflicts for multilingual websites.

The Most Common Issues with Un-Indexable Websites:

18. Short / long title tags: Title tags of over 60 characters are cut short in the SERPs, while those well under 60 characters might be missed opportunities for further optimization. 

19. Hreflang conflicts within page source code: Multilingual websites can confuse search engines if the hreflang attribute is in conflict with the source code of any given page. 

20. Issues with incorrect hreflang links: Broken hreflang links can create indexing issues if, for example, relative URLs are used instead of absolute ones: /blog/your-article instead of https://yourwebsite.com/blog/your-article.

21. Low word counts: The Site Audit tool can flag pages that appear to be lacking in content, so it is worth reviewing these to make sure they are as informative as possible.

22. Missing hreflang and lang attributes: This issue is triggered when a page on a multilingual site is missing the links or tags that tell search engines which language version to serve users in each region.

23. AMP HTML issues: This issue concerns mobile users of your website and is flagged when the HTML code does not align with AMP standards.

Other common indexability mistakes include: 

Duplicate H1 tags
Duplicate content
Duplicate title tags
Duplicate meta descriptions
Missing H1 tags
Multiple H1 tags
Hreflang language mismatch issues
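Mistake 20 above (relative hreflang URLs) can be caught with a simple absolute-URL check. A sketch in Python; the extra x-default check is a common best practice, not something the study itself flags:

```python
from urllib.parse import urlsplit

def hreflang_link_issues(links: dict) -> list:
    """links maps hreflang codes to URLs. Flag relative URLs (mistake 20)
    and a missing x-default entry. Illustrative checks only."""
    issues = []
    for lang, url in links.items():
        parts = urlsplit(url)
        if not parts.scheme or not parts.netloc:
            issues.append(f"{lang}: relative URL {url!r} (must be absolute)")
    if "x-default" not in links:
        issues.append("no x-default hreflang entry")
    return issues

print(hreflang_link_issues({
    "en": "https://example.com/blog/your-article",
    "de": "/de/blog/your-article",
}))
```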

Forgetting Accelerated Mobile Pages (AMPs)

It is vital to gear your on-page SEO towards having a mobile-friendly site. 

We know that Google will make mobile-first indexing the default for all websites, on both mobile and desktop, in September 2020. 

This means that, as webmasters, you need to make sure your site’s HTML code complies with Google’s AMP guidelines before then to be mobile-ready and avoid potential damage to your search performance. 

Check for invalid AMP pages on your site with the Site Audit tool so you can see what needs fixing; it may come down to your HTML, your style and layout or your page templates. 

The Most Common Issue Related to Mobile-Friendliness:

24. AMP HTML issues: These can be related to style or layout and, as mentioned above, can affect the indexability of a site. 

Failing to Address Site Performance

Page load time is becoming increasingly important in SEO. The slower your site, the less likely it is to engage the users who have the patience to wait for it to load.

You can get page speed suggestions for mobile and desktop directly from Google’s PageSpeed Insights tool, which helps you measure page speed and identify opportunities to make your site faster. 

The Google site speed test used in conjunction with the Semrush Site Audit tool might reveal, for example, overcomplicated JavaScript or CSS files (as it did with many of the sites in our study). 

Gerry White, SEO Director at Rise at Seven, suggests that code minifying is a quick win as far as site performance and user experience are concerned:

One of the things that stands out in the data is the amount of quick wins for page speed. It isn’t just about rankings but also about the user and conversion — simple quick wins that can usually be delivered without too much development effort is where I would focus my efforts on that front. Tasks such as compressing JavaScript and CSS take minutes to do, but can make huge improvements on many websites. This should be combined with ensuring that HTTPS is enabled with HTTP2.

The Most Common Issues with Website Performance: 

25. Slow page (HTML) load speed: The time it takes for a page to be fully rendered by a browser should be as short as possible, as speed directly affects your rankings. 

26. Uncached JavaScript and CSS files: This issue may be tied to your page load speed and occurs when browser caching is not specified in the response header. 

27. Unminified JavaScript and CSS files: This issue is about making your JavaScript and CSS files smaller. Remove unnecessary lines, comments, and white space to improve page load speed.
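To illustrate mistake 27, here is a toy minifier that strips comments and collapses whitespace. It shows what minification does to a stylesheet; production sites should rely on dedicated minifiers rather than regexes like these:

```python
import re

def minify_css(css: str) -> str:
    """Naive CSS minification sketch: strip comments and collapse whitespace.
    Real minifiers handle strings, data URIs, and edge cases; this does not."""
    css = re.sub(r"/\*.*?\*/", "", css, flags=re.S)  # remove /* comments */
    css = re.sub(r"\s+", " ", css)                   # collapse runs of whitespace
    css = re.sub(r"\s*([{};:,])\s*", r"\1", css)     # tighten around punctuation
    return css.strip()

css = """
/* buttons */
.btn {
    color: #fff;
    margin: 0 auto;
}
"""
print(minify_css(css))  # .btn{color:#fff;margin:0 auto;}
```

The same idea applies to JavaScript, though JS minification additionally renames variables and removes dead code, which is well beyond a regex pass.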

Multicategory Issues

In some cases, errors, warnings and notices picked up by the Site Audit tool will fall under several categories. 

This means they can cause a range of problems for your website, as illustrated below, so it is recommended that they are addressed as priorities.

The Importance of Leveraging Site Audit Tools, Tips and Tricks

Committing any of these SEO mistakes can hold your website back from reaching its full potential, so it is vital that you keep on top of them as a webmaster with regular site audits.

Whether you are suffering from crawlability issues preventing pages from being indexed, or duplication issues risking possible penalties, you can use this checklist to stop molehills becoming mountains. 

Make a habit of looking after your SEO and UX health with tools like the Site Audit tool and you will be rewarded with the kind of search visibility and user engagement that have positive impacts on your bottom line.

Download and print our comprehensive PDF poster on Biggest SEO Mistakes to keep the checklist handy.

Innovative SEO services

SEO is a patience game; no secret there. We’ll work with you to develop a search strategy focused on producing increased traffic and rankings in as little as three months.

A proven, all-inclusive SEO service for measuring, executing, and optimizing for search engine success. We say what we do and do what we say.

As a Semrush Agency Partner, our company has designed a search engine optimization service that is both ethical and results-driven. We use the latest tools, strategies, and trends to help you move up in the search engines for the right keywords and get noticed by the right audience.

Schedule a discovery call with us today to discuss your company’s needs.
