By crawling 100,000 websites, a total of 450,000 pages, with its tools, SEMrush identified the most widespread on-site technical errors and their (nightmarish) consequences for SEO.
Errors (or limiting factors) are classified from 1 to 5 according to their level of severity; a short comment accompanies each criterion.
I am sharing the French version of this infographic, along with a reminder of this article on on-page optimization best practices.
A free translation by K. Maret, SEO Project Manager, Montreal
Crawlability: the site's ability to be crawled by bots
- Crawlability: almost 17% of the sites audited are misconfigured
- Chain redirects: This type of fault affects 5.49% of the sites analyzed in the study.
Internal links and redirects
Internal linking and redirect problems are quite varied. Some penalize the site seriously; others are not really errors at all. Verify that your main pages are accessible to users and do not lead to a 404 or an unplanned 301/302 redirect. It seems that one site in four has redirect errors.
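As an illustration, redirect chains (covered below) can be spotted offline from a crawl export. This is a minimal sketch assuming a hypothetical `redirects` map of source URL to target URL; it is not part of the SEMrush toolset:

```python
# Sketch: flag redirect chains in a hypothetical map of known redirects
# (source URL -> target URL), e.g. exported from a site crawl.
def find_redirect_chains(redirects, max_hops=1):
    """Return redirect paths longer than max_hops, which waste crawl budget."""
    chains = []
    for start in redirects:
        path = [start]
        url = start
        while url in redirects and len(path) <= 10:  # guard against loops
            url = redirects[url]
            path.append(url)
        if len(path) - 1 > max_hops:
            chains.append(path)
    return chains

redirects = {
    "/old-page": "/new-page",
    "/new-page": "/final-page",  # chain: /old-page -> /new-page -> /final-page
}
print(find_redirect_chains(redirects))
```

A chain like `/old-page -> /new-page -> /final-page` should be collapsed so the first URL redirects straight to the final one.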
Broken internal links
33% of the sites in this large-scale audit are affected in some way. Other link-related issues flagged by the study:
- Temporary redirects
- Internal links in nofollow
- Broken external links
- External links in nofollow
- Excess of external links
Sitemap.xml
The heart and soul of your website deserves the best attention.
80% of website owners know this and act on it. The sitemap.xml guides bots through your site and shows them the way to your essential pages. Configure the sitemap.xml correctly and submit it to Google Search Console, Bing Webmaster Tools, and other search engines.
The problems encountered include:
- Format errors
- Page display errors
- Sitemap not found
- Forgotten files
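For reference, a minimal well-formed sitemap.xml looks like this (example.com and the dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/essential-page/</loc>
    <lastmod>2018-01-15</lastmod>
  </url>
</urlset>
```

Each `<loc>` should be a canonical, indexable URL; listing redirected or blocked pages is one of the format errors audit tools complain about.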
Robots.txt
Errors in the robots.txt file can affect the indexing of the site: pages marked “noindex”, which you do not want indexed, could still be taken into account by crawlers.
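Keep in mind that robots.txt controls crawling, not indexing. A minimal sketch with placeholder paths:

```text
# robots.txt — keep bots out of low-value sections
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://www.example.com/sitemap.xml
```

A page you want kept out of the index needs a `<meta name="robots" content="noindex">` tag and must not be blocked in robots.txt; if crawlers cannot fetch the page, they never see the directive.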
URL structure
A good site architecture and clear URLs of reasonable sizes are more meaningful for both bots and users.
SEO-friendly URLs can have a big impact on user experience and ranking.
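For example (placeholder URLs):

```text
Opaque:       https://www.example.com/index.php?id=742&cat=3
SEO-friendly: https://www.example.com/blog/onpage-seo-checklist/
```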
On-page optimization
The content
Website owners believe that not every page of a site deserves perfect SEO writing, and often treat this criterion as minor. It nevertheless demands constant attention.
In addition, a text-to-code ratio that is too low can be read as a negative signal about code quality, while the length of the texts ranking in the top 10 grows every year and is trending toward 1,500 words!
Think about structuring your texts for SEO and writing “search engine oriented” while respecting the fundamentals of web writing.
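To make the text-to-code ratio concrete, here is a minimal sketch using Python's standard `html.parser`; real audit tools are more sophisticated, but the idea is simply visible text length divided by total HTML length:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect visible text, skipping <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self.skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.parts.append(data)

def text_to_code_ratio(html):
    """Ratio of visible text characters to total HTML characters."""
    parser = TextExtractor()
    parser.feed(html)
    text = "".join(parser.parts).strip()
    return len(text) / max(len(html), 1)

page = "<html><body><script>var x=1;</script><p>Hello world</p></body></html>"
print(text_to_code_ratio(page))
```

A page drowning its text in markup and scripts scores low; trimming dead code raises the ratio without touching the content.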
The Semrush study notes very high error rates for:
- Duplicate content
- Text-to-code ratio
- Thin content
Meta description
Search engines do not treat this meta tag as a problem, since it is not a direct ranking signal. This is why so many sites display banal, poorly worked descriptions. It is nevertheless better to decide for yourself what the meta description shows in the snippet than to leave that choice to the search engines.
Title tag
The study shows that webmasters do not give this essential tag enough importance. Yet it is decisive at the moment the Internet user chooses whether to click on a result in the SERP.
The errors pointed out are:
- Missing tag
- Title too long
- Duplicate Title
- Title too short
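A head section that avoids the title errors above and the meta description problem might look like this (all text is placeholder content; the character limits are common guidelines, not hard rules):

```html
<head>
  <!-- Unique per page, roughly 50-60 characters -->
  <title>On-Page SEO Checklist: Common Errors to Fix | Example.com</title>
  <!-- Roughly 150-160 characters, written for the searcher -->
  <meta name="description"
        content="A practical checklist of the most common on-page SEO errors
                 and how to fix them, from duplicate titles to thin content.">
</head>
```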
H1 Tag
The h1 is an important tag that authors still neglect too much (according to the study), yet it is the one that defines the context of the page.
Here again, too many pages show this type of defect
- No h1
- A duplicated h1
- Multiple h1 tags
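A page that avoids all three defects keeps exactly one h1 and pushes subtopics into h2 (illustrative markup):

```html
<body>
  <h1>One descriptive h1 that sets the context of the page</h1>
  <h2>Subtopics go in h2, not in additional h1 tags</h2>
</body>
```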
Images
Many websites lack images or fail to optimize them… Too bad for their visibility in Google Images and for display speed, not to mention the bounce rate caused by an image that is slow to appear. Think about it, or read this article on image optimization.
- Broken image link
- Missing alt attribute
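A minimal example of an image tag that avoids both defects (the path and text are placeholders):

```html
<img src="/images/seo-errors-chart.png"
     alt="Chart of the most common on-site SEO errors"
     width="600" height="400">
```

A working `src`, a descriptive `alt`, and explicit dimensions also help the browser lay out the page before the image arrives.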
Technical aspects of the site that concern SEO
Display speed
Who wants the Internet user to fall asleep in front of the screen, while your pages load?
If your display speed is poor, you will also lose ground in rankings dedicated to mobile devices.
- Slow display
- Excess code, especially scripting
Outdated technologies
Engines don’t like technologies that are obsolete or no longer maintained. They cause indexing problems and missing or incomplete code.
Updating your site is essential.
- Drop frames and Flash, and declare your doctype
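Declaring the doctype takes one line at the very top of the document; the HTML5 form is trivial:

```html
<!DOCTYPE html>
<html lang="en">
  <!-- head and body as usual -->
</html>
```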
Mobile
Mobile search is exploding, and the study finds that websites have kept up and show few problems on this front. On the other hand, it says nothing about AMP versions… perhaps the next must-have of the algorithm.
AMP pages should declare their canonical URLs and configure the viewport correctly; your mobile site will be all the better for it.
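A sketch of the cross-references, assuming placeholder example.com URLs: the canonical page points to its AMP version, and the AMP page points back.

```html
<!-- On the canonical (desktop) page -->
<link rel="amphtml" href="https://www.example.com/article/amp/">

<!-- On the AMP page -->
<link rel="canonical" href="https://www.example.com/article/">
<meta name="viewport" content="width=device-width,minimum-scale=1,initial-scale=1">
```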
Original version on the Semrush website