Common SEO Mistakes and How to Avoid Them


Mistakes happen to everyone, and SEO teams are no exception. When you are dealing with massive websites, development changes, CMS issues, scripts, images, and plug-ins/add-ons while trying to run continual SEO and content strategies, occasional errors are inevitable. The good news is that mistakes help everyone learn, and the SEO industry is no different.

So how do we find SEO mistakes?

Technical SEO audits are often used to find errors that negatively impact rankings, conversions, and goals. In last week’s SEMrushchat, we discussed how often to perform audits, common SEO mistakes and how to avoid them, which technical errors to fix first, duplicate content issues, and best practices for title tags and meta descriptions.

We were very lucky to be joined by two brilliant experts in the SEO industry, Joe Hall and Joe Williams. Hall is the founder of and SEO consultant at Hall Analysis, with over 10 years of experience helping firms boost their online presence. Williams is the founder of TribeSEO and is on a mission to make SEO easier for small businesses.


How Often Should You Perform Technical SEO Audits?

The first question we tackled with our experts was how often they carried out technical audits of clients’ sites, and why. Both Hall and Williams agreed on yearly audits at a minimum.

Joe Hall 🦡

Many of my regular clients get audits done at least once a year. Some every 6 months or more, especially if they are making a lot of site changes. It’s also obviously a good idea to get an audit before or after new site/design launches. Preferably before.

 

Williams succinctly described his approach to technical audits using a car analogy:

Joe Williams

Just like I get my car serviced by a mechanic yearly, I carry out a full technical SEO audit once per year. For urgent issues, I rely on notifications from tools like @SEMrush, Google Search Console and @googleanalytics.

 

Others explained why they perform more frequent technical audits and shared other issues we should all keep in mind.

Bill Slawski ⚓

I would recommend running a technical audit of your site quarterly, like a person gets a checkup from their doctor. Keeping an eye on analytics to watch for warning signs is a good idea, but checking under the hood can help you anticipate problems.

 

JP Sherman

I like to do regular, monthly topical audits.
– January: site speed
– Feb: backlink analysis
– Mar: internal linking
– Apri…
this is, however, generally only useful for large, enterprise sites – but can be modified for smaller sites.

 

Simon Cox

Regularly, depending on the site size and how much it gets updated. E.g., for an ecommerce site with many new and changed products, I would recommend weekly. For other, less busy sites, perhaps quarterly or even yearly.

 

Marianne Sweeny

We do “site assessments,” a light audit, at my day job. These are high level and take a couple of hours. I would run one of these monthly to compare over time. If there are no anomalies, a full-on, under-the-hood, nooks-and-crannies audit should be done yearly.

 

Kat Hammoud

With larger/enterprise clients, I conduct technical audits quarterly to ensure nothing has gone awry. With smaller clients, yearly or every 6 months should suffice while monitoring the site analytics and indexation performance.

 

Everyone’s situation, time allowance, and budget are different; it is far easier for a company with a full in-house SEO team to run monthly technical audits than it is for a mom-and-pop business.

If you are struggling with time and resources, you could always utilize the SEMrush site audit tool and stay informed on a continual basis about issues that arise. Then, every six months to a year, pay an SEO expert to audit everything to ensure your site is up to speed. With all the Google updates and changes, having an expert review your site is essential.

Common SEO Mistakes and How to Avoid Them

Question two focused on the most serious technical difficulties our experts had experienced with client websites, their impact on performance, and how they can be avoided.

For our SEO community, noindexing issues cropped up as a common problem, as did sites being completely blocked by robots.txt.

Bernie Fussenegger #Digital360Chat

I would say the biggest issue I have had is with noindexing pages that were live for testing and soft launch, and forgetting to remove the tag when approved.
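
For reference, the tag in question is a single robots meta element in the page’s <head>, and a pre-launch crawl or a quick view-source check will catch a leftover one:

<meta name="robots" content="noindex, nofollow" />

Removing the line is usually the whole fix, since pages are indexable by default.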

 

Morgan Hennessey

The number of times a new client has come to us with a site that is entirely blocked by robots.txt 🙊 Nowhere to go but up!!!
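
The robots.txt version of the same mistake is just as easy to miss: two lines left over from staging will block every crawler from the entire site.

User-agent: *
Disallow: /

If you find this on a live site, deleting the Disallow: / rule (an empty Disallow allows everything) opens the doors again.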

 

Joe Williams

New web designs are almost never done on time, so they get rushed out of the door. Search engines get blocked, redirects forgotten, site speed plummets… rankings drop! Prevention is the best fix before, during and after a roll-out.

 

Sam Ruchlewicz

I think the most memorable include: one noindexing all of the pillar content, another insisting on browser-side rendering for Angular JS, and a third actually indexing their dev site, which then ranked above their actual site for branded terms…

 

JP Sherman

Back in agency life, I had a client with his main site no-indexed (no big deal) but his 500 duplicate sites each with 10k pages weren’t.
Did I mention that each page had a 12MB picture of his face?

 

Adam Reaney 🚴🏼‍♂️🏏💻

Noindexing the home page.
No rankings, no traffic, bupkis.
Removed the noindex, fetched the main site hierarchy in GSC and voilà, became an SEO magician in a matter of days.

 

Some other SEO mistake stories were shared in the chat as well, and there are lessons to learn from each of them:

David Gossage

A disgruntled former employee wasn’t removed from LastPass quickly enough and put noindex tags on all our sites. Fun times.

 

Joe Hall 🦡

A large site had 20 million URLs from faceted navigation indexed. Normally not a huge issue, but this site had an odd IA that forced me to develop conditional logic to de-index the URLs. Then, to top it all off, the site was in Portuguese & I only speak English.

 

Hamlet

We had a case where there was a session ID in URLs that only appeared when JavaScript was enabled. When Google started crawling JavaScript content, this was eating all of the crawl budget. We found the issue in the server logs and filtered the URLs via robots.txt.
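
A fix along the lines Hamlet describes might look like the robots.txt rule below; sessionid is a hypothetical parameter name standing in for whatever the server logs actually reveal (Google supports * wildcards in robots.txt):

# Block any URL carrying the (hypothetical) session parameter
User-agent: *
Disallow: /*?*sessionid=

This tells crawlers to skip any URL whose query string contains the offending parameter, freeing up crawl budget for real pages.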

 

Bill Slawski ⚓

A site had infinite scroll set up in a way that resulted in at least 12 spider traps on their site. The site had a very healthy amount of backlinks, but resolving those loops did seem to result in a boost in rankings.

 

Which Technical SEO Errors Should You Fix First?

When technical audits are done, there are often a lot of errors to fix, but how do you prioritize which technical errors to fix first?

Joe Hall 🦡

High, Medium, and Low, based on their impact on rankings and traffic, and their risk level. If I notice a violation of Google’s guidelines, that’s a high-priority item.

 

Joe Williams

Indexation issues get addressed first, especially for important pages not showing in Google (Google’s URL Inspection Tool is excellent for diagnosing the why). Then, I focus on site-wide technical issues that also affect top-performing pages.

 

Joe Hall’s idea of ranking issues from high to medium to low risk is a great place to start. Joe Williams’ mention of indexation would clearly be in the “high” risk category; the same goes for anything that violates Google’s guidelines or impacts rankings. Anything in the high-risk category should be addressed as fast as possible.

Other critical areas our participants mentioned were any errors that impact users and conversions. They also explained that they prioritize problems in terms of impact and time taken to fix an issue:

Marianne Sweeny

Start with the ones that impact macro-conversions, those that generate revenue: traffic, flow through site, content engagement, local visibility, correct language version…whatever makes your client money.

 

Ben Austin

Consider how quickly the issue in question can be fixed, then prioritise by user experience, conversion rate & overall SEO impact!

 

FSE Digital

We prioritise our technical fixes by:
✅Which 1s are most likely to affect traffic & conversions
✅Which issues affect the most important pages
✅What fixes will have the biggest impact vs time spent
✅Time needed – if it will take 6 weeks to fix, start now!

 

Bill Slawski ⚓

Prioritizing recommendations to implement is based on:
1. Most Impact
2. Ease of Implementation
3. Time to Implement

 

How to Fix Duplicate Content 

Duplicate content is a major SEO mistake that comes up again and again; in a study that SEMrush conducted, approximately half of the websites analyzed experienced the problem. With this in mind, we asked our SEO experts if it was an issue they had faced and what they did to fix it.

The answer from both was a very clear yes. 

Joe Hall 🦡

Yes, this is an issue that I mention on almost every audit I do. Many times the problems stem from sites that use the same marketing copy on every landing page. The only way to fix it is to push them to create unique, relevant content for each page.

 

Joe Williams

Yes, duplicate content is commonplace, especially for ecommerce sites which “share” product descriptions from manufacturers. For these types of pages, more value is needed by creating unique content, encouraging reviews, and, where relevant, adding videos.

 

Here are some duplicate content insights and solutions provided by our community:

Kat Hammoud

This is a HUGE issue for some sites, especially ones with siloed teams: the same content gets copied and pasted across pages in the same experience due to a lack of content and of awareness that this is bad practice. We crawl and identify duplicate content and decide where it should live.

 

Bernie Fussenegger #Digital360Chat

The quarterly audit would be something that helps you understand your content: what it is and where it lives. When faced with duplicate content, look at its purpose and the type of traffic it gets, and develop a plan for removing/redirecting.

 

Youness Bermime

The worst kind of duplication is when the client has an SSL certificate without redirection or a preferred site version. Hello, site multiplied by 4. Thankfully, a redirect is usually enough.
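
The “multiplied by 4” refers to the http/https and www/non-www combinations all resolving as separate sites. A minimal sketch of the fix, assuming an Apache server and a non-www preference (example.com is a placeholder domain), is a 301 rule in .htaccess:

# Send http:// and www. variants to the canonical https:// host (placeholder domain)
RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://example.com%{REQUEST_URI} [L,R=301]

Nginx and most CMSs offer equivalent settings; the goal is that only one version of every URL answers with a 200.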

 

Amar⚡

Most of the time, developers create staging servers that result in duplicate content issues. To overcome this, add a robots noindex, nofollow meta tag & rules in the robots.txt file for duplicate content pages & staging servers. Also, defining canonical URLs solves this.
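
The canonical element Amar mentions is a single line in the <head> of each duplicate page, pointing search engines at the version you want indexed; the URL below is a placeholder:

<link rel="canonical" href="https://example.com/preferred-page/" />

Bear in mind that canonicals are treated as a strong hint rather than a directive, which is why Amar pairs them with noindex tags and robots.txt rules for staging servers.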

 

What about a duplicate content penalty? 

Bill Slawski ⚓

Duplicate content isn’t something that can usually cause a penalty, but it can lead to a loss of opportunities – make every page work for you by making it unique and responsive to queries that your audience wants answers to/experiences a need to know.

 

Marianne Sweeny

Duplicate content is not a penalty. If you do not take advantage of declaring 1) which pages you want the search engine to focus on and 2) which version is the best to visit, then you waste the indexing resources assigned to your site by the search engine.

 

Best Practices for Title Tags and Meta Descriptions

Our last question looked at the best practices for title tags and meta descriptions. In the same SEMrush study, the results showed that a whopping 7 out of 10 sites had issues with missing meta descriptions.

So, we decided to ask our experts if all their site pages had unique title tags and meta descriptions, and what advice they had for generating them. 

Joe Hall said, “It is SEO best practice to have unique titles & meta tags. Unique meta tags are probably a little less important, but I do think they help. Title tags are so important that you are going to want to do them by hand…but if you have a specific meta tag strategy, you can automate your meta descriptions by leveraging your CMS. For example, many WordPress devs do this:”

<meta name="description" content="<?php echo esc_attr( wp_strip_all_tags( get_the_excerpt(), true ) ); ?>" />

Joe Hall 🦡

Adding that code snippet to your theme will pipe in the first 55 words of a post. This is an easy way to automate your meta description tags.

 

Many felt that writing unique titles and meta descriptions for every page can be unnecessary and inefficient, especially for large websites, but you can still focus on the pages that are important for reaching your goals:

Michael Ramsey

It’s not technically a best-practice requirement to write meta descriptions for every page. For very large sites this can be a huge waste of time.
Prioritize writing meta descriptions for the most important pages based on the time you have.

 

Joe Williams

No, not every page on my website has a unique title and meta description tag, BUT every page I want to rank, does 😎

 

Here were some additional insights and advice given by our participants on optimizing title tags and meta descriptions:

Bill Slawski ⚓

Google will select text from your page that may include query terms your page was found in a search for. If you anticipate those searches, you can write meta descriptions that engage & lead to more traffic. Why rank highly if no one selects your page in SERPs?

 

Hamlet

If you need quality meta descriptions at scale, consider using AI/NLP to create abstractive summaries from the content. I wrote a tutorial that should be easy to follow here https://t.co/EoCeg6qINX and @cyberandy wrote another here https://t.co/wgqbOzIj9Z 🤓

 

Danny Conlon

Google often ignores meta descriptions but it’s a best practice and can help with CTR. I make sure they include main keywords (relevancy indicator for both users and Google) and that it’s a good summary of what a user will find on the page.
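
As a quick illustration of that advice, here is what a description written to that spec might look like; the page topic and wording are invented for this example:

<meta name="description" content="Learn the most common technical SEO mistakes, from stray noindex tags to blocked robots.txt files, and how to catch them in your next audit." />

It leads with the topic’s main keywords, honestly summarizes the page, and stays under roughly 155 characters so it is less likely to be truncated in the SERP.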

 

Simon Cox

Only add the meta description if you want to suggest what is shown in the SERP – it’s a great marketing piece (so do it!). Google will pick out a suitable description based on the user query – it might not be your meta either!

 

We would like to take the opportunity to say a big thank you to our SEO experts and our helpful community for participating in last week’s #SEMrushchat and providing their insight on SEO mistakes. 

Do you have some handy tips you would like to share on how to avoid SEO errors? We would love to hear them. Let us know in the comment section below. We hope you will join us tomorrow for SEMrushchat!




