Every website owner who is seriously invested in their online presence should be aware of the ranking factors Google uses. Google has recently updated its algorithm in ways that have caused major shifts in rankings, and many sites are still adjusting their strategies. As recently as June 25th-26th, Google has continued to make major changes.
Back when the internet was first coming into wide use, you could surf for hours, clicking links and hoping to land on a website whose pages answered your questions. Google started as a search engine with the goal of solving that problem by looking at links to group like content together.
The idea is to treat each link as a "vote" for your site's trustworthiness. As an illustration, think about the people you would call for business advice. Now imagine each person as a webpage and their knowledge as a link. That link to wisdom is what we look for online. If a person isn't trustworthy, you'll likely never ask their opinion again. In the same way, trustworthy links on your site help you retain users.
Now, here are a few things to be aware of for your website's SEO:
Domain Authority (DA)
This is how Google rates your site's (www.mysite.com) trustworthiness across all the pages and articles you post. It's important not to post about everything; instead, post what you know and what people are looking for. Attempts to manipulate your audience just to increase your DA can backfire. Don't try to grab authority by stuffing in trending terms just to capture traffic; it will hurt you in the long run. Remember, Google's motto is "Don't be evil," meaning users come first, not site owners.
Page Authority (PA)
This is how trustworthy an individual page on your site is. You can pass PA along to another page via a redirect. Still, be careful with this approach: chaining a page through too many redirects looks suspicious to the search engine.
Sitemaps
Sitemaps tell Google which pages you want it to crawl. There's no need to submit a sitemap multiple times, and for some regularly crawled sites no submission is needed at all. For the majority of websites, though, a properly structured sitemap is necessary. If you are using a CMS, use a sitemap plugin that keeps it up to date, so that when the Google bot makes its next rounds on your site, it picks up the new pages.
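As a rough sketch, a properly structured sitemap follows the sitemaps.org XML format; the URLs and dates below are placeholders for illustration, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.mysite.com/</loc>
    <lastmod>2017-06-01</lastmod>
  </url>
  <url>
    <loc>https://www.mysite.com/blog/example-post/</loc>
    <lastmod>2017-06-25</lastmod>
  </url>
</urlset>
```

A CMS sitemap plugin generates and refreshes a file like this automatically, which is why it's usually preferable to maintaining one by hand.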
Page Speed
No one likes a slowpoke, and neither does Google. Websites are growing in size at rates that make some wonder if the hardware can keep up: pages used to be a few kilobytes (3-5 KB); now they are pushing 4 to 6 megabytes (4-6 MB). Larger pages take more resources to serve quickly. And since the internet gives us a worldwide audience, remember that a server in Ohio can't serve a page as fast to users in California, or overseas. You may have to increase your hosting budget and revisit your website's development to squeeze out as much speed as you can. Ultimately it comes down to time and, unfortunately, money.
Internal Links
Internal links are simply links on your webpage that guide users to other parts of your site, such as related pages or posts. Links can also take users off your site to the sources you referenced. That isn't bad in moderation, but if you link to places that are not relevant to the page, the post, or even your website's topic, it can create a penalty. Keeping this in check will spare your users frustration and make Google happy to give your site a more prominent position.
Backlinks
This is one you have heard of if you have been involved in your website's SEO efforts. It's become a buzzword, quoted often to "sell" services. In truth, it's a major ranking factor, because each link from a reputable site is a "yes" vote in favor of you being a trustworthy online source. Talk to our marketing team about SEO and see if you can develop a strategy for this. It takes time, but it's worth it.
TF-IDF
TF-IDF (Term Frequency-Inverse Document Frequency) is a text-mining technique used to categorize documents. You've heard "keyword density" used a lot over the past couple of years in SEO services; TF-IDF takes that concept to a new level because it is a two-part measurement. While density is a valid metric (around 5-7% as of this article), how frequently a term is used across all documents also matters.
- TF(t) = (number of times term t appears in a document) / (total number of terms in the document)
- IDF(t) = log_e(total number of documents / number of documents containing term t)
- TF-IDF(t) = TF(t) * IDF(t)
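To make the formulas above concrete, here is a minimal sketch of the calculation in Python. The tiny three-document corpus is invented purely for illustration:

```python
import math

def tf(term, document):
    # Term frequency: occurrences of the term / total terms in the document
    words = document.lower().split()
    return words.count(term) / len(words)

def idf(term, corpus):
    # Inverse document frequency: log_e(total docs / docs containing the term)
    matches = sum(1 for doc in corpus if term in doc.lower().split())
    return math.log(len(corpus) / matches)

def tf_idf(term, document, corpus):
    return tf(term, document) * idf(term, corpus)

# Hypothetical mini-corpus for demonstration
corpus = [
    "seo tips for your website",
    "website speed matters for seo",
    "how to bake bread at home",
]

score = tf_idf("seo", corpus[0], corpus)
print(round(score, 4))  # "seo" is 1 of 5 words, and appears in 2 of 3 docs
```

A term that appears on every page of a site scores a low IDF, so a high density alone doesn't make a term distinctive; that's the intuition behind the two-part measurement.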
Yeah, it's very technical and I'm probably not doing it justice, but there's plenty more to read if you want to dig in.
Content Quality
Content quality is something we've blogged about a lot over the last few years. Google has made amazing strides since the days of simply categorizing pages by their links. Webpages are no longer just statically scanned and served up on the SERP (Search Engine Results Page); the entire site is scanned, from code to character.
If you are writing content, that is awesome! But if you are not really trying, stop and think about who your audience is. What do they want to read? Are you talking over them or to them? Quality also brings relevancy: the content must stay on point and be what the user clicked on from the SERP. Don't make your users angry. You won't like it when they get angry.
Page Titles (SERP) & Descriptions
Page titles are pretty self-explanatory, but many site owners miss the importance and care they require. In the SERP, the title is the blue text at the top of each result. Put the primary keyword first, followed by a pipe "|" to signal that what follows is the secondary keyword. Note that if these words don't appear on the page, Google may rewrite the title.
Descriptions appear below the URL of the page. There's constant debate about their ideal length, but in the mobile search age it depends on the viewport (screen size). If you are interested in diving deeper into what makes a good snippet, and how Google has recently updated them, check out the article here.
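Putting titles and descriptions together, a page's head might look like the sketch below; the keywords and copy are invented for illustration:

```html
<head>
  <!-- Primary keyword first, then a pipe, then the secondary keyword -->
  <title>SEO Ranking Factors | Google Algorithm Updates</title>
  <!-- Shown under the URL in the SERP; how much renders depends on the viewport -->
  <meta name="description"
        content="A rundown of the Google ranking factors every site owner should know, from domain authority to page speed.">
</head>
```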
Robots Text File
In your server's root directory is a robots.txt file. A basic file looks like:
User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
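For example, a filled-in version might look like this; the paths and sitemap URL are hypothetical, so adjust them to your own site:

```
User-agent: *
Disallow: /wp-admin/
Disallow: /tmp/

Sitemap: https://www.mysite.com/sitemap.xml
```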
This tells crawlers like Googlebot not to crawl those areas of your site (note that disallowing a URL stops crawling, but doesn't guarantee the page stays out of the index). You can do more advanced things with this file, and you can test whether Google can read it in your Search Console (Webmaster Tools) account.
404s, 301s and 403s, oh my
I'm sure anyone watching their search visibility has heard of a 404 error. A 404 means a resource was not found; the search engine will keep trying to fetch it until it eventually stops and drops the URL. That's not a good outcome, so 404s should either be redirected with a 301 (moved permanently) or addressed with a fresh sitemap submission to see if that resolves the errors.
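On an Apache server, a 301 redirect can be set up with a one-line rule in your .htaccess file (assuming mod_alias is enabled); the paths here are hypothetical:

```
# .htaccess: permanently redirect an old, 404ing URL to its replacement
Redirect 301 /old-page/ https://www.mysite.com/new-page/
```

A 301 tells the search engine the move is permanent, so the old URL's page authority passes along to the new one.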
Anything else should be addressed as soon as possible by your webmaster, as these errors affect your website's ranking and visibility to users.
We hope this article was helpful if you've been hit by the latest Google update as of June 26th, 2017. Even though Google's John Mueller says there's been nothing, I don't think other SEO experts are buying it.
There's nothing confirmed from our side, this is all just random chatter. ¯\_(ツ)_/¯
— John ☆.o(≧▽≦)o.☆ (@JohnMu) June 28, 2017