How to improve a website’s Google ranking


Search Engine Optimization (SEO) is becoming increasingly important to every site. Historically, some of the techniques used have given SEO a bad name, but there are several simple ways to ensure your site benefits from natural search results.

Search can be classified as either organic (natural) or paid. Organic results are those that appear naturally in search engine results pages (SERPs), and ranking well in them depends on both the technical construction of your site and the content within it.

Paid results, often referred to as Pay Per Click or PPC, are the results site owners pay for and which usually surround the organic listings.

Research shows web users prefer organic listings to paid listings, considering them more relevant and trustworthy.

The goal of SEO, then, is to improve your organic listings performance, which in turn should boost traffic to your site.

Organic results

Search engines index the web using automated programs known as bots or spiders, running on large clusters of computers, which crawl the web by following the links found on web pages. The URLs they discover are added to the search engine's index, and it's this index that's queried every time a user performs a search.

Search engines employ complex mathematical formulas, known as ranking algorithms, to order search results. Google's algorithm alone relies on more than 200 individual factors to decide which results to return to its web searchers, and in what order.

Organic SEO can be further split into two categories:

On-page: the code and content you use to manage and deliver your web pages.

Off-page: external factors affecting SEO, primarily link building – getting other websites to link to your content.

Here we'll focus on on-page optimization methods, which are all under your control.

The most important thing is to maximize accessibility to ensure search engines can find all your content. There are two ways to get discovered by search engines.

One is to submit your site directly to their index (Google, Yahoo, Bing). The other is to wait for them to find it through links to you from other sites during their normal crawling process.

Google URL submission

To make sure your website is accessible to search engine spiders, follow these simple steps:

1. Ensure you're not preventing search engines from indexing your site via the Robots Exclusion Protocol. This is controlled with a robots.txt file placed at the root of your site, which gives instructions to search engine bots; a minimal example appears after this list.

2. Ensure your content is machine-readable. Avoid using Flash, video or imagery to exclusively house your content. Remember, search spiders cannot see images or video: they can only read the written text on a web page, so provide text alternatives (see the HTML sketch after this list).

3. Ensure you have a clear internal linking architecture. Promote important content to the homepage and link to key site sections via dedicated navigation. Group content into clear site sections reflected in your site navigation to aid both users and search engines. For example:

www.mysite.com/
  /news
  /products
    /category-1
    /category-2
  /blog
  /about
  /contact

4. Eliminate duplicate content. Duplication can be caused by the way your server is set up or by how your CMS serves up content. Either way, it needs to be addressed; we'll cover the most common fix a little later on.

5. Ensure you’re targeting the appropriate keywords for your business objectives. Just as successful advertising campaigns contain content that appeals to a target demographic, successful websites need to focus on keywords that have the highest relevance to their target audience.
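
To follow up on step 1, here's a minimal robots.txt sketch using standard Robots Exclusion Protocol directives. The /private-admin/ path is purely hypothetical; the point to check is that nothing you want indexed is disallowed, and that you don't have a blanket "Disallow: /", which would keep your whole site out of the index.

```
# robots.txt – placed at the root of the site, e.g. www.mysite.com/robots.txt
# Hypothetical example: let all well-behaved bots crawl everything
# except one private area, and point them at the sitemap.
User-agent: *
Disallow: /private-admin/

Sitemap: http://www.mysite.com/sitemap.xml
```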
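
And for step 2, a small HTML sketch (the file names are hypothetical) showing how to give spiders readable text for content that would otherwise be locked inside an image or a video:

```
<!-- Descriptive alt text gives spiders something to read in place of the image -->
<img src="/images/dynatech-coolfreeze-pro.jpg"
     alt="DynaTech CoolFreeze Pro fridge freezer, brushed steel finish">

<!-- Pair a video with a text alternative rather than relying on the video alone -->
<video src="/media/product-demo.mp4" controls></video>
<p><a href="/media/product-demo-transcript.html">Read the transcript of the product demo</a></p>
```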

Google Sitemaps

You can help Google out by providing a sitemap file (usually called sitemap.xml) that lists all of your content and how often it's updated. Visit Google's Webmasters page (now Google Search Console) to provide Google with this information.
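
Here's a minimal sitemap.xml sketch using the standard sitemaps.org protocol; the URLs, dates and change frequencies are placeholders for your own content:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mysite.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>http://www.mysite.com/products/</loc>
    <lastmod>2012-01-10</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```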

Duplicate pages are a bad thing, and making pages that specifically detect the Googlebot (Google's web-crawling tool) and serve it something different from what human visitors see – a practice known as cloaking – is an absolute no-no.
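
If the same content is reachable at more than one address – the common CMS and server-setup side effect mentioned in step 4 – one widely supported fix is to declare a single preferred URL with a canonical link element. A sketch, using a hypothetical product URL:

```
<!-- In the <head> of every duplicate or variant page, point at the one preferred URL -->
<link rel="canonical" href="http://www.mysite.com/products/fridges/dynatech-coolfreezepro/">
```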

If you’re using a CMS such as WordPress, it’s worth having it create static versions of pages where possible. It’s not crucial – Google can handle dynamic pages these days – but it doesn’t hurt.

Such a static, plain-text version makes it clear exactly what's being used to work out your site's content.

Where possible, it’s also preferable to have permalinks like ‘/products/fridges/dynatech-coolfreezepro/’ for pages, rather than addresses that end with things like ‘?page=42132’. Every scrap of data you give the crawler will boost your chances in search results.
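
How you get clean permalinks depends on your setup: most CMSs (WordPress included) have a permalink setting that does this for you, and on a plain Apache server mod_rewrite can map a friendly URL onto the underlying dynamic one. A sketch, assuming Apache with mod_rewrite and a hypothetical index.php handler and parameter names:

```
# .htaccess – serve /products/fridges/dynatech-coolfreezepro/ from a dynamic script
RewriteEngine On
RewriteRule ^products/([a-z0-9-]+)/([a-z0-9-]+)/?$ index.php?category=$1&product=$2 [L,QSA]
```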

As far as raw content goes, the most important thing is that your site uses the keywords that people search for.

Visit the Webmasters page again and request a copy of your pages as Google's crawler sees them, then examine it. If your site doesn't use the phrase 'humane spider catcher', Google won't direct people looking for those words to your site.

That sounds obvious, but content held in frames, videos or pictures, pulled in from Twitter, or generated on the fly can be left out of the index.

After that, the challenge is to get good links to boost your authority.

SEO explained

These are straightforward tips, but then SEO really is pretty simple – the people who claim otherwise have something to sell you. Never be tempted to pay for dodgy tricks.

In most cases they don’t work, will only send worthless traffic rather than actual readers, and could well come back to bite you later on.

Much as exercise and a good diet are the only way to lose weight, good content is the only true way to achieve a good Google rank.


