Does it feel like you’ve done everything right, yet your website is still nowhere to be seen in Google’s search results?
The bad news: Several things could be preventing you from showing up in Google.
The good news: Many of them are easy to fix.
Below, we explore nine possible reasons why you’re not showing up in Google and how to fix each issue.
Before we start…
It’s important to note that when you type something into Google hoping to see your website in the search results, you’re not actually looking for your website.
You’re looking for a page on your website.
That’s an important distinction.
If Google doesn’t know about the existence of the page you’re trying to rank or thinks it doesn’t deserve to rank, then it won’t show up anywhere that matters in the search results.
Your homepage may be the page you’re trying to rank.
For that reason, to show up in Google, three things need to be true:
- Google knows that your website exists and can find and access all your important pages.
- You have a page that’s a relevant result for the keyword you want to show up for.
- You’ve demonstrated to Google that your page is worthy of ranking for your target search query—more so than any other page from another website.
Most of the issues we tackle below relate to one of these three things.
Let’s start with the simple stuff…
1. Your website is too new
It takes time for Google to discover new websites and web pages. If you only launched your site this morning, then the most straightforward explanation is that Google just hasn’t found it yet.
To check whether Google knows your website exists, run a search for site:yourwebsite.com (replacing yourwebsite.com with your domain).
If there is at least one result, then Google knows about your website.
If there are no results, then it doesn't.
But even if Google knows about your website, it might not know about the page you're trying to rank. Check by searching for site:yourwebsite.com/the-page-you-want-to-rank.
There should be one result.
If you see no results for either of these searches, create a sitemap, and submit it via Google Search Console. (It’s good practice to do this regardless.)
Search Console > Sitemaps > Enter sitemap URL > Submit
You’ll need to create a free Search Console account and add your website before doing this. Read this guide for instructions.
A sitemap tells Google which pages are important on your site and where to find them.
It can also speed up the discovery process.
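If you've never seen one, a sitemap is just an XML file listing your URLs. A minimal sketch (the URLs below are placeholders, not your actual pages) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want Google to find -->
  <url>
    <loc>https://yourwebsite.com/</loc>
  </url>
  <url>
    <loc>https://yourwebsite.com/blog/example-post</loc>
  </url>
</urlset>
```

Most CMSs and SEO plugins generate this file for you automatically, so you rarely need to write it by hand.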
Can’t find your sitemap?
Go to yourwebsite.com/sitemap.xml. If there’s nothing there, go to yourwebsite.com/robots.txt as this often lists the sitemap URL.
Still nothing? You might not have one. Read this.
2. You’re blocking search engines from indexing your pages
If you tell Google not to show certain pages in the search results, then it won’t.
You do that with a “noindex” meta tag, which is a piece of HTML code that looks like this:
<meta name="robots" content="noindex"/>
Pages with that code won’t be indexed, even if you created a sitemap and submitted it in Google Search Console.
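If you want to spot-check a page's HTML yourself, here's a minimal sketch using only Python's standard library. Note that it only detects the meta tag; a page can also be noindexed via the X-Robots-Tag HTTP header, which this doesn't cover:

```python
from html.parser import HTMLParser

class NoindexChecker(HTMLParser):
    """Flags a page as noindexed if a robots meta tag contains 'noindex'."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if (attrs.get("name", "").lower() == "robots"
                    and "noindex" in (attrs.get("content") or "").lower()):
                self.noindex = True

def is_noindexed(html):
    """Return True if the HTML contains a robots 'noindex' meta tag."""
    checker = NoindexChecker()
    checker.feed(html)
    return checker.noindex
```

For example, `is_noindexed('<meta name="robots" content="noindex"/>')` returns True, while a page with `content="index, follow"` returns False.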
You probably don’t recall ever adding that code to any of your pages, but that doesn’t mean it isn’t there.
For example, WordPress adds it to every page if you check the wrong box when setting up your site.
It’s also something that a lot of web developers use to prevent Google from indexing a site during the development process and forget to remove it before publishing.
If Google has already crawled the pages in your sitemap, it’ll tell you about any “noindexed” ones in the “Coverage” report in Google Search Console.
Just look for the "Submitted URL marked 'noindex'" error.
If you recently submitted your sitemap to Google and it hasn't crawled the pages yet, run a crawl in Ahrefs Site Audit. This checks every page on your site for 100+ potential SEO issues, including the presence of "noindex" tags.
Remove “noindex” tags from any pages that shouldn’t have them.
3. You’re blocking search engines from crawling your pages
Most websites have something called a robots.txt file. This instructs search engines where they can and can’t go on your website.
Google can’t crawl URLs blocked in your robots.txt file, which usually results in them not showing up in search results.
If you’ve submitted your sitemap via Google Search Console, it should alert you about issues related to this. Go to the “Coverage” report and look for “Submitted URL blocked by robots.txt” errors.
Once again, that only works if Google has already attempted to crawl the URLs in your sitemap. If you only recently submitted this, then that may not yet be the case.
If you prefer not to wait, you can check manually. Just head to yourdomain.com/robots.txt.
You should see a file like this:
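As an illustration (your actual file will differ), a simple robots.txt might look something like this:

```text
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://yourwebsite.com/sitemap.xml
```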
If you get a 404 error, then it means you don’t have a robots.txt file. Learn how to create one here.
What you don't want to see here is a blanket "Disallow: /" directive under the wildcard user-agent ("User-agent: *") or under any Google user-agent such as Googlebot.
Why? Because it blocks Google from crawling every page on your site.
You also don’t want to see a “Disallow” directive for any important content.
For example, a "Disallow: /blog/" rule would prevent Google from crawling all the posts on our blog.
Remove any directives blocking content that you want to show up on Google.
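If you'd like to test a rule before editing anything, Python's standard urllib.robotparser module can tell you whether a given URL is blocked. A quick sketch (the rule and URLs are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks the /blog/ section for all crawlers
robots_txt = """\
User-agent: *
Disallow: /blog/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Blocked: falls under the Disallow rule
print(parser.can_fetch("Googlebot", "https://example.com/blog/my-post"))  # False
# Allowed: no rule matches
print(parser.can_fetch("Googlebot", "https://example.com/about"))  # True
```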
Robots.txt files can be complicated, and they’re easy to mess up. If you feel that yours may be preventing pages from showing up on Google, and you don’t know much about this file, hire an expert to fix it.
4. You don’t have enough high-quality backlinks
Even if nothing is stopping Google from finding your page, you still need to "prove" to Google that it deserves to rank.
While there are hundreds of factors at play in Google’s algorithm, the number of backlinks from unique websites to a page seems to be a strong one. We’ve found this time and time again in correlation studies.
If the web pages ranking above you have way more backlinks, then this could be part of the reason you’re not showing up in Google.
To see the number of unique websites (referring domains) linking to your page, paste your URL into Site Explorer or our free backlink checker.
A quick reminder…
Google ranks web pages, not websites. While you may want to rank your homepage for a specific keyword, it's important to look at the number of referring domains to that page, not to your site as a whole.
Next, go to Keywords Explorer, search for your target keyword, then scroll down to the SERP overview. Here, you’ll see the current top-ranking pages and SEO metrics for each of them.
Skim the “Domains” column to see how many unique websites link to each page.
Consider building more backlinks if your page falls short.
5. Your page is lacking “authority”
Google’s ranking algorithm is based on something called PageRank, which essentially counts backlinks and internal links as votes.
Some SEOs see PageRank as old news, but Google confirmed that it was still a critical factor in their ranking algorithm in 2017:
DYK that after 18 years we’re still using PageRank (and 100s of other signals) in ranking?
Unfortunately, Google discontinued their public PageRank scores a few years ago. Now, it’s not possible to see how the PR of your page stacks up against top-ranking pages.
Luckily, we at Ahrefs have a metric based on similar principles called URL Rating (UR).
Like PageRank, this takes into account internal links and backlinks, and we’ve found that it correlates with search traffic.
UR runs on a scale from 0–100. High UR pages have more “authority” than low UR pages.
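To make the "votes" idea concrete, here's a toy PageRank implementation in Python. It's a simplified sketch for intuition only; Google's actual algorithm involves far more than this:

```python
def toy_pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to.
    Returns an approximate PageRank score per page (scores sum to 1)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page splits its score evenly among the pages it links to
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # A page with no outlinks spreads its score across all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Page "a" gets links from both "b" and "c", so it ends up with the top score
scores = toy_pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
```

The takeaway: a page's score rises when more pages (and higher-scoring pages) link to it, which is why both backlinks and internal links move the needle.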
To check the URL Rating of any page on your site, paste the URL into Site Explorer or our free backlink checker.
Compare that to the UR of the top-ranking pages for your target keyword using the “SERP overview” in Keywords Explorer.
If the top-ranking pages have a much higher UR score than yours, it might be a sign that your lack of “link authority” is holding you back.
There are two ways to boost the authority of a web page:
- Build more backlinks;
- Add more internal links.
Generally speaking, the former is harder than the latter—especially if you want to rank a sales page.
For that reason, adding some relevant internal links to your page is often the best starting point.
6. Your website is lacking “authority”
Google continues to give mixed signals about whether site authority is a ranking factor.
In this tweet, Google's Gary Illyes says there's no such thing:
we don’t really have “overall domain authority”. A text link with anchor text is better though— Gary “鯨理” Illyes (@methode) October 27, 2016
But in this interview, Google’s John Mueller says they have metrics that “map into similar things.”
We studied the relationship between rankings and Domain Rating (our website authority metric) and found a small positive correlation between the two:
The correlation here is much weaker than that of referring domains or page-level authority. Also, correlation ≠ causation.
That said, it seems that site authority plays a more significant role in the rankings for some keywords than others.
Take a look at the top-ranking pages for the keyword "designer dresses":
The average and median DR of the top 5 pages is 82, and the weakest site has a DR of 77.
Given that DR runs on a scale from 0–100, this is an extremely high average. A less authoritative site would probably struggle to rank for this keyword.
The fact that some queries have only high-authority pages ranking in the top 10 isn’t necessarily proof that “website authority” is a ranking factor. It might be that for certain queries, Google knows people want results from trustworthy, well-known brands.
It's more of a mixed bag for the keyword "best coffee machine."
There are pages from both high- and low-authority sites in the top 5.
To check the "authority" of your website, paste the domain into Ahrefs Site Explorer or our free website authority checker.
Compare this to the DR scores of top-ranking sites for your target keyword. That should give you a good idea as to whether the authority of your website may be preventing you from ranking for your desired keywords in Google.
Even if some pages above you are from high DR sites, you may still be able to outrank them. How? By building more backlinks and “link authority” at the page level. After all, Google ranks web pages, not websites.
You can see an example of this for the keyword “bitcoin mining calculator.”
The page ranking in position #4 is on a lower DR site than most that it outranks.
One of the reasons the page is seemingly able to outrank them is because it has a high number of backlinks from unique websites and more “authority.”
7. Your web page doesn’t align with “search intent”
Google aims to rank the most useful and relevant results for each query.
That’s why it’s essential to align your content with what searchers expect and want to see. This is known as search intent.
Let’s say that you’re American Express and you want this page to show up in Google for the term “best credit card”:
If we look at the top-ranking results for this term in Keywords Explorer, these are the metrics for the strongest page in the top five:
- Domain Rating: 86
- URL Rating: 49
- Referring domains: 466
Looking in Site Explorer, it’s clear that American Express’s page beats this page (and all other top-ranking results) on those fronts. But still, it doesn’t even rank in the top 100 results.
Why? Because the page doesn’t align with what searchers want to see.
If we look more closely at the top-ranking pages for “best credit card,” we see that they’re all lists of the best cards from different banks and providers. Like this one:
The page from American Express is more of a sales page.
Google knows people don't want a sales page when searching for this query, so it doesn't rank one.
8. You have duplicate content issues
Duplicate content is when the same or similar web page is accessible at different URLs.
Google tends not to index duplicate content because it takes up unnecessary space in its index, a bit like having two copies of the same book on your bookshelf.
Instead, it usually only indexes the version that you set as the canonical.
If no canonical is set, Google attempts to identify the best version of the page to index itself.
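Setting a canonical is done with a link tag in the page's head. For example, if the same content were reachable at both /page and /page?ref=newsletter, both versions would point at the one you want indexed (the URL here is a placeholder):

```html
<!-- In the <head> of every duplicate variant -->
<link rel="canonical" href="https://yourwebsite.com/page" />
```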
Unfortunately, Google's ability to identify duplicate pages on its own, without canonical tags to guide it, isn't perfect.
Take a look at these two pages on Buffer.com:
Both are indexed, despite them being almost identical.
This causes issues because the "authority" of the page is split between two URLs. The first URL has backlinks from 115 referring domains (unique websites), and the second has backlinks from 140.
Given that one of these pages ranks in position #22 for its target keyword, it would probably rank higher if the "backlink authority" from both URLs were consolidated.
To find duplicate content issues on your website, run a crawl using Ahrefs Site Audit, then go to the “Content quality” report. Look for clusters of duplicate and near-duplicate pages without canonicals.
Fix these issues by redirecting or canonicalizing the duplicates.
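As an illustration, on an Apache server a 301 redirect for a duplicate URL could be added to the .htaccess file like this (the paths are hypothetical, and the exact setup depends on your server):

```text
# .htaccess - permanently redirect the duplicate to the main version
Redirect 301 /duplicate-page https://yourwebsite.com/main-page
```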
9. You have a Google penalty
Having a Google penalty is the least likely reason for not showing up on Google. But it is a possibility.
There are two types of Google penalties.
- Manual: This is when Google takes action to remove or demote your site in the search results. It happens when a Google employee manually reviews your website and finds that it doesn’t comply with their Webmaster Guidelines.
- Algorithmic: This is when Google’s algorithm suppresses your website or a web page in the search results due to quality issues. It’s more a case of computer says no than human says no.
To be accurate, algorithmic penalties aren't really penalties; they're filters.
Luckily, manual penalties are extremely rare. You're unlikely to get one unless you've done something drastically wrong. Google also usually alerts you about them via the "Manual actions" report in Search Console.
If there’s no warning in there, then you probably don’t have a manual penalty.
Unfortunately, Google doesn’t tell you if your site is being filtered algorithmically—and this can be quite challenging to identify.
If you suspect an algorithmic penalty due to a recent significant drop in organic traffic, your first course of action should be to check whether that drop coincided with a known or suspected Google algorithm update.
Panguin is a useful tool for this. It shows known algorithm change dates over your Google Analytics traffic to make it easy to spot issues.
If you still suspect your site has been filtered or penalized at this point, talk to an expert before taking any potentially catastrophic actions like disavowing links.
Final thoughts
Ranking in Google is like playing a video game.
If you have technical issues like a broken controller, then you’re never going to win no matter how hard you try. It’s the same with your website. You need to fix severe technical problems like rogue “noindex” meta tags and crawl blocks before playing the game.
From there, it’s critical to understand the level you’re playing at and the strength of your opponents. Some levels are easy because your opponents are weak. Others are difficult because they’re strong.
You may need to level up by building backlinks and "authority" before taking on stronger opponents.
If you’re struggling to beat your opponents because they’re too strong, play an easier level.
You can do this by targeting lower-competition keywords. Of course, one way to find these is by filtering for low-difficulty keywords in Keywords Explorer.
Just remember that it’s often worth making an effort to complete difficult levels because unlocking that achievement can be a game-changer for your business.
Got questions? Let me know in the comments or on Twitter.