Most bloggers don’t even notice when Google fails to index a published blog post. Others do notice, and wonder why Google isn’t adding their pages to the index.
Google Index vs. Google Ranking
The Google index is the list of all the pages that Google has crawled and stored in its cache. The snippets you see when you do a search come from this cached version of the page.
Google keeps its own copy, or mirror, of each website, regularly crawling through links to find new pages to add to the index. If your blog is not indexed by Google (or any other search engine, for that matter), it means the crawlers haven’t reached your latest posts yet.
And there could be other issues as well. Ranking on Google and having a blog post indexed on Google are not the same thing: your blog post might be in Google’s index but still not rank in the top 100 results.
So step one is to get the page indexed. Step two is to get it to rank for a particular query.
This article will help you find the reason why your page isn’t indexing on Google.
Google Hasn’t Crawled Your Website
Google crawls the web in order to index it; without crawling your website, Google can’t index it. So the first reason Google didn’t index your blog post may simply be that you haven’t told Google about it, and it doesn’t know to look for it.
You can run a quick check by typing “site:yourdomain.com” into the Google search box and seeing whether any pages from your site show up. If they do, your site is indexed.
But if not – read on to learn more about search engine indexing.
You Have Crawl Errors
One common reason a page doesn’t get indexed by Google is crawl errors. A crawl error means Google couldn’t find or access your website’s content when it tried to crawl your site. Sometimes this happens because of server issues.
You can use the crawl and indexing reports in Google Search Console (formerly known as Google Webmaster Tools) to see whether your website has crawl errors and which pages on your site are not accessible to Google.
You Are Blocking Google with robots.txt
If you have a robots.txt file, it may be blocking Google from crawling your site. Websites use robots.txt to tell search engine bots (like Googlebot), which crawl the web regularly, which pages should be crawled and indexed and which should not.
The robots.txt file sits in your site’s root directory and addresses each crawler by its user-agent (like Googlebot), spelling out where on that particular website it is and isn’t allowed to crawl.
For example, if you upload a new homepage with all your latest posts but don’t want it indexed yet, you can block the new page with robots.txt and leave the old one live, rather than deleting anything.
Moz created a comprehensive guide to the robots.txt file where you can learn more.
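If you’re comfortable with a little scripting, you can test a rule set yourself before it goes live. This sketch uses Python’s standard-library `urllib.robotparser`; the rules, paths, and domain below are made up for illustration, so substitute your own robots.txt content:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules -- replace with your site's actual file.
rules = """\
User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is blocked from /drafts/ but free to crawl everything else.
blocked = parser.can_fetch("Googlebot", "https://example.com/drafts/new-post/")
allowed = parser.can_fetch("Googlebot", "https://example.com/blog/new-post/")
print(blocked, allowed)  # False True
```

If a URL you expect to be indexed comes back `False`, your robots.txt is the likely culprit.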
Google Doesn’t Like Your Post Content
Even once Google has crawled your site, it may decide that some of its pages are spammy or duplicates of other content, and leave them out of the results when users search. If Google finds your post’s content irrelevant or low-value, that could be why your blog post isn’t indexed.
Here are a few reasons why Google may not like your content:
Duplicate content: While Google doesn’t penalize you for having duplicate content, it prefers to rank only one copy. If you published your post somewhere else first, Google may be ranking that page instead. Also make sure someone else hasn’t ripped off your content.
To check, copy one paragraph of text from your post, paste it into Copyscape, and see if it shows up on other websites. If it does, work through these tasks one by one until you find what’s causing the duplication, and remove it.
Thin pages: Some posts are thin pages with lots of ads or affiliate links. A page can also look thin if it has little text relative to the number of images, videos, and other non-text elements on it.
Your Website Is New
You may have a harder time getting your blog post indexed if you have a newer website. Google won’t crawl your website as often and may need to be notified manually.
When you publish a blog post, open Google Search Console and use the URL Inspection tool at the top to enter the URL (web address) of your new page.
Google will tell you whether the page is in its index. If it isn’t, you can request indexing manually.
Your Post Is Not Getting Enough Links
If people don’t link to your blog post, Google doesn’t know it exists. While pitching other websites to link to your new article is a chore, you can also link to new pages from other relevant pages on your website.
Content with no links pointing to it is considered unimportant, or “orphaned”. If this happens, Google will likely stop crawling the page.
I use a tool called Link Whisper to scan my website and suggest relevant pages that might be worth linking to the page I’m trying to get indexed.
Your Post Doesn’t Have a Readable Title Tag
The title tag is the most important HTML tag for SEO. Search engines pull information from it to understand what your web page is about.
If the page you are submitting doesn’t have a readable title tag, search engines will have a hard time figuring out what the page is about.
A title tag looks like this: <title>Page Title</title>. The words between the opening and closing tags are what you should change to match your post’s content.
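In context, the title tag lives inside the page’s <head>. The title text below is just an illustration; write one that describes your actual post:

```html
<head>
  <title>Why Google Isn’t Indexing Your Blog Post (And How to Fix It)</title>
</head>
```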
Your Post Has Duplicate Titles on Each Page
Using the same title for every post and changing only one word might seem like an easy way to let Google know these are different pages, but it isn’t. It makes it harder for the search engine to work out which posts are unique and which aren’t.
Search for your title on Google and see how many results come up. If your post doesn’t show in the first two pages of results, you’re doing something wrong:
Google sees your post as spam or duplicate content and won’t let you rank higher than the original page.
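As a quick sanity check, you can also scan your own pages for duplicate titles with a short script. This sketch uses Python’s standard-library HTML parser on made-up page sources; in practice you would feed it the real HTML of your posts:

```python
from collections import Counter
from html.parser import HTMLParser

class TitleExtractor(HTMLParser):
    """Collects the text inside a page's <title> tag."""
    def __init__(self):
        super().__init__()
        self._in_title = False
        self.title = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def extract_title(html: str) -> str:
    parser = TitleExtractor()
    parser.feed(html)
    return parser.title.strip()

# Made-up page sources -- replace with your site's actual HTML.
pages = {
    "/post-a": "<html><head><title>10 SEO Tips</title></head><body>...</body></html>",
    "/post-b": "<html><head><title>10 SEO Tips</title></head><body>...</body></html>",
    "/post-c": "<html><head><title>Fixing Crawl Errors</title></head><body>...</body></html>",
}

counts = Counter(extract_title(source) for source in pages.values())
duplicates = sorted(title for title, n in counts.items() if n > 1)
print(duplicates)  # ['10 SEO Tips']
```

Any title appearing in `duplicates` is one you should rewrite so each post gets a unique, descriptive title.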
You’ve Been Penalized
A website penalty can occur for a number of reasons: a spammy link profile, buying links (link schemes), low-quality or thin content, cloaking, and/or sneaky redirects. If Google has hit you with a penalty for any reason, it will take its time re-indexing your site to make sure it’s indexing the correct pages.
Your Core Web Vitals Are Poor
Have you checked your page load speed and mobile-friendliness? If not, check them now.
Google will not bother to index a poor-quality site – it’s as simple as that.
Search for your website address on Google and look at the results that show up. Do they look appealing? Does your page load quickly? How is your mobile experience? Load time and mobile experience play a huge role in how Google ranks a web page, so make sure you’re giving it something worth ranking.
A blog post with a decent load time that is accessible from mobile devices is more likely to be indexed. Check both before you submit your page to Google, either manually or with tools like Google’s Mobile-Friendly Test and PageSpeed Insights.
Your User Generated Content is Spammy
Has your readership (that’s people) commented on the post? Google won’t index pages overrun with spammy user-generated content, such as comments.
It’s important to keep an eye out for this because if you receive several spammy comments or forum posts, it can take a very long time for your page to get indexed again.
Your Title Tag Doesn’t Reflect Your Content
If you’re publishing unique content but the title tag doesn’t reflect it, Google won’t know what the post is about or rank it above similar pieces of content. The easiest fix I’ve found is the Yoast SEO plugin: it tells me exactly how many characters I’m using and offers tips for improving my chances of ranking higher.
Your Site Has Been Hacked
If your site has been hacked, search engines will not index any of its pages until you have fixed the problem. Hackers can add redirects to your site that make a spammy link profile appear to come from your website, which usually results in an unnatural-links penalty. I recommend using Sucuri to check for this kind of thing.
Google Doesn’t Understand Your Content
If you’re using useful, readable language but your content still can’t be found in Google’s index, the search engine may not understand what your page is about.
Google has to understand who you are, and what the website is about in order to index and rank pages. The more content you create for a particular topic, the better Google will understand what your website is about.
Here are some quick tips for making your blog posts more SEO-friendly:
- Create unique content (no duplicates allowed).
- Make sure you’re using readable language that Google will understand.
- Submit your new post’s URL in Google Search Console (GSC), and check that it appears in the index after a couple of weeks.
- Submit your sitemap to Google Search Console.
- Check your load speed and mobile-friendliness regularly to keep your site ranking well.
- Keep an eye out for spammy user comments or forum posts, because these can slow down indexing.
- If you’ve been hacked, clean up the problem as soon as possible so Google doesn’t penalize your website for spammy links.
- Check your robots.txt file to be sure you aren’t blocking Google from crawling your website.
- Deploy an internal linking strategy with the help of a tool like Link Whisper.
- Do a Google search for your post’s title tag and make sure it isn’t an exact match with any other page on your site.
- Check your Core Web Vitals and make sure your page loads in 1-4 seconds.
- Use a mobile-responsive theme.