New URL Removal Option in Google WebmasterCentral

You already know to use robots.txt or robots meta tags when you want to keep pages from being indexed by search engines. But what do you do when you want to remove pages that are already indexed? Until now we used the same robots.txt or meta robots approach for that too, and in practice it often left both you and Google confused. It could take weeks or even longer to see any results, and the removal was not guaranteed across every data center. That was the past; Google has now announced a new tool to get it done more efficiently.
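Just for reference, here is roughly what that old-style exclusion looks like; the page and directory names below are only placeholders, not real paths.

    # robots.txt - keep crawlers away from a page or a whole directory
    User-agent: *
    Disallow: /private-page.html
    Disallow: /private-directory/

    <!-- or, inside the <head> of an individual page, a robots meta tag -->
    <meta name="robots" content="noindex">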

The new Google Site Owner Tool in WebmasterCentral

When you log in to your WebmasterCentral account today, you will see a new option under the ‘Diagnostic’ tab named ‘URL Removals’. Keep in mind that this appears only for verified site accounts.

[Screenshot: urlremoval_blogpost1.png]

What are the URL Removal Options available?

Once you click the ‘URL Removals’ option in the left panel under Diagnostic, you will see the following four options.

  1. Individual URLs: Choose this option if you’d like to remove a single URL or image. This will work only if at least one of the following is already in place (see the sketches above and after this list):
    1. The URL returns a status code of either 404 or 410.
    2. The URL is blocked by the site’s robots.txt file.
    3. The URL is blocked by a robots meta tag.
  2. Directories: Choose this option if you’d like to remove all files and folders within a directory on your site. For this you must first block that directory using your robots.txt file (see the sketch after this list).
  3. Entire Site: Choose this option only if you want to remove your entire site from the Google index. For this you must first block the whole site using your robots.txt file.
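To make those prerequisites a little more concrete, here is a rough sketch. The robots.txt lines cover the Directories and Entire Site cases, while the .htaccess line is one possible way (assuming an Apache server with mod_alias) to make an individual URL answer with a 410; the paths used are only placeholders.

    # robots.txt - block a directory, or the whole site, before requesting removal
    User-agent: *
    Disallow: /old-section/
    # use "Disallow: /" instead if you are removing the entire site

    # .htaccess (Apache mod_alias) - make a retired page return 410 Gone
    Redirect gone /retired-page.html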

Caution: Do not use this to remove the non-preferred version of your site. That is, if you want your domain indexed under its www version, do not use this option to remove the non-www version.

  4. Cached Copies: Use this when you want the page itself to stay listed but do not want a cached copy of it shown. This can be done in two different ways (see the sketch after this list):
    1. Meta noarchive plus a removal request: Add the meta noarchive tag to that page and request a URL removal using your Webmaster Central tool.
    2. A removal request without meta noarchive: This will make Google remove the cached version from its index for a minimum of the next six months. For this to happen, the live page must be different from the content in the cached copy.
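For the first of those two routes, noarchive is just another robots meta tag; a minimal page head could look like the sketch below (the title is, of course, a placeholder).

    <head>
      <title>Example page</title>
      <!-- keep the page indexed, but tell Google not to show a cached copy -->
      <meta name="robots" content="noarchive">
    </head>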

Re-Inclusion of removed URLs

If you want to get any removed URLs back into the Google index, you have the following options. Successfully removed URLs are listed under the Removed Content tab, and you can re-include them at any time simply by removing the robots.txt or robots meta tag block and clicking Reinclude (see the sketch below). Also, if Google finds that the block has been removed from your robots.txt file or the robots meta tag has been taken off when it revisits your pages after six months, the URLs will automatically get indexed again.
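As a small sketch of that first route, re-inclusion on your side is simply the reverse of the exclusion; the directory name is again just a placeholder. After that, clicking Reinclude in the Removed Content tab tells Google the block has been lifted.

    # robots.txt before re-inclusion
    User-agent: *
    Disallow: /old-section/

    # robots.txt after - the empty Disallow value lifts the block
    User-agent: *
    Disallow: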

A Word from the Author:

For the complete details, please read the official Webmaster Central Blog post about the URL removal request feature.
