You know to use robots.txt or robots meta tags when you want to exclude pages from being indexed by search engines. But what do you do when you want to remove pages that are already indexed? Until now we used robots.txt or meta robots for this too, and in practice it was confusing both for site owners and for Google: exclusions could take weeks or even longer to show results, and the results were not guaranteed across every data center. That was the situation until now; Google has announced a new tool to get this done more efficiently.
The new Google Site Owner Tool in Webmaster Central
Today when you log in to your Webmaster Central account, you will see a new option under the 'Diagnostic' tab named 'URL Removals'. Keep in mind that this appears only for verified site accounts.
What are the URL Removal Options available?
Once you click the 'URL Removals' option in the left panel under Diagnostic, you will see the following four options.
- Individual URLs: Choose this option if you'd like to remove a specific URL or image. This works only if the URL meets at least one of the following conditions:
- The URL returns a status code of 404 or 410.
- The URL is blocked by the site's robots.txt file.
- The URL is blocked by a robots meta tag.
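As a rough sketch, the robots.txt condition could look like this (the path /old-page.html is a hypothetical example, not from the original post):

```text
# robots.txt — block the specific URL you want removed
User-agent: *
Disallow: /old-page.html
```

Alternatively, the meta tag condition is satisfied by placing `<meta name="robots" content="noindex">` in the `<head>` of the page itself.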
- Directories: Choose this option if you'd like to remove all files and folders within a directory on your site. For this, you must first exclude that directory in your robots.txt file.
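A minimal sketch of the required robots.txt exclusion for a directory removal (the directory name /old-section/ is hypothetical):

```text
# robots.txt — block an entire directory before requesting its removal
User-agent: *
Disallow: /old-section/
```

Note the trailing slash, which limits the rule to that directory and its contents.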
- Entire Site: Choose this option only if you want to remove your entire site from the Google index. For this, you must first exclude the whole site in your robots.txt file.
Caution: Do not use this to remove the non-preferred version of your site. That is, if you want your domain indexed under its www version, don't use this option to remove the non-www version.
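For the entire-site option, the required robots.txt exclusion is a single blanket rule, sketched below:

```text
# robots.txt — block the whole site from all crawlers before requesting removal
User-agent: *
Disallow: /
```

Be careful: this blocks every compliant crawler from the entire site, which is exactly why it should never be used just to suppress a non-preferred www/non-www variant.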
- Cached Copies: Use this when you want the page to remain listed but don't want a cached copy of it shown. This can be done in two different ways:
- Meta noarchive and removal request: Add the noarchive robots meta tag to the page and submit a URL removal request in Webmaster Central.
- Without meta noarchive, removal request only: Google will remove the cached version from its index for a minimum of six months. For this to work, the live page content must differ from the cached content.
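For the first approach, the noarchive tag goes in the page's `<head>`; a minimal sketch:

```html
<!-- Keep the page indexed, but tell crawlers not to show a cached copy -->
<meta name="robots" content="noarchive">
```

To target only Google's crawler rather than all crawlers, `name="googlebot"` can be used instead of `name="robots"`.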
Re-Inclusion of removed URLs
If you want to get removed URLs back into the Google index, you have the following options. Successfully removed URLs are listed under the Removed Content tab, and you can re-include them at any time simply by removing the robots.txt or robots meta tag block and clicking Reinclude. Alternatively, if Google finds, when revisiting your pages after six months, that you have removed the robots.txt exclusion or the robots meta tag, the URLs will automatically be indexed again.
A Word from the Author:
For complete details, please read the official Webmaster Central Blog post about the URL removal request feature.