It is not always easy to identify the reasons behind poor website performance, as it requires a thorough website analysis based on various factors. An SEO expert can carry out this analysis for a fee, but there are some things that you can check yourself using this handy checklist.
I am not claiming this is the ultimate checklist, but if you follow it you may be able to identify at least a couple of errors that might be causing poor performance in search engines. The checklist also suggests solutions for each of the errors.
Here you go.
1. Canonical URLs
Open Google and do two searches ‘site:example.com’ and ‘site:www.example.com’. If you see different results for each search, you will need to correct this. The fix will make sure your pages are indexed correctly under a single URL, and will direct all your backlinks to the same point.
The fix requires coding and server management skills; if you do not possess these skills, enlist some technical support.
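On Apache, for example, one common fix is a 301 redirect from the non-www hostname to the www one in the site's .htaccess file. This is only a sketch, assuming mod_rewrite is enabled and using example.com as a placeholder for your own domain:

```apache
RewriteEngine On
# Redirect any request for example.com to www.example.com (301 permanent)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

If you prefer the non-www version as your canonical URL, simply reverse the hostnames; what matters is that you pick one version and redirect the other to it.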
2. Mixed Case or Special Characters in URLs
Avoid naming your files and folders using both upper and lower case – e.g. www.example.com/Category/Product_One.html – or including special characters – e.g. an apostrophe or a space. These practices are not recommended and could cause unnoticed errors in your server logs.
People normally use lower case when typing URLs or performing searches in a browser. Remember that Apache is case sensitive while IIS is not. In either case, mixing upper and lower case in URLs may lead to 404 pages or a duplicate content threat; neither is good for SEO.
If you are using mixed case URLs, change them to lower case. Ensure you redirect (301) any requests for the previous mixed case URLs to their corresponding new ones. The special characters also need to be removed and redirected to new URLs.
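On Apache this can be as simple as one Redirect directive per renamed page in .htaccess. A minimal sketch, using the hypothetical URL from the example above:

```apache
# 301 redirect the old mixed case URL to its new lower case equivalent
Redirect 301 /Category/Product_One.html /category/product-one.html
```

For a large site, a rewrite rule that lower-cases URLs automatically is more practical, but that requires server-level configuration rather than .htaccess alone.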
3. Session IDs or Dynamic URLs
Session IDs or Dynamic URLs cause a duplicate content threat, as well as incorrect page indexing in search engines. Implementing this fix is a bit technical; therefore it may be useful to enlist some help from an expert.
“Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.” – Google Webmaster Guidelines
If your website is hosted on Apache you can use the .htaccess file to rewrite your URLs.
For IIS server, Scott Guthrie of Microsoft describes various techniques for URL rewriting in .NET.
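As an illustration, here is a sketch of an .htaccess rule that strips a session ID from the query string with a 301 redirect. It assumes the session parameter is called “sid” (a hypothetical name – yours may differ), and note that the trailing ? drops the entire query string, so adapt it if your URLs carry other parameters you need to keep:

```apache
RewriteEngine On
# If the query string contains a session ID parameter (assumed here to be "sid"),
# 301 redirect to the same URL with the query string removed
RewriteCond %{QUERY_STRING} (^|&)sid= [NC]
RewriteRule ^(.*)$ /$1? [R=301,L]
```

A cleaner long-term solution is to stop emitting session IDs in URLs for bots in the first place, e.g. by using cookie-based sessions.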
4. Broken Links
Broken links are an indication of a poorly managed website and contribute to low rankings in search engine results.
To track broken links you can use a free tool called Xenu Link Sleuth. Download and install it on your machine and crawl your website; the broken link report will list the URLs that return a 404 status code together with the live pages that still link to them.
Ask a web developer to help fix the broken links if you cannot do this yourself.
5. Improper 302 Redirects
Using the Xenu Link Sleuth tool you can also trace the redirected pages within your website. Look for any 302 temporary redirects implemented on your site; temporary redirects may cause a duplicate content threat, reducing your content’s uniqueness.
Make sure your current 302 redirects are genuinely temporary; if they are not, change them to permanent (301) redirects.
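On Apache the difference is one status code in the Redirect directive. A sketch with hypothetical page names:

```apache
# A 302 (temporary) redirect - only appropriate when the move really is temporary
Redirect 302 /summer-sale.html /index.html

# A 301 (permanent) redirect - tells search engines to index the new URL instead
Redirect 301 /old-services.html /seo-services.php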
6. 404 page requests
A large number of 404 page requests could be an indication that something is wrong. It could be that URLs have been removed from the server without having proper redirects in place to their current corresponding pages.
Google Webmaster Tools reports the 404 errors its crawler encounters, and your server logs will show the rest. Once you have the list, redirect (301 permanent) each old URL to its corresponding new page.
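In .htaccess this again comes down to Redirect directives. A sketch, with hypothetical paths standing in for your own removed URLs:

```apache
# Redirect an individual removed page to its replacement
Redirect 301 /old-products.html /products.html

# Redirect an entire removed directory in one rule
RedirectMatch 301 ^/old-category/(.*)$ /category/$1
```

RedirectMatch is handy when a whole folder has moved, as it maps every URL under the old path to the equivalent URL under the new one.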
7. Indexable Content
No matter how much time, money and effort you have spent on your website design, it is wasted if the design hides important content from search engines. Flash and other Rich Internet Applications can unfortunately do exactly that. Search engines like textual content, so you should ensure your website content is search engine friendly.
Install the Yellowpipe Lynx Viewer plugin in your browser. You can then right-click on any of your pages and choose the Yellowpipe Lynx Viewer option to see exactly how a text browser displays that page. Search engines see your page much as a text-based browser does.
If you see your page content is not search engine friendly, fix it.
8. Duplicate Title and Meta Description
Open Google.com, do a search for site:yourdomain.com and scan through the titles listed. Are many of them identical? This is another common mistake that causes poor rankings in Google.
Ensure your website pages have unique Title and Meta description tags, relevant to the page and containing its main targeted keywords.
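For example, a services page might carry tags like these (a hypothetical illustration – substitute your own keywords and wording):

```html
<!-- Each page gets its own unique title and description -->
<head>
  <title>Professional SEO Services | Example Company</title>
  <meta name="description" content="Affordable SEO services for small businesses, including keyword research, on-page optimisation and link building." />
</head>
```

Keep the title focused on that page’s main keyword rather than repeating a site-wide slogan on every page.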
9. Duplicate Content
Sometimes your content gets scraped by another website which then outranks you in search results. Sometimes you have duplicate content within your own website. Either way, you need to ensure your website content is unique.
CopyScape will find duplicate copies of your pages that may exist on the web; you can then take the necessary copyright action to get them removed. The free version allows 20 searches per month. Another way to find duplicate content is to take a random section of text from your website and search for it in Google inside quotation marks (“ ”). This will give you a list of pages that contain the same content.
Duplicate content within your own website needs to be rewritten.
10. Image Links Missing Alt Texts
You may be aware that search engines cannot read the content inside an image. If possible, try to use text instead of images to display important names, content, or links. Still, you may be forced to use images for your navigation to keep your template rich.
Always use proper alt texts for your images, especially when they are used as links. Don’t stuff your entire keyword list into them; just use a relevant description containing the main targeted keyword for the destination page.
e.g. <a href="/seo-services.php"><img src="/images/seo-services.jpg" alt="Professional SEO Services" /></a>
11. Typo / Spelling Mistakes
Typos and poorly written content make a bad impression on both search engines and visitors.
Don’t forget to spell check! There are many tools available – paid and free, some downloadable and some web based. Alternatively, hire a content writer or ask someone to proofread your content.
A Word from the Author
If you like this post, you may also like the upcoming ones. Feel free to subscribe to the feed or bookmark it at your preferred location. Please use the comments section to discuss this post; your thoughts are always welcome.