16 Things to Look for in a Website Health Check

Posted by Narendra Dhami on July 1, 2008


Seeing as I work for a company that holds no. 1 rankings for some of the most competitive keywords in the world and drives millions of search engine visitors to our clients every month, I thought it was time I did some more SEO-related posts.

These are mostly my personal criteria, but I wanted to share the things I look for when doing an SEO health check for others. Hopefully you’ll find this a useful resource and be able to improve your own search engine traffic by working through the checks.

On-Site Website Health Checks
1 Non-WWW to WWW Redirect – This is a redirect of your domain from the non-www version (http://viperchill.com) to the www version (http://www.viperchill.com), or vice versa. Not only is this good for usability, it means any links to your site can pass full value to the domain you want to use. For tips on how to do this, check out this article at Stepforth.

If this is not in place, you could be missing out on some link juice and have a form of duplication issue with your website known as canonicalisation. You can also choose your preferred version in Google Webmaster Tools if you have that set up for your website.
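
For example, on an Apache server with mod_rewrite enabled, a couple of lines in the site’s .htaccess file handle this; a minimal sketch, using viperchill.com as the stand-in domain:

    # Send non-www requests to the www version with a 301 (permanent) redirect
    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^viperchill\.com$ [NC]
    RewriteRule ^(.*)$ http://www.viperchill.com/$1 [R=301,L]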

2 Page Titles – All page titles on your site should be unique; they are thought of as the most important on-site SEO factor for helping pages rank. Unique doesn’t mean you can put just anything in them, though: include keywords which are relevant to the page and also make sure the titles are attractive enough for people to click through from search engine results.

Try to have the main keywords at the start of the title and as little branding as possible.
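
As a hypothetical illustration (the site and keywords are made up), compare a brand-first title with a keyword-first one:

    <!-- Vague and brand-first: wastes the most important on-site factor -->
    <title>Acme Ltd | Products | Page 1</title>

    <!-- Keyword-first and descriptive, with branding kept to the end -->
    <title>Leather Office Chairs - Buy Online | Acme</title>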

3 Multiple Homepages – This is a problem I see time and time again when doing health checks: companies tend to have multiple versions of their homepage. When it’s not an issue with the www redirect, it’s the logo linking to a homepage URL ending in /index.php. Check that there are not multiple versions by seeing where all ‘Home’ links point.

If there are multiple versions, just 301 (permanent) redirect the file back to the domain root.
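
On Apache, a sketch of that redirect in .htaccess (example.com is a placeholder; adjust the file name if your CMS uses something other than index.php):

    RewriteEngine On
    # Only match direct requests for /index.php, so the CMS's own internal
    # rewrites to index.php don't cause a redirect loop
    RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.php [NC]
    RewriteRule ^index\.php$ http://www.example.com/ [R=301,L]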

4 Duplicate Content – This is a very common issue with many websites and can be tricky to deal with. At the ridiculous end of the scale, I’ve seen over 100 well-established sites with the same article on their websites, and of course only one of them is ranking highly for the information. Another example I’ve seen is a property site that had a different contact form for each of its hundreds of listings, which meant some serious duplication throughout the site.

Although the effect of duplicate content is debatable, even the possibility of a penalty, or simply of those pages being removed from the index, is bad enough. This is even more likely to happen if you are taking content from other sites and putting it on your own.

5 Linking Structure / Sitemap – It is important (especially for large websites) to have a good linking structure within a site to help all pages get indexed and make sure that the search engine spiders can find them. Sitemaps are not only good for your users; they are a great way to show off a big portion of your content.

Of course, with a site that has thousands of pages this isn’t as easy as a 10-page setup, so make sure pages are linked to using things like breadcrumbs, footer navigation and in-content links.
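
Alongside an HTML sitemap for visitors, an XML sitemap in the sitemaps.org format gives spiders a direct list of your pages; a minimal sketch with placeholder URLs:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2008-07-01</lastmod>
      </url>
      <url>
        <loc>http://www.example.com/guides/seo-health-check.html</loc>
      </url>
    </urlset>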

6 Robots.txt – Check whether the site has a robots.txt file and see if it is blocking any pages. There are probably a few pages which don’t need to be indexed and are wasting the link juice of the website. Once you have identified these pages, list them in a robots.txt (text) file at the root of the site by following these tips.

When checking this across a lot of sites, the main issues were that they either didn’t have the file at all or were actually blocking pages they probably wanted indexed.
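
A robots.txt file is just a plain text file at the root of the site (http://www.example.com/robots.txt); the paths below are made-up examples:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /search/
    Disallow: /print/

    # Careful: a single slash would block the entire site
    # Disallow: /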

7 Image Optimisation – Another area where most sites (including this one) could improve is image optimisation. This site gets hundreds of visitors per month just from Google Images, thanks to ranking highly for some website logos. Give images a descriptive file name and also alt text so people who don’t load images know what the graphic is about.

Many sites fail to optimise their images in any way, so this is one change you can make that, in time, will have a very positive effect.
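
A before-and-after sketch (the file names are invented):

    <!-- Tells search engines nothing -->
    <img src="/images/img0001.jpg">

    <!-- Descriptive file name plus alt text -->
    <img src="/images/red-leather-office-chair.jpg"
         alt="Red leather office chair, side view">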

8 Broken Links – An ex-colleague of mine used to say that ‘Xenu is my Homeboy’. Xenu Link Sleuth is a great site-checking tool that shows whether pages of a site link to pages that no longer exist. On a very big site it can take hours to run, but it is definitely worth it.

Broken links mean you are passing PR to pages that don’t exist, and they are bad for usability, since your visitors end up clicking links that lead nowhere. The tool is also great for finding duplicate titles and descriptions.

9 Redundant / Minimal Pages – This is very typical of ecommerce websites and is something I always like to minimise for clients. It means pages that have very little content, or at least very little content that is unique or original.

This can include contact pages, product pages on ecommerce sites, or any other pages. If you are using a text-heavy template, the boilerplate can easily outweigh the unique content on each page. You can either block these pages from search engines, if you don’t need them to rank, or beef up the content a bit.
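
If beefing up the content isn’t practical, one way to keep a thin page out of the index is a robots meta tag in its head; a sketch:

    <!-- noindex keeps the page out of results; follow still lets spiders
         pass through its links -->
    <meta name="robots" content="noindex, follow">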

10 Meta Information – Meta information means the description and keywords tags that you can find in the head of a page. These aren’t as important for SEO as they used to be, but they should definitely be unique where possible.

Remember that, like the title tag, the meta description is usually the snippet shown when you appear in search engine results, so you want it to be relevant and non-spammy. As for keywords, I would use no more than 10 per page.
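
A sketch of both tags for a hypothetical product page (the wording is made up):

    <meta name="description" content="Compare leather office chairs by price, size and warranty. Free delivery on orders over £50.">
    <meta name="keywords" content="leather office chairs, office chair reviews, ergonomic chairs">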

11 Text / Image Links – A large newspaper we work with came to us with its whole navigation in images; it didn’t take much for them to change this to text while keeping the same look as the original images. In my view, although I don’t have evidence, text links pass more value to pages than image links, and of course they let you increase your internal anchor text.

If this isn’t an option, you can add text links to the footer of the site and nofollow the image links in the header to get a similar effect.
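
A sketch of that fallback (the URLs and anchor text are invented):

    <!-- Header: image link kept for the design, but nofollowed -->
    <a href="/services/" rel="nofollow"><img src="/images/nav-services.gif" alt="Services"></a>

    <!-- Footer: plain text link that passes keyword-rich anchor text -->
    <a href="/services/">Web Design Services</a>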

12 H1, H2 Tags – Make sure the site is making proper use of H1 and H2 tags: H1 for the main heading of the page, which will usually be similar to the title, and H2 for subheadings. These can still be styled to look ‘normal’, so the text doesn’t have to be large.

The boost this gives might not be huge, but I do believe there is one, and headings help search engines determine which words on the page are important and what it should be ranking for.
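
A sketch of headings styled down to ‘normal’ size (the sizes are arbitrary):

    <h1>Leather Office Chairs</h1>
    <h2>Choosing the right size</h2>

    /* In the stylesheet: the headings keep their meaning but lose the bulk */
    h1 { font-size: 1.2em; margin: 0 0 0.5em; }
    h2 { font-size: 1em; }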

13 404 Page – Check that 404 pages (for pages that don’t exist) show links to other areas of the site, and even include a search box if possible. Once again this is not only good for usability, as it keeps people on the site, but it also gives search engine spiders somewhere to go once they have ended up on one of these pages.

SEOmoz highlighted Apple’s 404 page as a good example of this.
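
On Apache, pointing the server at a custom 404 page takes one line in .htaccess (the file name is an example):

    # Use a local path: a full http:// URL here makes Apache issue a
    # redirect instead of a proper 404 status
    ErrorDocument 404 /404.html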

14 Page URLs – A lot of sites I’ve worked with have some terrible URLs. I’ve seen everything from 20 zeros in a row to multiple ID parameters. In my opinion, URLs should help people see what a page is about, i.e. include relevant keywords, and should also be ‘clean’, without a lot of junk in there.

Most CMSs aren’t search engine friendly when it comes to URLs, but there is probably a way to rewrite them without much difficulty. If you are using a Linux server, take a look at this great guide on mod_rewrite.
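
As a sketch of the idea on Apache (the parameter name and URL pattern are hypothetical), mod_rewrite can map a clean, keyword-rich URL onto the ugly query string the CMS actually understands:

    RewriteEngine On
    # /products/red-leather-chair/ -> /product.php?slug=red-leather-chair
    RewriteRule ^products/([a-z0-9-]+)/?$ /product.php?slug=$1 [L]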

15 Using Nofollow – In order to protect the PR and link juice of a site, I recommend using the nofollow attribute when linking to the likes of:

* RSS Feeds
* Other sites that might not be trustworthy
* Pages on their site that search engines don’t really need to see (Contact / TOS / Abuse pages)

You can also recommend some advanced nofollow techniques, such as nofollowing the ‘Home’ link and instead linking to your homepage from the footer with the anchor text you are trying to rank for, which increases internal anchor text. Personally, I’ve found this to be very effective.
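
In the markup, nofollow is just a rel attribute on the link; a sketch covering the cases above (the URLs and anchor text are invented):

    <a href="/feed/" rel="nofollow">RSS Feed</a>
    <a href="http://example.com/" rel="nofollow">An untrusted site</a>
    <a href="/contact/" rel="nofollow">Contact</a>

    <!-- Footer: homepage link carrying the anchor text you want to rank for -->
    <a href="http://www.example.com/">Web Design Services</a>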

16 Source Code – Although this is pretty obvious, you may find some surprises with a site just from really checking its source code. For example, one site I’ve worked with had a robots instruction telling spiders to come back only every two days, which we made sure they removed (they published around 50 articles per day). There may also be other issues, such as hidden text or links, or unnecessary code that is slowing the site down.
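
I don’t know exactly which instruction that site used, but the classic culprit is the (largely ignored) revisit-after meta tag; things like these are worth scanning the source for:

    <!-- Asks spiders to come back only every two days: remove this on a
         frequently updated site -->
    <meta name="revisit-after" content="2 days">

    <!-- Hidden text: a red flag for search engines -->
    <div style="display:none">keyword keyword keyword</div>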

In Summary
There are probably quite a few more checks people run when going through their own sites or working with clients, but I think I’ve covered the most important ones here. I recommend that everyone add their site to Google Webmaster Tools, as you can get some great information on what Google has found about your site and even see which keywords you are ranking for.
