Matthew J. Kellett

Website Architect and Developer

Website Coding Standards

With so many web development agencies emerging today, I found myself doing some research into how a few of these agencies (and freelancers) carry out the art of website production and how they manage to differentiate themselves in such a large crowd.

So after a couple of hours running Google searches, I came up with a list of agencies and freelancers that appeared to stand out on merit alone, that is, without the assistance of Google AdWords.

As I slowly began looking through each of the agency portfolios, a couple of things struck me:

  • All of the sites were very graphical
  • Very few of them were free of coding errors (only 7 of the 50 I checked)

Now you may wonder why I decided to look into this ... well, the answer is simple: I have been looking back through some of the articles I have previously written, and it got me wondering about the current state of things on the internet.

I have always believed that every website should be accessible to anyone who wants to view it, but more recently I have noticed that the standards applied to sites these days tend to be slipping, with more and more sites making use of Ajax and JavaScript with little regard for those users without it. I know all modern browsers come with JavaScript enabled these days, but the fact remains that there is an ever-growing number of people who cannot use conventional browsers for one reason or another.

By far the easiest way to ensure that sites are accessible to at least the majority is to build your site with one of the W3C standards in mind as well as testing it without JavaScript to see how well it functions.
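To illustrate what I mean by testing without JavaScript, here is a small sketch of the unobtrusive approach (the file names and element IDs are just placeholders): the link works as an ordinary link when JavaScript is off, and the script simply layers the Ajax behaviour on top when it is available.

```html
<!-- A plain link that works with JavaScript disabled: it simply
     loads comments.html like any other link. -->
<a id="show-comments" href="comments.html">View comments</a>
<div id="comments"></div>

<script type="text/javascript">
// Progressive enhancement: only when JavaScript (and XMLHttpRequest)
// is available do we intercept the click and load the page inline.
var link = document.getElementById("show-comments");
if (link && window.XMLHttpRequest) {
    link.onclick = function () {
        var xhr = new XMLHttpRequest();
        xhr.open("GET", this.href, true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                document.getElementById("comments").innerHTML = xhr.responseText;
            }
        };
        xhr.send(null);
        return false; // suppress normal navigation only when Ajax is used
    };
}
</script>
```

Users without JavaScript follow the link as normal; everyone else gets the enhanced behaviour, and neither group is locked out.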

Based on this research, it appears that almost 70% of the sites built by these design agencies contained at least one major coding error and 10-15 warnings.

Now I realise this is quite a bold statement to make considering the sheer number of agencies and freelancers out there, and I am expecting a few disagreements, but I am well aware that there are always exceptions to these findings.

This research has led me to believe that it has become more important to get sites out of the door as quickly as possible, regardless of whether customers believe they are getting the quality they should expect.

On the back of these findings I decided to ask myself the following question:

How hard is it these days to create a semantically valid website?
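For what it's worth, the bar really isn't high. Here is a minimal page that passes the W3C validator (the title and text are just placeholders); the essentials are a correct doctype, a declared language and character encoding, and headings, paragraphs and lists used for what they actually are:

```html
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
    "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
    <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
    <title>A minimal valid page</title>
</head>
<body>
    <h1>A minimal valid page</h1>
    <p>Headings, paragraphs and lists used for what they are.</p>
    <ul>
        <li>One item</li>
        <li>Another item</li>
    </ul>
</body>
</html>
```

Everything beyond this skeleton is just a matter of keeping the same discipline as the page grows.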

Given the toolsets that most developers have in their arsenal these days, and based on my own experiences (what else is there?), is it really that hard to produce semantically valid code? Most developers will probably use Firefox (I prefer Opera personally, but that's a discussion for another time) and make use of the wide range of add-ons readily available to assist them.

The following is a small list of the main ones I use myself for debugging and ensuring all my pages are semantically valid:

  • Firebug / Dragonfly (Opera) - Probably the most useful for debugging HTML and CSS
  • Web Developer Toolbar - A toolbar which provides plenty of tools for ensuring your sites are accessible
  • HTML Validator - Puts a status flag in the browser to tell you whether the page you are looking at is semantically valid based on the doctype and content of the page
  • YSlow - A useful add-on for Firebug which gives your pages a performance rating

Simply by checking with a couple of these tools before you release a site, you can easily tell whether your pages conform to one of the W3C standards. So the next time you are about to release a site, endeavour to make it pass, and together we can improve the quality of the internet little by little.
