The language used to describe websites, HTML, has been heavily abused over the last decade and a half. The people who built the first websites made plenty of mistakes, and the result was pages that looked different – or broke outright – from one browser to the next.

Poorly coded sites wouldn’t just look ‘odd’ in early browsers. Internet Explorer versions 1 through 4 along with early versions of Netscape would routinely crash if something wasn’t right.

Natural selection took hold – browsers were judged ‘good’ if they coped with more sites, even though it was the sites themselves that were broken. The closest most sites came to quality control was a ‘best viewed in Internet Explorer’ logo.

Gradually, browsers became better at turning rubbish HTML into viewable sites, without crashing. This gave Web designers the green light to become ever more lax about quality control. It all became a vicious circle.

Enter the standards

Actually, the Web has always had standards, but for most web developers it takes an ‘ah-ha’ moment before the proverbial light bulb starts to glow.

For me, it was a particularly tricky site I was developing back in 2003 that simply wouldn’t work across different browsers. It took a while to get things working – the learning curve isn’t particularly steep, but we were using early .NET controls, and thanks to Microsoft these weren’t standards compliant in the first place!

The trouble is that, to the naked eye, a standards-compliant site looks much the same as a non-compliant one. You can spot the difference by running a page through the W3C’s validator, but that’s not something you’ll do every day. And therein lies the problem.
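To see what the validator is actually checking for, here is a sketch of a minimal document that should pass validation – assuming you’re targeting HTML5; older doctypes such as HTML 4.01 or XHTML 1.0 have stricter and slightly different requirements:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- A declared character encoding and a <title> are both required for a valid page -->
  <meta charset="utf-8">
  <title>A minimal valid page</title>
</head>
<body>
  <h1>Hello, standards</h1>
  <p>Every tag opened is closed, and every element sits where the spec allows it.</p>
</body>
</html>
```

Most real-world validation failures are small things like a missing doctype, an undeclared encoding, or unclosed tags – exactly the mistakes early browsers crashed on.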

Enter the excuses

We hear these excuses all the time – often from ‘ex-’ or ‘soon-to-be-ex-’ web designers.

  • “No-one will ever know”
  • “There are plenty of other sites which aren’t compliant…”
  • “The BBC website isn’t standards compliant”
  • “It doesn’t matter”
  • “It’s not important”

When we first heard these excuses, the temptation was to try to defend our way of thinking for its own sake. Then the penny dropped – we’ve all heard these kinds of excuses before.

These are exactly the excuses used by the dodgy car mechanic or plumber on the TV exposé. This is the dodgy builder, confronted by the master craftsman at the end of the episode, trying to justify why he skipped something important. This is amateur-think.

Web design as a craft

We’re not alone in designing sites which tick all the boxes, and certainly not alone in our thinking. According to Jeffrey Zeldman, a client who saves $5,000 by buying cut-rate, non-semantic HTML will later spend $25,000 on SEO consultants to compensate.
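Zeldman’s point about non-semantic HTML is easy to illustrate. The two fragments below can be styled to look identical in a browser, but only the second tells search engines and assistive technology what the content actually is (the class names in the first are invented for illustration):

```html
<!-- Non-semantic: meaning carried only by presentation -->
<div class="bigBoldText">Our services</div>
<div class="listItem">Web design</div>
<div class="listItem">Search engine optimisation</div>

<!-- Semantic: meaning carried by the markup itself -->
<h2>Our services</h2>
<ul>
  <li>Web design</li>
  <li>Search engine optimisation</li>
</ul>
```

A crawler can tell that the second fragment is a heading followed by a list; the first is just anonymous boxes of text – which is where the SEO bill comes from.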

There are, however, no laws on how websites should be built – no rules and regulations – and plenty of freeware applications can turn John Doe into a fully qualified Web designer overnight.

Standards separate the wheat from the chaff – the people who genuinely care about doing good work for their clients versus those who, frankly, couldn’t care less. They are what define the true craftsman.

We know that the BBC website doesn’t comply – it’s a vast site, contains masses of legacy technology, and frankly they probably don’t have an SEO or browser compatibility problem. St Paul’s Cathedral probably doesn’t have a damp-proof course, but that doesn’t mean the new extension on your home shouldn’t have one.

The whole world runs on standards. It’s a free world, and Web designers are free to choose their own path – but those who do their best to follow standards aren’t creating problems further down the line. And, of course, we’ll all know that you’re trying to do a good job.