Boring post alert. In fact extremely boring post alert. I’ve just read this through and realise this is probably the most boring post I have written so far. I’ll leave it up so that I have a standard to measure my ennui against.
I’ve been diminished on the humour front and am distracting myself by thinking about organisational issues (the subject of this post), because David has just required another feature in the software. Even though the code base was supposedly frozen. So the product release is going to be delayed a bit longer. The interface is being changed as well. Instead of creating pictures, they’re providing ambiguous text comments – because it’s so quick to dash off an email saying “move that button over to the left a bit”. Create a picture, fellows. Yes, it takes a little while (not long, honest), but it’s one person’s time. And it will save the other person’s time trying to decipher your comments, and your own time when you have to explain them in greater detail and then come over and waste their time while you show them exactly what you meant. Draw the picture – or even just print a screenshot and scribble on the damn thing.
OK. That’s the end of the software process rant. Now there is a bit of uninspired writing about the individual/system model of error.
I’ve just received a book that I ordered: “Managing the Risks of Organisational Accidents” by James Reason. James Reason is famous in the world of ergonomics and safety management, especially for the Swiss Cheese model. (Swiss cheese, in this case, referring to the sort with holes in it rather than any other type of cheese from Switzerland. If you fancy imagining it as a series of bubbles in a fondue, I won’t stop you.)
I have no specific reason to buy this book, but I wanted to re-read it, and that seemed the easiest way of doing it. (Given that for some reason it’s not available next to Ian Rankin in the local library.) I was looking for a diagram (and I can’t remember if it is in this book or one by Sidney Dekker) about the need for lots of little accidents to maintain safety. If you have a spotless record, you are more likely to have a catastrophic accident, as you tend towards riskier and riskier behaviour, cutting corners, because it has all been fine so far. Reason (or Dekker) describes it as those moments when you veer away from driving in the centre of your lane and are brought fully back to attention by a near miss, or the juddering as you go over cat’s eyes. (I’m paraphrasing here, because I haven’t found the specific paragraph yet.) So it is that grab at your attention that ensures you don’t drift off to sleep and find yourself meeting a wedding party coming the other way, with extremely unfortunate consequences for the bridal wear.
What is most intriguing about Reason’s books (I’m going from Human Error to The Human Contribution) is the movement of the balance point between individual responsibility and organisational/system responsibility. This is a non-trivial question, and goes to the heart of many people’s beliefs, politics et cetera. How much do you control your actions, and how much does your environment control you? It can go back to the nature/nurture debate, or even the predestination/free will question.
One of the things you get taught to do when considering UI design is to make it easy for someone to take a specific path. You provide them with clear signposts. Some systems (notably IKEA store design) will make it significantly more difficult to take a route that is not the one intended by the designers. Others (such as some road signage) will merely make it more obvious to take one route rather than another. If you are following the road signs to Cambridge station, and they direct you round the ring road rather than through the town, are you making an active choice to follow that route? If the signposted route means that you avoid a notorious accident blackspot, are the accidents that don’t occur a consequence of the signage? Is anyone ever aware that their life may have been saved by an accident that didn’t happen?
What about if the signs take you through that accident blackspot?
Who is responsible for the accident that occurs? The council that accepted the poor road design, the drivers who weren’t concentrating, or the person who set up the signage that sent people there straight off the motorway? People tend to get stuck on the last part of the chain of factors – like a game of “touch last”. In fact, we are all playing Jenga. The same action that was safe when there was a complete row of blocks beneath your block is no longer safe. The action itself hasn’t changed, but the circumstances around it have.
So to go back to my original position, I haven’t changed, but the circumstances surrounding me have. For example, the cat is now within strike distance.