If Excel were a footballer it could be forgiven for running around and pulling its shirt up ostentatiously to reveal the slogan “Why always me?” It now seems to be the software application of choice whenever a scapegoat is required for a significant financial error. In view of that, I am amazed at how few stories do emerge about critical errors in Excel, or any other spreadsheet, says FSN writer and spreadsheet guru Simon Hurst.
I’m certainly not trying to claim that errors in spreadsheets don’t exist or that they are not significant, but when you consider the general prevalence of Excel, its use for complex and business-critical financial projects and its inherent lack of structure and control, it surprises me that they don’t surface more often. Anyone who doubts the propensity of spreadsheets to be implicated in career-wrecking errors needs only to look at the website of the European Spreadsheet Interest Group (EuSpRIG) and, in particular, its horror stories.
Admittedly, in many of the cases reported, it is Excel that stands over the victim, still-hot smoking gun in hand, but often Excel is little more than an innocent bystander, suffering the rap for something that could well have happened with or without its involvement. I’ve not seen a European Word Processing Risk Interest Group (EWPRIG?), nor do I remember anyone blaming Microsoft Word for the critical absence of the word “not” in a key legal contract. So, why always Excel?
I’ve already alluded to some of the reasons why Excel plays a part in a variety of financial disasters. The ratio of hours spent by finance personnel immersed in Excel compared to any other single application is probably huge so, on that basis alone, we should expect Excel to be responsible for more errors than anything else. The nature of the projects that Excel is used for makes the situation far worse. By its nature it is likely to be used for less well-structured and less controlled tasks than ‘normal’ applications. Added to that, conventional software applications can cope with the straightforward calculations and reports themselves – it’s the more difficult areas that Excel is usually called upon to sort out.
The roots of Excel’s weakness are firmly entwined in its greatest strength: its lack of structure. We use Excel because we need to escape the confines of structured, controlled, ‘proper’ applications. We bemoan Excel’s lack of structure when it aids and abets us in making errors.
There is a key paradox at the heart of the Excel discussion. The academics tell us, and they have persuasive evidence to justify their claims, that around 90% of Excel spreadsheets of a reasonable degree of complexity contain not just errors, but ‘significant’ errors. If this translated into 90% of decisions underpinned by spreadsheets being wrong, then surely Darwinian evolution would have rendered Excel-using companies extinct or, at the very least, reduced the number of Excel-using personnel. I’m not sure I see much evidence of this decline – perhaps everyone is up the same evolutionary cul-de-sac. On the other hand, if 90% of spreadsheets are significantly wrong but there are far fewer spreadsheet-influenced wrong decisions, what is the mechanism that turns wrong spreadsheets into right decisions?
Unlike the academics, I have nothing except anecdotal evidence to support my view, but the most likely explanation seems to be that spreadsheets are playing a supporting role to professional judgement, with that judgement overriding all those Excel errors. This would help explain why many Excel disasters occur when the human-spreadsheet bond is broken – perhaps by holiday or sickness.
This certainly doesn’t mean we should stop worrying about Excel or ignore the valid warnings from the likes of EuSpRIG. There are still more than enough opportunities around for Excel to ruin your career or wreck your company. But perhaps we need to refocus our attention. EuSpRIG have been warning us for years about the dangers without, it appears, making the crucial breakthrough in changing either how much Excel is used or the percentage of incorrect spreadsheets. It seems that the 90% error message is not being heeded, possibly because the ‘Excel paradox’ leads people to believe that the analysis must be incorrect, or is in some way not relevant to them. While people might not recognise the 90% error issue, most of them will recognise the amount of time that Excel consumes in their organisation. Many will go further and recognise that an element of that time is wasted: in rectifying problems that could have been avoided with better practices and standards, in using spreadsheets inefficiently, or in using spreadsheets when more suitable applications or utilities are available.
Rather than refining the accuracy of the ‘90% of spreadsheets are wrong’ figure, I’d like to see some valid research into how much time spreadsheets waste, just through inefficiency rather than through catastrophic error. Is anyone aware of any such research?