by Flemming Funch
Peter Coffee has a nice article in eWeek, "Spreadsheets: 25 Years in a Cell", about how our use of certain software tools shapes our behavior, and how different tools set us up for different assumptions, different fallacies, and different ways of wasting our time.
Some companies have forbidden PowerPoint presentations in meetings, because people often spend a considerable amount of time making their presentations look impressive with graphical effects, and that time is rarely productive. It can take ten times as long to say the same thing, and the effects might just make it less clear what is actually being said. A simple set of bullet points is often clearer.
And then there are spreadsheets: "There are two ways that spreadsheets, as we know them, distort our thinking and lead to bad decisions. The first distortion is the use of point values and simple arithmetic instead of probability distributions and statistical measures. So far as I know, there's no off-the-shelf spreadsheet product—certainly none in common use—that provides for input of numbers as uncertain quantities, even though almost all of our decisions rest on forecasts or on speculations.
There are add-on products that incorporate uncertainty into spreadsheets, and many of them are quite good. Products of this kind that I've favorably reviewed over the years include DecisionTools Pro from Palisade and Crystal Ball Professional and CB Predictor from Decisioneering. It's not too hard to appreciate the difference between products that incorporate uncertainty and those that don't: On the one hand, you've got, "We predict a $1 million profit in the first year"; on the other, "The expected Year 1 profit is $1 million, but there's a 30 percent chance of losses for the first two years." These different statements will lead to quite different discussions."

We assume that because we can put some numbers in a spreadsheet, they somehow become more real and certain. And since spreadsheet programs typically have no good way of representing the actual uncertainty, we skip over the subject. And it isn't enough that our tool allows us to represent several different alternative scenarios: "The subjects whose tools invited them to imagine alternative scenarios believed they were doing a better job—even though statistical measures of their results showed no improvement in the actual quality of the forecasts. Those subjects did, however, take longer to perform the task. Isn't that the worst nightmare of those who must justify IT's return on investment—spending extra money on a more time-consuming product that yields absolutely no measurable improvement?"

It is a bit of an embarrassing secret that many of our computerized tools simply allow us to waste more time on making it look like we know what we're talking about, covering up the fact that we really don't.
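Coffee's profit example is easy to make concrete. Here's a minimal Monte Carlo sketch (my own illustration, not from the article); the revenue and cost figures are hypothetical assumptions, picked so that the point estimate comes out near $1 million while a loss remains quite possible:

```python
import random

# Hypothetical model: Year 1 revenue is uncertain (normal, mean $5M,
# std. dev. $2M); costs are fixed at $4M. These figures are assumed
# for illustration, chosen so expected profit is about $1 million.
N = 100_000
total_profit = 0.0
loss_runs = 0

for _ in range(N):
    revenue = random.gauss(5_000_000, 2_000_000)
    profit = revenue - 4_000_000
    total_profit += profit
    loss_runs += profit < 0  # count simulated years that end in a loss

print(f"Expected Year 1 profit: ${total_profit / N:,.0f}")
print(f"Chance of a Year 1 loss: {loss_runs / N:.0%}")
```

A single spreadsheet cell showing "$1,000,000" and a distribution showing roughly a 30 percent chance of loss describe the same forecast, but they invite very different decisions.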
We can get better tools, of course. Tools that better represent uncertainties and that show our assumptions more clearly. That's certainly an avenue to pursue. But we also need to develop our own built-in bullshit detectors, so that we can stay more conscious of the assumptions and fallacies inherent in what we're looking at, no matter how pretty and scientific it looks.