Glass, Robert L. Computing Calamities: Lessons Learned from Products, Projects, and Companies That Failed. Upper Saddle River, NJ: Prentice Hall, 1998.
"While there are many reasons for taking an interest in computing failures, one of the most compelling in my mind is that failure is an excellent teacher. In the computer industry, however, the tradition of learning from failure is not as well established [as in other disciplines]."
This book documents numerous stories of failure from around the computer industry, providing some insightful post-mortem critiques and suggesting what lessons can be learned from the mistakes of others.
Glass begins with a list of prominent examples:
- In the mid-1980s, a software bug caused a medical device to deliver overdoses of radiation to six cancer patients, killing three of them.
- Between 1986 and 1997, more than 450 reports were filed with the U.S. Food and Drug Administration concerning software defects in medical devices.
- During the Gulf War in 1991, a bug in a Patriot missile's targeting software allowed an Iraqi Scud missile to hit the barracks of American servicemen, killing 29 Americans.
- In 1996, General Motors recalled almost 300,000 cars because of a software problem that could cause engine fires.
- In 1998, a satellite that handles most of the pager traffic in the U.S. stopped working when both the primary and backup computers failed.
- Numerous projects, such as that undertaken by the California Department of Motor Vehicles in the late 1980s and early 1990s, have run up multi-million-dollar bills only to be abandoned.
Glass points out that "there is no single failure pattern, but instead a complex array of forces that come together." The rest of the book provides stories that demonstrate the importance of numerous factors, including: personality differences, bad business models, unrealistic expectations, culture clashes, ineffective marketing, technical mistakes, insufficient industry support, and the failure to understand the needs of potential users.
Implications:
Though the primary focus of Glass's book is the private sector, it offers many insights of potential relevance to the Kellogg Foundation and the various communities it serves:
- All failures should be treated as opportunities to learn. Investment in ICTs always involves risk. Rather than denying this fact, those involved with failed projects should take great pains to identify what worked, what did not, and what the implications should be for any future investment in similar efforts.
- Sunk costs and organizational inertia are dangerous. Many of the projects documented in this book were clearly heading in the wrong direction quite early on. "Almost any outsider could walk into such a situation, and in the words of one of my colleagues, see the 'moose on the table,' but those inside the organization would continue as though the moose didn't exist." All ICT projects should include mechanisms for regular reevaluation and for changing plans when necessary. Just as important, but even more difficult to accept psychologically and politically, is the recognition, before an ICT project even begins, that it may reach a point at which it will need to be cancelled.
- Avoid "scope creep." If the scope of a project is allowed to grow uncontrollably, then it will attempt to do so much that failure is inevitable. Being open and clear about expectations can prevent frustration down the road.
- Proposals should always be evaluated in terms of identifiable user need(s). Many of the business failures in the book could be attributed, at least partially, to system-centered rather than user-centered thinking. Just because ICT can be designed to do something does not mean that people will line up to use it, or that if they do use it, it will actually meet their needs.
- Do not get paralyzed by the possibility of failure. "If you're not failing, you're not serious about new ideas."