By: Jennifer Bails
Edmund Clarke surveys his new suburban home in search of a quiet corner to plug in his computer terminal. With three sons underfoot--his youngest is just a few months old--finding a place to think and work without interruption isn't easy. He decides to set up a makeshift office in the basement, where, between diaper changes and playtime, he escapes to pick up the research he left off at Harvard just weeks earlier. Clarke was recruited from his Ivy League post to join Carnegie Mellon's Computer Science Department, and the university even allowed him to forgo teaching for a semester so that he could settle his family in Pittsburgh and concentrate on his research.
Clarke's goal for these free months is to put into practice a new theory that could have a far-reaching impact on areas as diverse as healthcare, manufacturing, national security, and entertainment. Specifically, he is trying to develop an algorithm that will detect errors in computer circuits and programs that could lead to incorrect results.
It's early in the fall of 1982, and the computer revolution is well under way. For a quarter-century, huge and costly "mainframe" computers owned by the government and corporations have predicted the weather, processed tax returns, guided intercontinental missiles, and made space exploration possible.
But thanks to the transistor and the silicon chip, computers have been reduced so dramatically in size and price that they are bleeping and blipping their way into American homes, schools, and offices. IBM has introduced its first personal computer, running the MS-DOS operating system, and Epson has released the first laptop. Time magazine will even name the computer its "Machine of the Year"--the first non-human ever chosen for the distinction--for its promise to transform the way people live and work.
For all their potential, though, computers are only as good as the imperfect designers who build them. Engineers have long searched for design errors in hardware and software by running simulations to check performance or by feeding systems test data to see whether they behave as intended. Programmers also spend day after day arduously poring over lines of computer code to hunt for bugs. Such hit-or-miss techniques, known as informal verification, worked reasonably well at the advent of computing.
Yet as microchip technology advanced by the 1970s to include tens of thousands of transistors, and as software grew in complexity, it was no longer possible to simulate or test every possible behavior of a system or to spot every bug by hand. Errors began to delay new products on their way to market, break down critical systems already in use, and force expensive replacement of faulty hardware and patching of defective software.
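The arithmetic behind that impasse is unforgiving: every additional one-bit memory element doubles the number of configurations a system can be in, so the total grows exponentially with the size of the design. A back-of-the-envelope illustration in Python (the bit counts here are hypothetical, chosen for illustration rather than taken from any particular chip):

```python
# Each of n one-bit memory elements can hold 0 or 1 independently,
# so a circuit with n of them has 2**n possible configurations that
# exhaustive simulation would, in principle, have to cover.
for n in (10, 50, 100):
    print(f"{n:>3} bits -> {2**n:.2e} states")
#  10 bits -> 1.02e+03 states
#  50 bits -> 1.13e+15 states
# 100 bits -> 1.27e+30 states -- far beyond any conceivable test suite
```

At tens of thousands of transistors, no amount of simulation time could visit more than a vanishing fraction of those states.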
While at Harvard, Clarke and his first graduate student, E. Allen Emerson, conceived of a new way to find and correct these quality problems. It is that approach--called Model Checking--that he brought to the basement of his home in the South Hills of Pittsburgh and then onto the Carnegie Mellon campus in his ongoing effort to make computers more reliable.
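The essence of the idea can be sketched in a few lines of modern code. In its simplest explicit-state form, a model checker exhaustively explores every state that a finite model of a system can reach and confirms that a desired property holds in each one; when the property fails, it reports the exact sequence of steps leading to the failure. The Python sketch below is purely illustrative--the toy counter and the property it checks are invented for this example rather than drawn from Clarke and Emerson's papers:

```python
from collections import deque

def model_check(initial_states, transitions, invariant):
    """Explicit-state check of a safety property: breadth-first
    search over every reachable state, verifying that `invariant`
    holds in each. Returns a counterexample trace on failure,
    or None if the property holds everywhere."""
    frontier = deque((s, [s]) for s in initial_states)
    visited = set(initial_states)
    while frontier:
        state, trace = frontier.popleft()
        if not invariant(state):
            return trace  # shortest path from an initial state to the bug
        for nxt in transitions(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Toy model: a counter that wraps around modulo 4 and, by design,
# is never supposed to reach the value 3.
print(model_check(
    initial_states={0},
    transitions=lambda s: {(s + 1) % 4},
    invariant=lambda s: s != 3,
))  # -> [0, 1, 2, 3]: the offending run, found automatically
```

The counterexample trace is what makes the method so useful in practice: rather than merely flagging that something is wrong, the checker hands the designer a step-by-step recipe for reproducing the bug.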
The importance of his work can't be overstated.
In 1994, a minor arithmetic bug in the Pentium microprocessor forced Intel Corp. to recall thousands of silicon chips, costing the company an estimated $500 million. The Pentium bug is history's most infamous computer glitch, but it is certainly not the only one. Software bugs are so prevalent and damaging that they cost the U.S. economy an estimated $59.5 billion a year--about 0.6 percent of the gross domestic product--according to a 2002 study by the National Institute of Standards and Technology (the most recent data available). The report also found that more than a third of those costs, or nearly $22.2 billion, could be eliminated by improved testing methods that identify and correct bugs sooner and more consistently.
Hidden errors also become a matter of life and death as people increasingly rely on computers to control critical functions in their cars, airplanes, medical devices, security systems, and power plants.
A computer crash on the USS Yorktown several years ago shut down all operations aboard the Navy cruiser for almost three hours. Last year, a software failure crashed the computers on the new U.S. stealth fighter, the F-22 Raptor, causing pilots to lose all navigation and communications when they crossed the International Date Line.
For Clarke, there was no way he could have known while growing up what would become his life's calling. Born in 1945--the same year the modern computer was invented--he still carries traces of the Southern accent and graciousness that come from his upbringing in the rural town of Smithfield, Va. The son of a nurse and a salesman, he became the first in his working-class family to graduate from college. "My background is very humble," he says. (Continued …)