Confronted with a single “True or False” question, an individual has a very small error opportunity space of just three possible responses: true, false, or no answer. “No answer” is always wrong, so a betting man should pick one of the two real answers. But unfortunately, situations where the error opportunity space is this narrow are rare. In the real world, dealing with real problems, these spaces tend to be very large. Note that the size of the error opportunity space, or EOS, says nothing about the consequences of getting the problem right or wrong (or partially right or wrong).
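As a toy sketch of the idea (the function name and response labels are hypothetical, chosen only for illustration), the EOS of a true/false question can be enumerated directly:

```python
# A toy enumeration of the EOS for a single true/false question.
# Names here are hypothetical, used only for this sketch.

def error_opportunity_space(responses, correct):
    """Map each possible response to whether it is correct."""
    return {r: (r == correct) for r in responses}

eos = error_opportunity_space(["true", "false", "no answer"], correct="true")
print(len(eos))                            # 3: the size of the EOS
print(sum(not ok for ok in eos.values()))  # 2 of the 3 responses are errors
```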
When the stock market tanked on May 7th, the people involved had a very large EOS. A week out, experts and participants are trying to figure out what went wrong and how to limit similar incidents in the future; they are trying, in effect, to drastically reduce the error opportunity space. This is the job of product designers. By analyzing the goals of the users and the system’s constraining variables, we can come up with conceptual, interaction, and interface designs that address the problems exposed on May 7th. Coming up with a solution is not the same as actually implementing it, but it’s worth the effort to specify the parameters that would help reduce errors, whether mistakes or slips, and potentially prevent malicious behavior.
Outside of the lab (or an exam), the size of the EOS is determined by continuous interaction and feedback among the user (the problem solver), the problem, and the environment. Steps the user takes to solve the problem usually change the problem. Beyond the user’s actions, the environment itself can alter the nature of the problem, and it can place additional cognitive strain on the user: increase stress, cause anxiety, overload the senses, impede perceptual processing, create confusion. And since most real-world situations involve multiple individuals working together (either with or against each other), the collective EOS is the product of the individual ones combined.
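A minimal sketch of that last claim, reading “product of combinations” literally so that individual EOS sizes multiply (all names here are hypothetical):

```python
from math import prod

# A toy model: each person's possible responses combine with everyone
# else's, so the individual EOS sizes multiply.

def collective_eos(individual_sizes):
    """Size of the combined error opportunity space for a group."""
    return prod(individual_sizes)

# Three people, each facing a true/false question (EOS of 3 apiece):
print(collective_eos([3, 3, 3]))  # 27: even tiny individual spaces compound
```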
While this sounds overwhelming, it’s no more complex than designing a system that minimizes the EOS for air traffic controllers over the US. We manage that with strict rules, regulations, and goal alignment: all users of the system want to get planes on and off the ground quickly and safely. But if the system were set up to reward speed over safety, the goals of air traffic controllers and those of airplane passengers would fall out of alignment (passengers would still choose life over a few seconds’ worth of improvement in flight time).
One of the system-wide problems with the stock market today is goal alignment. Computer data screens can be redesigned to highlight human and system errors, but such speed bumps would not be welcomed by traders whose goal is to make money from slight fluctuations in market prices that happen on nanosecond scales (e.g., a stock trades a penny higher on the London Exchange than on the New York Stock Exchange, and that creates an opportunity; a trader who takes advantage of a million such opportunities per hour makes real money).
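The back-of-the-envelope arithmetic: the penny spread and the million trades per hour come from the example above; the variable names are mine.

```python
# Arbitrage arithmetic for the example above.
spread_per_share = 0.01   # one-penny gap between London and New York
trades_per_hour = 1_000_000

profit_per_hour = spread_per_share * trades_per_hour
print(f"${profit_per_hour:,.0f} per hour")  # $10,000 per hour, before costs
```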
Any change to a product designed to minimize errors and the EOS and to support users during problem solving is called scaffolding (a term borrowed from construction, meaning support provided where it’s needed). But scaffolding has to be a welcome addition, or the users will simply take it down.
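To make scaffolding concrete, here is a purely hypothetical sketch of one such speed bump on a trading screen: a check that asks for confirmation when an order’s price strays far from the last trade. The 10% threshold and all names are illustrative assumptions, not any real exchange’s rules.

```python
# Hypothetical scaffolding for a trading interface: flag orders priced
# far from the last trade before they go out. The threshold and names
# are illustrative, not taken from any real system.

def check_order(price, last_trade, tolerance=0.10):
    """Ask for confirmation when an order strays too far from the market."""
    deviation = abs(price - last_trade) / last_trade
    if deviation > tolerance:
        return f"Warning: price is {deviation:.0%} off the last trade. Confirm?"
    return "OK"

print(check_order(price=0.01, last_trade=40.00))   # catches a 'fat-finger' slip
print(check_order(price=39.95, last_trade=40.00))  # OK: within tolerance
```

Whether traders would tolerate even that much friction is, of course, exactly the goal-alignment problem described above.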