Man-made Disasters in the Wake of a Tsunami

This month, the Fukushima Nuclear Accident Independent Investigation Commission issued its final report on the disaster: it was man-made! Here's a quote from the report:

What must be admitted — very painfully — is that this was a disaster "Made in Japan." Its fundamental causes are to be found in the ingrained conventions of Japanese culture: our reflexive obedience; our reluctance to question authority; our devotion to 'sticking with the program'; our groupism; and our insularity. Had other Japanese been in the shoes of those who bear responsibility for this accident, the result may well have been the same.

The last sentence is particularly insightful: the blame did not rest on the shoulders of a particular individual, as tempting as that might be, or even on the shoulders of some manager. The fault was placed on the cultural context in which the incident played out.

Museums in Paris

We just got back from seeing a Tim Burton exhibit at La Cinémathèque in Paris. The content of the exhibit, as one could imagine, is quite wonderful. But there were many, many human failures in making the visit an enjoyable experience. And yes,…
Misapplication of Problem Solving Strategies
Misapplication of Problem Solving Strategies results when individuals resort to "tried and true" strategies for solving problems, whether applicable or not.
Background Knowledge Errors, Cultural Bias, Group Decision Errors, Mental Model Traps, Misapplication of Problem Solving Strategies, Pipsqueak Articles, Product Design Strategy, Scaffolding
Knowledge, Context, & Expectation
by Olga Werby •
These are three necessary components of any product design:

Knowledge: the background information that forms the foundation of product design
Context: the ecosystem in which the product will be used
Expectation: the alignment of goals between product creators and the users for whom it was designed

A failure to fully understand any of the above variables results in errors that propagate throughout the product system. But what if the product is disaster preparedness? Consider the design of an evacuation plan ahead of a disaster (a toy audit of such a plan is sketched in code after this excerpt). You would need to understand what kinds of damage the disaster is capable of wreaking; the probabilities of each outcome; the people and the ecosystem in which the disaster will occur; and the expectations of all the participants in the evacuation plans.

Tsunami and The Zoo

A few years ago, I was teaching a fifth grade science class where we were discussing the possible damage from a tsunami in San Francisco (we had just visited the Bay Model). The problem I posed to the students was to design a reasonable evacuation plan for the San Francisco Zoo animals. The Zoo lies on the tsunami flood plain, and as far as we knew there was no plan for…
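The three variables lend themselves to a simple audit. Here's a minimal Python sketch of the idea (the class and field names are my own invention, not from the original article); it treats an evacuation plan as a "product" and flags whichever of the three components was left unexamined:

    from dataclasses import dataclass, field

    @dataclass
    class DesignReview:
        # Hypothetical checklist for the three components named above.
        knowledge: list[str] = field(default_factory=list)      # background facts the design rests on
        context: list[str] = field(default_factory=list)        # the ecosystem in which it will be used
        expectations: list[str] = field(default_factory=list)   # goals shared by creators and users

        def gaps(self) -> list[str]:
            # A failure in any one component propagates through the whole
            # product system, so an empty list is itself worth flagging.
            components = {"knowledge": self.knowledge,
                          "context": self.context,
                          "expectations": self.expectations}
            return [name for name, items in components.items() if not items]

    plan = DesignReview(
        knowledge=["damage types a tsunami can cause", "probabilities of each outcome"],
        context=["the people and ecosystem where the disaster will occur"],
        expectations=[],  # participants' goals were never surveyed
    )
    print(plan.gaps())  # ['expectations'] -- the unexamined variable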
Anchoring Errors, Attention Controls Errors, Background Knowledge Errors, Causal Net Problems, Cognitive Blindness, Diagnostic Errors, Mental Model Traps, Metaphor Mistakes, Misapplication of Problem Solving Strategies, Pipsqueak Articles, Working Memory
Information Awareness & Failure Analysis
by Olga Werby •
Given the current state of affairs in Japan's nuclear facilities, I thought it would be good to do a quick analysis of what's going wrong and why the officials on the ground act as they do (based on the very limited information that's trickling in via the news sources). As of today (morning of March 14th), we have two reactors that have experienced explosions, partial core meltdowns, and multiple other failures. I've put together data from the news with failure analysis for an alternative view of the ongoing nuclear crisis in Japan. As with many aspects of usability, the FAA (Federal Aviation Administration) was the first to develop a practical understanding of Information Awareness and Failure Analysis: pilots and airplane designers want to minimize errors in flight and to understand failure when it happens. Like the rest of the world, I'm extremely grateful for their insight into these two aspects of systems design and usability. Below is a quick introduction to the basics.

Information Awareness

Information Awareness is a wonderful term that describes the state of a user's knowledge of the problem at any particular time. This means that Information Awareness changes over time and from person to person (a toy model of this follows below). For designers of a complex system that aims…
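Since Information Awareness varies along two axes, person and time, it can be modeled as timestamped knowledge snapshots per person. Here's a toy Python model (all names are hypothetical, invented for illustration; this is not from the post or from any FAA material):

    from datetime import datetime

    class InformationAwareness:
        # Tracks what each person knows, and when they came to know it.
        def __init__(self):
            # person -> list of (timestamp, cumulative set of known facts)
            self._snapshots = {}

        def observe(self, person, facts, at):
            # Record that `person` learned `facts` at moment `at`.
            history = self._snapshots.setdefault(person, [])
            known = set(history[-1][1]) if history else set()
            history.append((at, frozenset(known | set(facts))))

        def aware_of(self, person, fact, at):
            # Was `person` aware of `fact` at time `at`? Different people
            # (an operator on the ground vs. a newspaper reader) can give
            # different answers for the very same moment.
            history = self._snapshots.get(person, [])
            snapshots = [snap for ts, snap in history if ts <= at]
            return bool(snapshots) and fact in snapshots[-1]

    ia = InformationAwareness()
    ia.observe("plant operator", {"coolant pumps failed"}, datetime(2011, 3, 11, 16, 0))
    ia.observe("news reader", {"explosion at reactor 1"}, datetime(2011, 3, 12, 18, 0))
    print(ia.aware_of("news reader", "coolant pumps failed", datetime(2011, 3, 12, 18, 0)))  # False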
Anchoring Errors, Causal Net Problems, Cognitive Blindness, Conceptual Design, Cultural Bias, Cultural Differences, Errors, Group Decision Errors, Mental Model Traps, Mirroring Errors, Misapplication of Problem Solving Strategies, Pipsqueak Articles, Product Design Strategy, Scaffolding
TSA: the Good, the Bad, and the Ugly
by Olga Werby •
There have been a lot of stories lately about the Transportation Security Administration (TSA), and most have been less than flattering (to say the least). How can an agency that was designed to "serve and protect" the citizens of the United States from harm evoke such wrath from ordinarily shy and non-vocal travelers? This blog is about product design, and so my analysis of the situation will treat this as a failure of product design. Where are the failures?

Mistake #1 TSA Conceptual Design: Blocking

There are bad guys out there who want to do us (citizen travelers from the US) harm. There are the box-cutter-carrying terrorists, the shoe-bombers, the liquid-explosives bandits, the underwear-bombers, the printer-cartridge explosives engineers. TSA installed airport security measures that would counteract each of these threats as they revealed themselves. The basic conceptual design strategy here is blocking: identify a threat and find an effective block (a toy sketch of this strategy appears after this excerpt). This is a strategy based on hindsight: had we known that people could sneak bombs in their underwear, we would have had a way to block it. We didn't know, but now we do, and so we created systems to block this threat in the future.

TSA Game Plan: Escalating…
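The blocking strategy reduces to a blacklist lookup, which makes its hindsight problem easy to see in code. A toy Python sketch (the threat names echo the post; the function and its behavior are my illustration, not an actual screening system):

    # Threats get added to the list only after they have revealed themselves.
    KNOWN_THREATS = {"box cutter", "shoe bomb", "liquid explosive",
                     "underwear bomb", "printer cartridge bomb"}

    def screen(carried_items):
        # Pure blocking: clear the passenger unless an item matches a known threat.
        return not (set(carried_items) & KNOWN_THREATS)

    print(screen({"underwear bomb"}))  # False -- blocked, because we learned about this one
    print(screen({"novel device"}))    # True  -- cleared; nothing on the list matches yet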
Anchoring Errors, Background Knowledge, Background Knowledge Errors, Causal Net Problems, Diagnostic Errors, Featured, Mental Model Traps, Metaphor Mistakes, Misapplication of Problem Solving Strategies, Pipsqueak Articles, Product Design Strategy, Scaffolding
What is a p-prim?
by Olga Werby •
I've been using p-prims ever since I learned of them, back in my graduate school days at UC Berkeley. P-prims stand for phenomenological primitives and were "invented" by Andrea diSessa, a UC Berkeley professor in the School of Education who also happens to be a physicist (diSessa, 1983). Visit his Wikipedia page and check out some of the cool projects he's working on now. Before I give a definition of a p-prim, I think it would be good to give a few examples.

Here's a graph published by OkTrends on the beliefs of various groups (in this case, defined by their sexual orientation) about the relative size of our sun versus the Earth (our planet). Even disregarding the differences in percentages due to sexual preference, an awesome 5% to 10% of our population believes that the planet we live on is larger than the star it orbits. Would this qualify as a p-prim? Yes: it's not a formally learned concept; it describes a phenomenon; it's a bit of knowledge based on personal observations (the sun looks like a small round disk in the sky); it's a useful problem-solving tool when one has to draw a picture with…
Autopilot Errors, Background Knowledge Errors, Conceptual Design, Diagnostic Errors, Errors, Interaction Design, Interface Design, Interruptus Errors, Misapplication of Problem Solving Strategies, Mode Errors, Perceptual Blindness, Perceptual Focus Errors, Pipsqueak Articles, Product Design Strategy, Scaffolding, Users, Working Memory
The History of Usability
by Olga Werby •
When did we start being concerned with usability? Some will say that such concern is part of being human: cavemen worked their stone tools to get them just right. Interaction design mattered even then. But the field of usability research really came into being when the tools we used started to run up against our cognitive and physical limitations. And to avoid hitting literal, as well as psychological, walls, it was the aviation engineers who started to think about usability seriously. While cars were becoming ever more sophisticated and trains ever faster, it was airplanes that caused most of the usability problems around WWI. Cars were big, but they didn't go very fast, and there weren't many roads to travel on at the turn of the century. In the first decade of the 20th century, there were only 8,000 cars total in the U.S., traveling on 10 miles of paved roads. In 1900, there were only 96 deaths caused by automobile accidents. Planes were more problematic. For one thing, the missing roads weren't a problem. And a plane falling out of the sky in an urban area caused far more damage than a car ever could. Planes…
Background Knowledge, Background Knowledge Errors, Mental Model Traps, Misapplication of Problem Solving Strategies, Pipsqueak Articles
Grabbity and Other Folksy Wisdom
by Olga Werby •
We spend our lives engaged in problem solving: When should I leave the house to get to work on time? What can I make for dinner given the stuff in my refrigerator? How much work do I need to get done today in order to leave a bit earlier tomorrow? What's the best driving route given the traffic report coming over the car radio? Can I make this green light? Can I talk my way out of a traffic ticket? What's the maximum amount I can pack into my trunk after a COSTCO run? How can I get that stain off the carpet? Is this blog good enough to post?

Looking over this sample list of problems, it's easy to see that some have to do with temporal and spatial processing (e.g., packing the trunk, picking the best route, judging speed, making schedules), some with background knowledge manipulation (e.g., coming up with a recipe given a list of ingredients, looking up cleaning strategies), some with social processing (e.g., the ability to analyze social situations and make correct predictions of possible outcomes: "I will get that ticket if I run that red light."), and some with metacognitive tasks (e.g., judging quality, comparing standards…