Given the current state of affairs in Japan’s nuclear facilities, I thought it would be good to do a quick analysis of what’s going wrong and why the officials on the ground act as they do (based on the very limited information that’s trickling in via the news sources). As of today (morning of March 14th), we have two reactors that have experienced explosions, partial core meltdowns, and multiple other failures. I’ve put together data from the news with failure analysis for an alternative view of the ongoing nuclear crisis in Japan. As with many aspects of usability, the FAA (Federal Aviation Administration) was the first to develop a practical understanding of Information Awareness and Failure Analysis—pilots and airplane designers want to minimize errors in flight and to understand failures when they happen. Like the rest of the world, I’m extremely grateful for their insight into these two aspects of systems design and usability. Below is a quick introduction to the basics.
Information Awareness
Information Awareness is a wonderful term that describes the state of a user’s knowledge of the problem at any particular time. This means that Information Awareness changes over time and from person to person. For designers of a complex system that aims…
Causal Net Problems
Causal Net Errors arise from mixing up the result with the cause.
Anchoring Errors, Background Knowledge Errors, Causal Net Problems, Diagnostic Errors, Errors, Mental Model Traps, Pipsqueak Articles
Alternative Medicine, Placebos, and Rasputin
by Olga Werby
In the last few weeks there have been several articles and studies published on the effectiveness of alternative medicines and placebos: “Placebos Help Patients Even Without Faking It, Scientists Say”; “Sugar Pills Help, Even When Patients are Aware of Them”; “Alternative remedies ‘dangerous’ for kids says report”; and “Doctors warn over homeopathic ‘vaccines’”. The gist of these beliefs derives from several factors. People Tend to Get Better: most of us get well over time even without medical intervention. Colds pass; flus do too. Most infections heal with time without the aid of antibiotics. Evolution has provided the human race with a great immune system. Medicine helps; we get better faster with treatment. But in most cases, we survive either way. So when you hear someone recommend an alternative medicine and predict that a cold will go away in three days, chances are you will get better. And over time, we the people develop p-prims (folksy wisdoms) that link health with alternative treatments. “Natural Chemicals” p-Prims: there is a strong belief among industrialized societies, at the present time, that “natural” is better for us than “artificial”. There are many sources of this belief, too many to cover in this short article. And somehow,…
Anchoring Errors, Causal Net Problems, Cognitive Blindness, Conceptual Design, Cultural Bias, Cultural Differences, Errors, Group Decision Errors, Mental Model Traps, Mirroring Errors, Misapplication of Problem Solving Strategies, Pipsqueak Articles, Product Design Strategy, Scaffolding
TSA: the Good, the Bad, and the Ugly
by Olga Werby
There have been a lot of stories lately about the Transportation Security Administration (TSA), and most have been less than flattering (to say the least). How can an agency that was designed to “serve and protect” the citizens of the United States from harm evoke such wrath from ordinarily shy and non-vocal travelers? This blog is about product design, and so my analysis of the situation will treat this as a failure of product design. Where are the failures?
Mistake #1. TSA Conceptual Design: Blocking
There are bad guys out there who want to do us—citizen travelers from the US—harm. There are the box-cutter-carrying terrorists, the shoe-bombers, the liquid-explosives bandits, the underwear-bombers, the printer-cartridge explosives engineers. TSA installed airport security measures that would counteract each of these threats as they revealed themselves. The basic conceptual design strategy here is blocking: identify a threat and find an effective block. This is a strategy based on hindsight: if we had known that people could sneak bombs in their underwear, we would have had a way to block it. We didn’t know, but now we do, and so we created systems to block this threat in the future.
TSA Game Plan: Escalating…
Background Knowledge, Background Knowledge Errors, Causal Net Problems, Diagnostic Errors, Group Decision Errors, Mental Model Traps, Pipsqueak Articles, Product Design Strategy
e-Waste & Product Design
by Olga Werby
I just came across a very interesting video by Annie Leonard. She’s been making little, approachable documentaries that explain difficult-to-understand issues—e-waste being one of those. Here are her latest:
The Story of Stuff: the story of how stuff gets designed, made, distributed, and then trashed.
The Story of Bottled Water: the story of drinking water and the marketing of bottled water.
Anchoring Errors, Background Knowledge, Background Knowledge Errors, Causal Net Problems, Diagnostic Errors, Featured, Mental Model Traps, Metaphor Mistakes, Misapplication of Problem Solving Strategies, Pipsqueak Articles, Product Design Strategy, Scaffolding
What is a p-prim?
by Olga Werby
I’ve been using p-prims ever since I learned of them, back in my graduate school days at UC Berkeley. P-prims stand for phenomenological primitives and were “invented” by Andrea diSessa, a UC Berkeley professor in the School of Education who also happens to be a physicist (diSessa, 1983). Visit his Wikipedia page and check out some of the cool projects he’s working on now. Before I give a definition of a p-prim, I think it would be good to give a few examples. Here’s a graph published by OkTrends on the beliefs of various groups (in this case, as defined by their sexual orientation) about the relative size of our sun versus the Earth (our planet). Even disregarding the differences in percentages due to sexual preference, an awesome 5% to 10% of our population believes that the planet we live on is larger than the star it orbits. Would this qualify as a p-prim? Yes: it’s not a formally learned concept; it describes a phenomenon; it’s a bit of knowledge based on personal observations (the sun looks like a small round disk in the sky); and it’s a useful problem-solving tool when one has to draw a picture with…
Background Knowledge Errors, Causal Net Problems, Conceptual Design, Diagnostic Errors, Featured, Interaction Design, Interface Design, Pipsqueak Articles
The Anatomy of Failure
by Olga Werby
On May 6th, 2010, at around 2:30 p.m. Eastern Time, the stock market went on a wild ride, dropping over 900 points in a matter of minutes. What happened? There’s lots of speculation, and some know more than they are willing to say. But what’s clear is that there was just the right confluence of world events, human and computer errors, and system-wide communication breakdown to trigger a mass sell-off of stocks at fantastic prices. In other words, there was a catastrophic failure during product interaction. I’m not an investment analyst and have limited knowledge in this subject area, but I am interested in product failure. So Thursday’s stock market episode was very interesting. Here’s a little background on the events of that day, broken down into the steps leading to the failure.
Step 1: When the New York stock market opened on Thursday, bad news was streaming in from Europe—there were fears that Greece would ultimately default on its loans; its people were staging massive demonstrations in Athens; the Euro was falling.
Step 2: In our very interconnected world, this kind of news makes investors skittish, and the stock market was losing value all morning.
Step 3: At around 2:45…