Tag Archive for Citizen Science

One of the areas of discussion at the NIH Citizen Science Engagement Think Tank meeting last month was how to categorize the roles (and thus the rules of engagement) for citizen scientists. There was continuous pressure to call individuals who “donate” their medical data to scientific research “patients.” Let me start by saying that I find that unacceptable — aside from the fact that every human being on Earth has been or will be a patient at some point in their lives, the label patient implies a lower level on the hierarchy than doctor or scientist. The whole point of the citizen science initiative is to break down the barriers to entry — we are ALL scientists! Being a scientist is not measured by the number of years in school or the diplomas on the wall. It is the willingness to do science that is key. Thus we can all be scientists. With that said, what follows is a discussion of group dynamics — how people work in groups and how we can support productive scientific endeavors through good design and social engineering.

Think Different Collective
Groups of people are not made up of homogeneous people — we are all idiosyncratically…
Conceptual Design, Cultural Differences, Ethnographic & User Data, Pipsqueak Articles, Product Design Strategy, Scaffolding, Users
Intended and Unintended Consequences of Social Design
by Olga Werby
Nudging is a form of social engineering — a way of designing system constraints and support structures to encourage the majority of people to behave in accordance with your plan. Here’s a famous-in-my-classroom example of nudging:

Opt-in versus Opt-out Consent Solutions
There are many examples of such social engineering. During our breakout groups at the NIH think tank on the future of citizen participation in biomedical research, I raised the difference in outcomes between opt-in and opt-out options for organ donation. In some countries in Europe, citizens have to opt out of donating their organs in the case of a tragic accident — they have to do something to NOT donate their organs. As a result, in Austria — which has an opt-out system — the consent rate is 99.98%! Meanwhile, in Germany — which has an opt-in system — only 12% will their organs for transplant. This is a huge difference in consent between very similar populations of people.

Unintended Consequences of Social Design
Not all social engineering efforts go as well as opt-in/opt-out organ donation systems. To reduce pollution for the 2008 Summer Olympic Games in Beijing, the Chinese government established the even/odd license plate law: cars with even license…
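The power of the default can be made concrete in product-design terms. Below is a minimal sketch, assuming a hypothetical donor-registration form with a pre-selected consent field (the names and structure are illustrative, not taken from any real registry system): the two policies present the same choice to the same population; only the pre-selected state differs, and because most people keep whatever default they are given, the default largely decides the outcome.

```typescript
// Illustrative sketch of opt-in vs. opt-out defaults as a nudge.
// "ConsentPolicy", "buildForm", and the field names are hypothetical.

type ConsentPolicy = "opt-in" | "opt-out";

interface RegistrationForm {
  policy: ConsentPolicy;
  donorConsentDefault: boolean; // the pre-selected state the citizen sees
}

// Under an opt-in policy the consent box starts unchecked;
// under an opt-out policy it starts checked. Nothing else changes.
function buildForm(policy: ConsentPolicy): RegistrationForm {
  return {
    policy,
    donorConsentDefault: policy === "opt-out",
  };
}

console.log(buildForm("opt-in"));  // { policy: 'opt-in', donorConsentDefault: false }
console.log(buildForm("opt-out")); // { policy: 'opt-out', donorConsentDefault: true }
```

This is the sense in which the designer of the form, not the individual citizen, does much of the “choosing.”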
Conceptual Design, Pipsqueak Articles, Product Design Strategy
Long-term Strategy versus Fast Success
by Olga Werby
The NIH think tank on the future of citizen participation in biomedical research came to a close on Friday night, and I had many hours in the airport and on the plane to think about all that was discussed. In the next few days, rather than writing one longish piece on my impressions of the meeting, I hope to get to each of the items that I feel I didn’t get a chance to fully explore while in Washington D.C. in a series of small posts.

Low Hanging Fruit
There is a strong temptation in any project to achieve success early (and often). The expression Low Hanging Fruit refers to relatively easy-to-accomplish tasks. But in the desire to get things done, it is easy to lose track of the overarching strategy — the main purpose of the enterprise. By chasing the Low Hanging Fruit, it is easy to get distracted and end up on the wrong path.

Two Different Roads
We’ve discussed two visions for the future: more of the same and a radical cultural shift. We visualized the first path as “turning the knob to 11” (aka Spinal Tap). More of the same (but with higher intensity) has many tempting…
Conceptual Design, Featured, Pipsqueak Articles, Product Design Strategy
2013 Think Tank Presentation on Socio-Technical System Design
by Olga Werby
I’m about to leave for Washington D.C. for a Think Tank on Citizen Engagement in Biomedical Research. I have only five minutes to talk during the introductory speed-geeking event, where all of us get to know each other and each other’s projects. I’m going there to talk about the lessons we’ve learned from designing complex socio-technical systems that required intense participation from their users. I’ve been working on designing such systems for many years now. Some projects were/are very successful, some not so much. I’m not sure I will be able to give a full account of what we’ve learned, so I’m putting up a long(ish) version of my presentation here — if I had 15 minutes, this is what I would say to our very interesting group of participants. I chose these four complex socio-technical systems because all of them were in some measure educational ventures and all required outside users to contribute large amounts of data. I will start with Ushahidi. Ushahidi was born during the 2007 Kenyan election. That election was bloody, and the violence, in many cases perpetrated by the government, was not being reported. Ushahidi was a grass-roots effort to tell their countrymen and…