Revisiting Sheila Jasanoff’s Technologies of Humility
In 2003, Harvard University’s Sheila Jasanoff wrote about what she termed “Technologies of Humility.” Recognizing the growing disconnect between technological progress and its effective governance, Jasanoff explored new approaches to decision-making that “seek to integrate the ‘can-do’ orientation of science and engineering with the ‘should-do’ questions of ethical and political analysis.” Five years on, her (still radical) ideas resonate deeply with the science and technology ambitions of the incoming Obama administration.
Sitting down this morning, I had intended to write about three papers recently published online in the journal Nature Nanotechnology. The papers (by Kahan et al., Pidgeon et al. and Scheufele et al.)—which were widely reported on a few weeks back—consider factors influencing “public” responses to nanotechnology, and challenge long-held beliefs that knowledge leads to acceptance.
However, I became distracted! Searching for an original frame for these studies, I returned to Jasanoff’s 2003 paper “Technologies of Humility: Citizen Participation in Governing Science,” published in the journal Minerva (Minerva 41:223-244). Reading it, I was struck afresh by how germane Jasanoff’s ideas are, how completely they seem to have been ignored in US policy making, and how important they are to the science and technology agenda of the incoming Obama administration.
Rather than reading a rehash from me of what is an eloquently written and very accessible paper, I would strongly recommend you pour yourself a glass of good wine (a cup of coffee or fine tea will do just as well), carve out some quality time, and read the original—which is downloadable from here [PDF, 120 KB]. It is, after all, the holiday season, and what better than a good read to fill the long hours before the grind of work begins once again!
But just in case you are in a hurry and care to put up with my crude and flawed overview, here you are:
Jasanoff starts out:
“Long before the terrorist atrocities of 11 September 2001 in New York, Washington, DC, and Pennsylvania, the anthrax attacks through the US mail, and the US-led wars in Afghanistan and Iraq, signs were mounting that America’s ability to create and operate vast technological systems had outrun her capacity for prediction and control.”
Looking back over 20 years of “ ‘normal accidents’, which were strung like dark beads through the latter years of the twentieth century and beyond” Jasanoff notes that
“Scientiﬁc and technical advances bring unquestioned beneﬁts, but they also generate new uncertainties and failures, with the result that doubt continually undermines knowledge, and unforeseen consequences confound faith in progress.”
This opens up a discussion on risk, which Jasanoff argues, is not “a matter of simple probabilities, to be rationally calculated by experts and avoided in accordance with the cold arithmetic of cost-benefit analysis,” but rather is part of the human condition, and “woven into the very fabric of progress.”
“Critically important questions of risk management cannot be addressed by technical experts with conventional tools of prediction. Such questions determine not only whether we will get sick or die, and under what conditions, but also who will be affected and how we should live with uncertainty and ignorance. Is it sufﬁcient, for instance, to assess technology’s consequences, or must we also seek to evaluate its aims? How should we act when the values of scientiﬁc inquiry appear to conﬂict with other fundamental social values? Has our ability to innovate in some areas run unacceptably ahead of our powers of control? Will some of our most revolutionary technologies increase inequality, promote violence, threaten cultures, or harm the environment? And are our institutions, whether national or supranational, up to the task of governing our dizzying technological capabilities?”
According to Jasanoff, effective technology management needs to go far beyond the “speaking truth to power” paradigm that still seems to link knowledge to power. And in particular, greater accountability in the production and use of scientific knowledge is essential.
“Accountability in one or another form is increasingly seen as an independent criterion for evaluating scientiﬁc research and its technological applications, supplementing more traditional concerns with safety, efﬁcacy, and economic efﬁciency.”
But how can new approaches to establishing and ensuring accountability be developed within the constraints of existing ways of doing business? Jasanoff argued back in 2003 that the time was ripe for seriously re-evaluating existing models and approaches. And at the close of 2008, her recommendations are all the more pertinent for a lack of enlightened progress in the intervening years.
From this starting point, Jasanoff develops the idea of “technologies of humility”—“social technologies” developed around a framework that poses “the questions we should ask of almost every human enterprise that intends to alter society: what is the purpose; who will be hurt; who beneﬁts; and how can we know?” These are presented as a counter-balance to what she refers to as the modern reliance on “technologies of hubris”—a command and control approach to science and technology that seeks to clear the way for science-driven innovation. Instead, Jasanoff reasons that
“there is a need for ‘technologies of humility’ to complement the predictive approaches: to make apparent the possibility of unforeseen consequences; to make explicit the normative that lurks within the technical; and to acknowledge from the start the need for plural viewpoints and collective learning.”
In developing her ideas, Jasanoff highlights problems that continue to plague the sustainable development of emerging technologies—especially when it comes to addressing and managing potential risks. In discussing the limitations of conventional peer review in the context of oversight and risk management, she notes that a spate of highly publicized cases of alleged fraud in science in the 1980s showed that
“regulatory science, produced to support governmental efforts to guard against risk, was fundamentally different from research driven by scientists’ collective curiosity.”
This is a lesson that the US government still seems to be struggling with—at least when it comes to nanotechnology—if the recent report from the National Academies is anything to go by.
The issue of peer-review opens up the question of how science should be evaluated within different contexts. Jasanoff remarks that, as new approaches to knowledge production are developed, so new ways of assessing quality are needed.
“Besides old questions about the intellectual merits of their work, scientists are being asked to answer questions about marketability, and the capacity of science to promote harmony and welfare.”
This is challenging the old way of doing things, and raising the need for new ways of ensuring socially responsive and responsible science and technology. As Jasanoff points out, “science that draws strength from its socially detached position is too frail to meet the pressures put upon it by modern society.”
The overarching message here—and Jasanoff delves deeper into the problems and potential solutions than these notes reflect—is that new approaches are needed to partnering with society in the science and technology enterprise. And she reflects that
“while national governments are scrambling to create new participatory forms, there are signs that such changes may reach neither far enough nor deeply enough to satisfy the citizens of a globalizing world.”
Sobering words that are, if anything, more relevant now than they were five years ago.
But what is the solution? Jasanoff develops four focal points for socially relevant and responsible science and technology—framing, vulnerability, distribution and learning. These are packed terms, and you really need to read the paper to understand better what she is proposing. But here are some pointers:
Framing: The quality of solutions to social problems depends on the way they are framed. Get the framing wrong, and the solutions suffer. Jasanoff argues that frame analysis—how you define and approach a problem—is a critically important yet neglected tool for policy-making, which would benefit from greater public input.
Vulnerability: Population-based approaches to risk assessment and management typically overlook the condition and perspectives of individuals, and in doing so underplay the importance of various socio-economic factors. Jasanoff notes that through participation in the analysis of their own vulnerability, ordinary citizens may regain their status as active subjects, rather than remain objects in yet another expert discourse.
Distribution: Issues here stem from “end-of-pipe” approaches to legitimizing science and technology advances, and disconnects between groups that benefit from advances and those that pay for them. Jasanoff suggests that sustained interactions between decision-makers, experts and citizens, starting at the upstream end of research and development, could yield significant dividends in exposing the distributive implications of innovation.
Learning: There’s a tendency within the science and technology community to think that increased learning reduces divergence in opinions—as if there is one true “answer,” and more learning is the means to discovering it (see Kahan et al. in particular on this). But as Jasanoff points out, experience is subject to many interpretations—as much in policy-making as in literary or historical analysis. In other words, while the science might be clear, the decisions it leads to rarely are. Jasanoff recommends that new avenues be designed through which societies can collectively reflect on the ambiguity of their experiences, and assess the strengths and weaknesses of alternative explanations.
Looking through Jasanoff’s recommendations, her emphasis on citizen participation in governing science and technology comes to the fore. It is clear—from her perspective—that old-style command and control models of science and technology innovation no longer work, and that change is needed.
Sadly, in the US at least, we seem no closer to making progress than we were five years ago. The recent National Academies report on the US government’s nanotechnology risk-research strategy indicated that, despite huge efforts to get things right within the federal government, outmoded paradigms and bureaucratic constraints undermined the whole process. And movement on citizen participation in governing nanotechnology is near non-existent—despite clear calls for progress in the 21st Century Nanotechnology Research and Development Act of 2003.
And nanotechnology provides just one example—emerging technologies like synthetic biology, and the convergence between nanotech, biotech and information tech, are poised to stress the system to a far greater extent than nanotechnology alone has so far done. How then will our “technologies of hubris” cope?
The solution is to rethink the interface—or contract, if you like—between science and society. When better to start this process of rethinking than with a fresh new science and technology-focused administration? And where better to start than with Jasanoff’s technologies of humility?
And those three papers that started this rather side-tracked discussion? I must beg Dan, Dietram and Nick’s forgiveness because, excellent and relevant as their papers are, I have run out of space!
Instead, I would direct you to Richard Jones’ excellent Nature editorial on the three papers, together with his blog at Soft Machines. Or if you prefer a raunchier style of commentary, check out Tim Harper’s thoughts at TNTlog.
And as you read both the papers and the commentaries, think about what might need to change for these insights to lead to more socially integrated science and technology development.
The three Nature Nanotechnology papers I woefully neglected to comment on are:
Pidgeon, N., Harthorn, B. H., Bryant, K. and Rogers-Hayden, T. (2008). Deliberating the risks of nanotechnologies for energy and health applications in the United States and United Kingdom. Nature Nanotechnology DOI: 10.1038/NNANO.2008.362.
Scheufele, D. A., Corley, E. A., Shih, T.-J., Dalrymple, K. E. and Ho, S. S. (2008). Religious beliefs and public attitudes toward nanotechnology in Europe and the United States. Nature Nanotechnology DOI: 10.1038/NNANO.2008.361.
Kahan, D. M., Braman, D., Slovic, P., Gastil, J. and Cohen, G. (2008). Cultural cognition of the risks and beneﬁts of nanotechnology. Nature Nanotechnology DOI: 10.1038/NNANO.2008.341.
Sheila Jasanoff’s 2003 paper is:
Jasanoff, S. (2003). Technologies of humility: Citizen participation in governing science. Minerva 41:223-244. DOI: 10.1023/A:1025557512320