More late lessons from early warnings
The intriguing disposition of The Economist’s recent cover was, admittedly, reversed by the headline of the briefing contained within: ‘Has the ideas machine broken down?’ (‘No’ – probably.) Nevertheless, technological optimism and neophilia have taken something of a battering recently, particularly from those inspecting emerging technologies for auguries of economic growth.
Prediction is, uncontroversially, a difficult business when it comes to the behaviour of complex systems like economies, human societies and natural environments. Bring these things together, though, and it’s hard to place faith in anyone who claims to see clearly more than a couple of steps ahead. Yet it is at this intersection that a number of critical decisions are made, with long-term implications that are often only partially reversible (if reversible at all). What makes these decisions seem possible at all is often a particular framing, which is to say, a repression of their complexity. And what makes them important is that real people are helped and harmed – and because these helps and harms occur, other helps and harms, which might have occurred, and might have accrued to different people, do not.

The European Environment Agency (EEA) has recently published a second volume of Late Lessons from Early Warnings, following up its original 2001 study of technologies whose introduction had been followed by unintended and unwelcome consequences. As the EEA news release says, the new report contains case studies ‘with far-reaching implications for policy, science and society’, including ‘industrial mercury poisoning; fertility problems caused by pesticides; hormone-disrupting chemicals in common plastics; and pharmaceuticals that are changing ecosystems. The report also considers the warning signs emerging from technologies currently in use, including mobile phones, genetically modified organisms and nanotechnology.’
The reasons that we have these lessons to learn can sometimes be found in the framing of decisions to act: ‘in some instances, companies put short-term profits ahead of public safety, either hiding or ignoring the evidence of risk. In others, scientists downplayed risks, sometimes under pressure from vested interests.’

The first volume of Late Lessons was cited in our recent report Emerging Biotechnologies: technology, choice and the public good. The theme of responding to intractable uncertainty – and not just narrowly conceived, quantifiable risk – in complex and sensitive innovation contexts was one of the three challenges of emerging biotechnologies that the report identified. Like Late Lessons, it emphasised the virtues of caution, of openness and inclusion, and of enablement, highlighting alternative social and technological choices and their implications. These virtues, among others, became key elements of the ethical approach we developed in our report.

While the EEA authors understandably foreground harms to health and the environment, the agglomeration of narratives implicitly reminds us that social, economic and geopolitical consequences also emanate from these technologies, just as social, economic and political interests combine to frame and condition the pursuit of particular technological options (and the neglect of others). Caution means attending not only to other possibilities, but also to how those possibilities are differently valued – their ambiguity, the second challenge associated with emerging technologies in our report.

Late Lessons also appears to advance the importance of public involvement. As the admirably clear 40-page summary has it (the report itself is 750 pages long): ‘Involving the public can also help in choosing between those innovation pathways to the future; on prioritising relevant public research; on providing data and information in support of monitoring and early warnings; improving risk assessments; on striking appropriate trade-offs between innovations and plausible health and environmental harms; and, making decisions about risk-risk trade-offs.’
This is a timely reminder for those of us who have been observing the current debate about the meaning of ‘open policy making’ and the use of expert advice – a reminder of the value of opening the door to admit voices not represented among established academic, industrial and political elites.
The Emerging Biotechnologies report proposed a ‘public discourse ethics’ for a liminal historical moment, haunted by the spectre of catastrophe and resonant with caution. ‘Public discourse ethics’ in this sense does not mean ethics done in public or by the public; it does not mean simply more ‘public engagement’, and it certainly does not mean more committees of expert advisors. It involves the promotion of public values as fundamentally and explicitly constitutive of the ground on which the engagements between knowledges and interests that determine technological trajectories and conditions of innovation take place – in other words, of the way those engagements are framed. Its conditions are the cultivation of procedural and institutional virtues such as openness and inclusion, enablement, public reasoning and caution. The Council is inviting further discussion on operationalising this public discourse ethics in relation to emerging biotechnology contexts, and will be exploring the theme through a number of workshops in the first half of this year.