UCL Centre for Law and Environment


UCL Laws


Precaution in the Governance of Technology

By Daniela Bragato, on 3 July 2017

Centre for Law and the Environment’s Annual Lecture

Tuesday 31 October 2017, 18:00 – 19:00

UCL Roberts 106 LT, Roberts Building, Torrington Place, London WC1E 7J

Speaker: Professor Andrew Stirling (University of Sussex)
Chair: Professor Maria Lee (University College London)

About the lecture:

Worldwide policy debates over governance of technology are pervaded by apparent tensions. One of the most intense and protracted sites for controversy surrounds the role of ‘the precautionary principle’ in research, regulation and international standard setting. A common – often loudly propounded – position in influential quarters of business, government and academia, is that precaution is somehow ‘unscientific’ or even ‘anti-technology’ in its implications. Such interests strongly assert the sufficiency of ‘risk-based’ decision making, treating choices among alternative directions for innovation in particular fields as if they were effectively purely technical – independent of political values, economic interests or democratic process.

It is clear that (as in any politically salient field of scholarship or law) there exist many expedient misrepresentations or misapplications of precaution. In particular, precaution is best understood not as a notionally definitive decision rule, but as a principle that points towards specific qualities of process. In making this case, this talk will argue that the above kinds of high-profile rhetoric around precaution are not only mistaken, but undermining of both science and democracy in the governance of science and technology.

Crucial to understanding why this is so, is to appreciate that the full breadth and depth of incertitude in this field are far more profound and intractable than is routinely acknowledged in established forms of risk assessment. It is not necessarily ‘critical’ – but simply a matter of realism and rigour – to recognise that there exist many institutional pressures to suppress the typical scope and gravity of incertitude and to treat it as a reduced notion of risk. The cumulative effect of this is to generate a kind of ‘organised irresponsibility’, under which the consequences of neglected aspects of ‘uncertainty’, ‘ambiguity’ and ‘ignorance’ are effectively externalised onto the least privileged (often most vulnerable) social groups and their environments.

Seen in this light, the diverse implications of precaution are not simply about being more rigorous about different aspects of uncertainty. They are also about being more open in seeking to balance the routine effects of powerful interests within processes of technology governance. Precaution also entails a more realistic understanding of innovation as a branching evolutionary process. Here, discouragement of one particular powerfully-backed trajectory in any given field can be recognised not to be inherently ‘anti-technology’, but typically to have the effect of encouraging alternative, preferable innovation pathways.

It is on these grounds that carefully deliberated application of precaution, in some of its many variant forms, can help enable technology governance to be not only more rigorous about the realities of uncertainty and innovation, but also more respectful of the imperatives of social justice and democracy.

Click here to book your place.