By Penny Carmichael, on 14 October 2014
Article by Stephen Leach
This week there was no CPS lecture. Instead, entertainment came in the form of David Cronenberg’s ‘The Fly’, a 1986 sci-fi horror masterpiece in which Cronenberg charts the epic destruction of a brilliant scientist, played by Jeff Goldblum, who becomes perilously entangled with the common fly. The film explores how the initial improvements that accompany a scientific breakthrough are followed by dangerous, uncontrollable consequences. Much like the fate of the martyred Miles Dyson, the computer scientist who created the circuitry that would go on to become the twisted brain of ‘The Terminator’. In the Terminator franchise, the robots become so advanced and clever that they want to destroy the foolish humans who created them (all time-travelling paradoxes aside).
Yes, these technologies are make-believe and accepted for dramatic effect. However, the underlying theme has some basis in reality. There is probably a quote from Jurassic Park that could sum up everything I’m getting at here, but that would be too simple.
The question is: do science researchers inherit more moral responsibility during their training than they sign on for? The film industry certainly says ‘yes’. The mass media says ‘yes’. Scientific findings are reported to the general public via news agencies with minimal reference to the assumptions made or the experimental context. Once in the news, those findings from a controlled environment go on to influence people directly!
History also agrees in one very extreme case, depicted in Jon Else’s 1980 documentary ‘The Day After Trinity’. Freely available online, it features a collection of interviews with people who worked on the Manhattan Project or knew J. Robert Oppenheimer. Fifty years on from Hiroshima, Hans Bethe, winner of the 1967 Nobel Prize in Physics and head of the Los Alamos Theoretical Division, remarked: “Individual scientists can aid nuclear disarmament by withholding their skills…” Even to discuss the weapon they created is to enter into a sticky moral debate.
As a chemistry undergraduate, the closest I get to a moral issue is the use of excessive amounts of water in the teaching labs, or accidentally misusing the waste recycling facilities around campus (thankfully the chemistry department is addressing the former, and I’m paying closer attention to the bins). Most people, researchers or otherwise, will get through their careers and lives without becoming embroiled in any serious moral dilemma. Is that due to a lack of them? Or due to a haziness around accountability, and how empowered or disempowered we feel when faced with one? You only have to enter a supermarket to be faced with an ethical decision and discover that our system allows the ‘ethically sound’ shopping basket to cost more than its battery-farmed equivalent. Stop giving me the choice to be morally lazy!
I’m confident that the majority of students embarking on a scientific education are striving to make the world a better place, and I trust the academics we’ve encountered here at UCL to help us do so. Scientific research does transform society for the better. The more scientists delve into non-quantitative moral issues, the more religious and political leaders cease to be the sole governors of morality.
Personally, I don’t even have the guts to tell someone to pick up their litter. ‘Doing my bit’ for society involves a highly expensive and lengthy education. Such are the complexities of procrastination.
A blog entry by the film-maker Adam Curtis, hosted by the BBC, discusses this theme far more coherently than I have (or Jurassic Park did), so take a look for some food for thought:
“Science and scientists do all kinds of wonderful things. But when they venture into the social and political world they tend to get bent the way the ideological wind is blowing.”
If scientists are not bent by the ‘ideological wind’, then are they not at risk of ignoring the society which they are meant to serve?