
Open@UCL Blog



Ethics of Open Science: Managing dangers to the public

By Kirsty, on 17 December 2024

Guest post by Ilan Kelman, Professor of Disasters and Health, building on his captivating presentation in Session 2 of the UCL Open Science Conference 2024.

Open Science brings risks and opportunities when considering dangers to the public from conducting and publishing science. Opportunities come from detailing societal problems and responses to them, which could galvanise positive action to make society safer. Examples include the effectiveness of anti-bullying techniques, the health impacts of various paints, and companies selling cars they knew were dangerous.

Risks emerge if pursuing or publicising research might change or create dangers to the public. Highlighting how pickpockets or street scams operate helps the public protect themselves, yet could lead the perpetrators to change their operations, making them harder to detect. Emphasising casualties from cycling could lead to more driving, increasing the health consequences from air pollution and vehicle crashes.

The latter might be avoided by comparing cycling’s health benefits and risks, including with respect to air pollution and crashes. Meanwhile, understanding pickpocketing awareness and prevention should contribute to reducing this crime over the long term, if people learn from the science and take action.

In other words, context and presentation matter for risks and opportunities from Open Science regarding dangers to the public. Sometimes, though, the context is that science can be applied nefariously.

Explosives research

Airplane security is a major concern for travellers, with most governments implementing stringent measures at airports and in the air. Legitimate research questions for public safety relate to smuggling firearms through airport security and the bomb resistance of different aircraft.

Fiction frequently speculates, including in movies. A Fish Called Wanda showed a loaded gun getting past airport security screening, while Non-Stop portrayed a bomb placed aboard a commercial flight.

Desk analyses could and should discuss these scenes’ dramatic licence and level of realism, just as the movies are analysed in other ways. Scientists could and should work with governments, security organisations, airport authorities, and airline companies to understand threats to aviation and how to counter them.

Open Science could compromise the gains from this collaboration. It could reveal the bomb type required to breach an aircraft’s fuselage or the key ways to get a weapon on board. The satirical news service The Onion lampooned the presumption of publicising how to get past airport security.


Figure 1: We should research a cargo hold’s explosion resistance, but why publicise the results? (photo by Ilan Kelman).

Endangering activists

The public can endanger themselves by seeking Open Science. I ran a research project examining corporate social responsibility for Arctic petroleum, with examples in Norway and Russia. At one Russian site, locals showed our researcher decaying oil and gas infrastructure, including leaks. These local activists were assured of confidentiality and anonymity, which is a moral imperative as well as a legal requirement.

Not all of them supported this lack of identification. They preferred entirely Open Science, hoping that researchers outside Russia would have the credibility and influence to generate action for making their community and environment safer and healthier. They were well aware of the possible consequences of being identified (or of publicising enough information to make them identifiable). They were willing to take these risks, hoping for gain.


Figure 2: Trinity Tower, the Kremlin, Moscow, Russia during petroleum research (photo by Ilan Kelman).

We were not permitted to accede to their requests. We certainly published on and publicised our work, using as much Open Science as we could without violating our research ethics approval, as both an ethical and a legal duty. We remain inspired and concerned that the activists, seeking to save their own lives, might pursue citizen science which, if entirely open as some of them would prefer, could place them in danger.

Caution, care, and balance

Open Science sometimes brings potential dangers to the public. Awareness of and caution about these problems make it possible to prevent them. Then, a balance can be achieved between needing Open Science and not worsening or creating dangers.

Ethics of Open Science: Privacy risks and opportunities

By Kirsty, on 22 November 2024

Guest post by Ilan Kelman, Professor of Disasters and Health, building on his captivating presentation in Session 2 of the UCL Open Science Conference 2024.

Open Science brings risks and opportunities regarding privacy. Making methods, data, analyses, disagreements, and conclusions entirely publicly available demonstrates the scientific process, including its messiness and uncertainties. Showing how much we do not know and how we aim to fill in gaps excites and encourages people about science and scientific careers. It also holds scientists accountable, since any mistakes can be identified and corrected, which is always an essential part of science.

Given these advantages, Open Science offers so much to researchers and to those outside research. It helps to make science accessible to anyone, notably for application, while supporting exchange with those inspired by the work.

People’s right to privacy, as an ethical and legal mandate, must still be maintained. If a situation might be worsened by Open Science not respecting privacy, even when doing so is legal, then care is required to respect those who want or might deserve privacy. Anonymity and confidentiality are part of research ethics precisely to achieve this balance. Even so, Open Science might inadvertently reveal information sources, or it might be feasible to identify research participants who would prefer not to be exposed. Being aware of possible pitfalls assists in preventing them.

Disaster decisions

Some research could be seen as violating privacy. Disaster researchers seek to understand who dies in disasters, how, and why, in order to improve safety for everyone and to save lives. The work can examine death certificates and pictures of dead bodies. Publicising all this material could violate the privacy and dignity of those who perished and could augment the grief of those left behind.

Sometimes, research homes in on problematic actions in order to improve without blaming, whereas society more widely might seek to judge. A handful of studies have examined the blood alcohol level of drivers who died while driving through floodwater, which should never be attempted even when sober (Figure 1). In many cases, the driver was above the legal blood alcohol limit. Rather than embarrassing the deceased by naming and shaming, it would help everyone to use the data as an impetus to tackle simultaneously the separate and unacceptable decisions to drive drunk, to drive drugged, and to drive through floodwater.

Yet storytelling can be a powerful communication technique to encourage positive behavioural change. If identifying details are used, then the individuals or their kin must give full and informed consent. Even with this consent, it might not be necessary to provide the full details, as a more generic narrative can remain emotional and effective. Opportunities for improving disaster decisions emerge from consensual sharing that avoids violating privacy, while remaining careful about whether the specifics of any particular story really need to be published.

Figure 1: Researching the dangerous behaviour of people driving through floodwater, with the number plate blurred to protect privacy (photo by Ilan Kelman).

Small sample populations

Maintaining confidentiality and anonymity for interviewees can be a struggle where interviewees have comparatively rare experiences or positions and so are easily identifiable. Governments in jurisdictions with smaller populations might employ only a handful of people in the entire country who know about a certain topic. Stating that an interviewee is “A national government worker in Eswatini specialising in international environmental treaties” or “A megacity mayor” could narrow it down to a few people or to one person.

A similar situation arises with groups comprising a small number of people from whom to select interviewees, such as “vehicle business owners in Kiruna, Sweden”, “International NGO CEOs”, or specific elites. Even with thousands of possible interviewees, for instance “university chiefs” or “Olympic athletes”, quotations from the interview or locational details might make it easy to narrow down and single out a specific interviewee.

Interviewee identification can become even simpler when basic data on interviewees, such as sex and age range, are provided, as is standard in research papers. Providing interview data in a public repository is sometimes expected, with the possibility of full transcripts, so that others can examine and use those data. The way someone expresses themselves might make them straightforward to pinpoint within a small group of potential interviewees.

Again, risks and opportunities regarding privacy centre on consent and on the necessity of listing details. Everyone, including any public figure, has some right to privacy (Figure 2). Where consent is not given to waive confidentiality or anonymity, the research process, including reviewing and publishing academic papers, needs to accept that not all interviewee details or data can or should be shared. With consent, care is still required to ensure that identifying individuals or permitting them to be discovered really adds to the positive impacts of the research.

Figure 2: Ralph Nader, an American politician and activist, still has a right to privacy when not speaking in public (photo by Ilan Kelman).

Caution, care, and balance

With caution and care, always seeking a balance with respect to privacy, any difficulties emerging from Open Science can be prevented. Of particular importance is not sacrificing the immense and much-needed gains from Open Science.