Open@UCL Blog

Ethics of Open Science: Managing dangers to scientists

By Kirsty, on 5 February 2025

Guest post by Ilan Kelman, Professor of Disasters and Health, building on his captivating presentation in Session 2 of the UCL Open Science Conference 2024.

Open Science brings potential dangers to scientists, along with ways of managing those dangers. In doing so, opportunities emerge to show the world the harm some people face, such as the murder of environmental activists or the child sexual abuse uncovered by investigations, hopefully leading to positive action to counter these problems.

Yet risks can appear for scientists. Even doing basic climate change science has led to death threats. Two examples in this blog indicate how to manage dangers to scientists.

Disaster diplomacy

Disaster diplomacy research examines how and why disaster-related activities—before, during, and after a disaster—do and do not influence all forms of conflict and cooperation, ranging from open warfare to signing peace deals. So far, no example has been identified in which disaster-related activities, including a major calamity, led to entirely new and lasting conflict or cooperation. An underlying reason to favour enmity or amity is always found, with disaster-related activities being one reason among many to pursue already decided politics.

The 26 December 2004 tsunamis around the Indian Ocean devastated Sri Lanka and Aceh in Indonesia, both of which had been wracked by decades of violent conflict. On the basis of ongoing, secret negotiations, spurred along by the post-earthquake/tsunami humanitarian effort, a peace deal was reached in Aceh and it held. Simultaneously in Sri Lanka, the disaster relief was deliberately used to continue the conflict, which was eventually ended by military means. In both locations, the pre-existing desire for peace and for conflict, respectively, produced the witnessed outcome.

This disaster diplomacy conclusion is the pattern for formal processes, such as politicians, diplomats, celebrities, businesses, non-governmental organisations, or media leading the work. It is less certain for informal approaches: individuals helping one another in times of need or travelling to ‘enemy states’ as tourists or workers—or as scientists.

Openly publishing on disaster diplomacy could influence conflict and cooperation processes by suggesting ideas which decision-makers might not have considered. Or it could spotlight negotiations which detractors seek to scuttle. If a scientist had published on the closed-door Aceh peace talks, the result might have emulated Sri Lanka. The scientist would then have endangered a country as well as themselves by being blamed for perpetuating the violence.

Imagine South Korea’s President, seeking a back door to reconciliation with North Korea, sending flood engineers and scientists to Pyongyang who regularly update their work online. They make social gaffes, embarrassing South Korea, or are simply arrested and made scapegoats on the whim of North Korea’s leader, who is fed up with the world seeing what North Korea lacks. The scientists and engineers are endangered as much as the reconciliation process.

Open Science brings disaster diplomacy opportunities by letting those involved know what has and has not worked. It can lead to situations in which scientists are placed at the peril of politics.

Figure 1: Looking across the Imjin River into North Korea from South Korea (photo by Ilan Kelman).

Underworlds

Scientists study topics in which people are in danger, such as child soldiers, human trafficking, and political movements or sexualities that are illegal in the country being examined. The scientists can be threatened as much as the people being researched. In 2016, a PhD student based in the UK who was researching trade unions in Cairo was kidnapped, tortured, and murdered.

In 2014, a PhD student based in the UK was one of a group placed on trial in London for ‘place-hacking’ or ‘urban exploring’ (urbex), in which they enter or climb disused or under-construction infrastructure. Aside from potentially trespassing, these places are often closed for safety reasons. The scientist places themselves in danger to research this subculture on-site, in action.

All these risks are manageable and they are managed. Any such research in the UK must go through a rigorous research ethics approval process alongside a detailed risk assessment. This paperwork can take months, to ensure that the dangers have been considered and mitigated, although when conducted improperly, the process itself can be detrimental to research ethics.

Many urbex proponents offer lengthy safety advice and insist that activities be conducted legally. Researchers also should not necessarily shy away from hard subject matter just because a government dislikes the work.

Open Science publishing on these topics can remain ethical by ensuring anonymity and confidentiality of sources as well as not publishing when the scientist is in a place where they could be in danger. This task is not always straightforward. Anonymity and confidentiality can protect criminals. Scientists might live and work in the country of research, so they cannot escape the danger. How ethical is it for a scientist to be involved in the illegal activities they are researching?

Figure 2: The Shard in London, a desirable place for ‘urban exploring’ when it was under construction (photo by Ilan Kelman).

Caution, care, and balance

Balance is important between publishing Open Science on topics involving dangers and not putting scientists or others at unnecessary peril while pursuing the research and publication. Awareness of the potential drawbacks of doing the research and of suitable research ethics, risk assessments, and research monitoring can instil caution and care without compromising the scientific process or Open Science.

Ethics of Open Science: Managing dangers to the public

By Kirsty, on 17 December 2024

Guest post by Ilan Kelman, Professor of Disasters and Health, building on his captivating presentation in Session 2 of the UCL Open Science Conference 2024.

Open Science brings risks and opportunities when considering dangers to the public from conducting and publishing science. Opportunities mean detailing societal problems and responses to them, which could galvanise positive action to make society safer. Examples are the effectiveness of anti-bullying techniques, health impacts from various paints, and companies selling cars they knew were dangerous.

Risks emerge if pursuing or publicising research might change or create dangers to the public. Highlighting how pickpockets or street scams operate helps the public protect themselves, yet could lead the perpetrators to change their operations, making them harder to detect. Emphasising casualties from cycling could lead to more driving, increasing the health consequences from air pollution and vehicle crashes.

The latter might be avoided by comparing cycling’s health benefits and risks, including with respect to air pollution and crashes. Meanwhile, understanding pickpocketing awareness and prevention should contribute to reducing this crime over the long-term, if people learn from the science and take action.

In other words, context and presentation matter for risks and opportunities from Open Science regarding dangers to the public. Sometimes, though, the context is that science can be applied nefariously.

Explosives research

Airplane security is a major concern for travellers, with most governments implementing stringent measures at airports and in the air. Legitimate research questions for public safety relate to smuggling firearms through airport security and the bomb resistance of different aircraft.

Fiction frequently speculates, including in movies. A Fish Called Wanda showed a loaded gun getting past airport security screening while Non-Stop portrayed a bomb placed aboard a commercial flight.

Desk analyses could and should discuss these scenes’ dramatic effect and level of realism, just as the movies are analysed in other ways. Scientists could and should work with governments, security organisations, airport authorities, and airline companies to understand threats to aviation and how to counter them.

Open Science could compromise the gains from this collaboration. It could reveal the bomb type required to breach an aircraft’s fuselage or the key ways to get a weapon on board. The satirical news service, The Onion, lampooned the presumption of publicising how to get past airport security.

Figure 1: We should research a cargo hold’s explosion resistance, but why publicise the results? (photo by Ilan Kelman).

Endangering activists

The public can endanger themselves by seeking Open Science. I ran a research project examining corporate social responsibility for Arctic petroleum, with examples in Norway and Russia. At one Russian site, locals showed our researcher decaying oil and gas infrastructure, including leaks. These local activists were assured of confidentiality and anonymity, which is a moral imperative as well as a legal requirement.

Not all of them supported this lack of identification. They preferred entirely Open Science, hoping that researchers outside of Russia would have the credibility and influence to generate action for making their community and environment safer and healthier. They were well aware of the possible consequences of being identified, or of enough information being publicised to make them identifiable. They were willing to take these risks, hoping for gain.

Figure 2: Trinity Tower, the Kremlin, Moscow, Russia during petroleum research (photo by Ilan Kelman).

We were not permitted to accede to their requests. We certainly published on and publicised our work, using as much Open Science as we could without violating our research ethics approval, as both an ethical and legal duty. We remain inspired and concerned that the activists, seeking to save their own lives, could pursue citizen science which, if entirely open as some of them would prefer, could place them in danger.

Caution, care, and balance

Open Science sometimes brings potential dangers to the public. Being aware of and cautious about these problems means being able to prevent them. Then, a balance can be achieved between needing Open Science and not worsening or creating dangers.

Ethics of Open Science: Privacy risks and opportunities

By Kirsty, on 22 November 2024

Guest post by Ilan Kelman, Professor of Disasters and Health, building on his captivating presentation in Session 2 of the UCL Open Science Conference 2024.

Open Science brings risks and opportunities regarding privacy. Making methods, data, analyses, disagreements, and conclusions entirely publicly available demonstrates the scientific process, including its messiness and uncertainties. Showing how much we do not know and how we aim to fill in gaps excites and encourages people about science and scientific careers. It also holds scientists accountable, since any mistakes can be identified and corrected, which is always an essential part of science.

Given these advantages, Open Science offers so much to researchers and to those outside research. It helps to make science accessible to anyone, notably for application, while supporting exchange with those inspired by the work.

People’s right to privacy, as an ethical and legal mandate, must still be maintained. If a situation might be worsened by Open Science not respecting privacy, even when doing so is legal, then care is required to respect those who want or might deserve privacy. Anonymity and confidentiality are part of research ethics precisely to achieve a balance. Regardless, Open Science might inadvertently reveal information sources, or it could be feasible to identify research participants who would prefer not to be exposed. Being aware of possible pitfalls assists in preventing them.

Disaster decisions

Some research could be seen as violating privacy. Disaster researchers seek to understand who dies in disasters, how, and why, in order to improve safety for everyone and to save lives. The work can examine death certificates and pictures of dead bodies. Publicising all this material could violate the privacy and dignity of those who perished and could augment the grief of those left behind.

Sometimes, research homes in on problematic actions in order to improve without blaming, whereas society more widely might seek to judge. A handful of studies has examined the blood alcohol level of drivers who died while driving through floodwater, which should never be attempted even when sober (Figure 1). In many cases, the driver was above the legal limit for blood alcohol level. Rather than embarrassing the deceased by naming-and-shaming, it would help everyone to use the data as an impetus to tackle simultaneously the separate and unacceptable decisions to drive drunk, to drive drugged, and to drive through floodwater.

Yet storytelling can be a powerful communication technique to encourage positive behavioural change. If identifying details are used, then it must involve the individuals’ or their kin’s full and informed consent. Even with this consent, it might not be necessary to provide the full details, as a more generic narrative can remain emotional and effective. Opportunities for improving disaster decisions emerge in consensual sharing, so that it avoids violating privacy—while also being careful regarding the real need to publish the specifics of any particular story.

Figure 1: Researching the dangerous behaviour of people driving through floodwater, with the number plate blurred to protect privacy (photo by Ilan Kelman).

Small sample populations

Maintaining confidentiality and anonymity for interviewees can be a struggle where interviewees have comparatively unique experiences or positions and so are easily identifiable. Governments in jurisdictions with smaller populations might employ only a handful of people in the entire country who know about a certain topic. Stating that an interviewee is “A national government worker in Eswatini specialising in international environmental treaties” or “A megacity mayor” could narrow it down to a few people or to one person.

A similar situation arises with groups comprising a small number of people from whom to select interviewees, such as “vehicle business owners in Kiruna, Sweden”, “International NGO CEOs”, or specific elites. Even with thousands of possible interviewees, for instance “university chiefs” or “Olympic athletes”, quotations from the interview or locational details might make it easy to narrow down and single out a specific interviewee.

Interviewee identification can become even simpler when basic data on interviewees, such as sex and age range, are provided, as is standard in research papers. Providing interview data in a public repository is sometimes expected, with the possibility of full transcripts, so that others can examine and use those data. The way someone expresses themselves might make them straightforward to pinpoint within a small group of potential interviewees.

Again, risks and opportunities regarding privacy focus on consent and on the necessity of listing details. Everyone, including any public figure, has some right to privacy (Figure 2). Where consent is not given to waive confidentiality or anonymity, the research process, including reviewing and publishing academic papers, needs to accept that not all interviewee details or data can or should be shared. With consent, care is still required to ensure that identifying individuals, or permitting them to be discovered, really adds to the positive impacts from the research.

Figure 2: Ralph Nader, an American politician and activist, still has a right to privacy when not speaking in public (photo by Ilan Kelman).

Caution, care, and balance

With caution and care, always seeking a balance with respect to privacy, any difficulties emerging from Open Science can be prevented. Of especial importance is not sacrificing many of the immense and much-needed gains from Open Science.