Study Warns of Risks in Deploying AI Systems for Humanitarian Purposes

A new study conducted by Professor Ana Beduschi from the University of Exeter Law School has raised concerns about the deployment of AI systems in humanitarian contexts. While AI has proven to be a valuable tool for humanitarians in monitoring and anticipating risks, the study warns that certain uses of AI may expose affected populations to additional harm and pose significant challenges to the protection of their rights.

Humanitarian organizations have increasingly embraced AI technology, with the COVID-19 pandemic accelerating this trend. Examples include the use of AI-supported disaster mapping in Mozambique to expedite response efforts and the World Bank’s rollout of AI systems to predict food crises across twenty-one countries.

However, the study emphasizes the need for safeguards to prevent AI systems from becoming tools of exclusion for populations in need of assistance. It highlights the importance of respecting and protecting data privacy, as well as ensuring that AI systems are designed to minimize risks of harm. Data protection impact assessments should be conducted to understand the potential negative impacts of these technologies.

The study also calls for the establishment of grievance mechanisms, allowing individuals to challenge decisions made by AI systems that adversely affect them. It further stresses the need for transparency in AI decision-making and for tackling algorithmic bias.

Professor Beduschi underlines that while AI systems can provide valuable insights by analyzing large amounts of data, their deployment in humanitarian contexts carries risks for affected populations. Poor data quality, algorithmic bias, lack of transparency, and data privacy concerns all demand careful attention.

The humanitarian imperative of "do no harm" should guide any decision to deploy AI systems. In some cases, the study concludes, it may be more prudent not to rely on AI technologies at all, as they could cause additional harm to civilian populations.