Thursday, November 2, 2023

Jordanian World Bank Aid Program Algorithm Fails the Poorest

Why Algorithmic Decision-Making in Poverty Relief Programs Can Be Harmful: A Human Rights Watch Report

A new report by Human Rights Watch has found that the algorithmic decision-making used in the Unified Cash Transfer Program, a poverty relief program in Jordan, is failing the very people it is intended to help. The program, known as Takaful, was put in place by the Jordanian government with the support of the World Bank. The algorithm relies on stereotypes and faulty assumptions about poverty, producing an inaccurate and unreliable picture of applicants' finances. The formula flattens the economic complexity of people's lives into a crude ranking that pits one household against another, fueling social tension and perceptions of unfairness.

The Takaful program is meant to solve a real problem: the World Bank provided the Jordanian state with a multibillion-dollar poverty relief loan, but the loan cannot cover all of Jordan's needs. Without enough cash to cut every needy Jordanian a check, Takaful analyzes each applicant's household income and expenses, along with nearly 60 socioeconomic indicators such as electricity use, car ownership, business licenses, employment history, illness, and gender. These responses are then ranked, using a secret algorithm, to automatically determine which applicants are the poorest and most deserving of relief.
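The actual formula is secret, but systems of this kind are typically proxy means tests: each indicator is assigned a weight, the weighted values are summed into a welfare score, and households are sorted by that score. A minimal sketch of the general approach, with hypothetical indicators and weights (the real ones are undisclosed):

```python
# Hypothetical proxy-means-test ranking. The real Takaful indicators
# and weights are secret; these names and numbers are illustrative only.
from dataclasses import dataclass

# Lower welfare score = ranked as poorer = higher priority for aid.
WEIGHTS = {
    "owns_car": 4.0,              # asset ownership raises the score
    "monthly_electricity_kwh": 0.01,
    "has_business_license": 3.0,
    "household_size": -0.5,       # larger households score as poorer
}

@dataclass
class Household:
    applicant_id: str
    indicators: dict

def welfare_score(h: Household) -> float:
    """Weighted sum of socioeconomic indicators (illustrative)."""
    return sum(WEIGHTS.get(k, 0.0) * v for k, v in h.indicators.items())

def rank_applicants(households: list[Household]) -> list[Household]:
    """Sort applicants from 'poorest' to 'least poor' by score."""
    return sorted(households, key=welfare_score)

applicants = [
    Household("A", {"owns_car": 1, "monthly_electricity_kwh": 300,
                    "has_business_license": 0, "household_size": 6}),
    Household("B", {"owns_car": 0, "monthly_electricity_kwh": 120,
                    "has_business_license": 0, "household_size": 3}),
]
for h in rank_applicants(applicants):
    print(h.applicant_id, round(welfare_score(h), 2))
```

Even in this toy version, the structural problem the report describes is visible: a household that owns an unusable car or runs a fan all summer is pushed down the priority list, no matter how little cash it actually has.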

However, the Human Rights Watch investigation found that car ownership seems to be a disqualifying factor for many Takaful applicants, even if they are too poor to buy gas to drive the car. Similarly, applicants are penalized for using electricity and water based on the presumption that their ability to afford utility payments is evidence that they are not as destitute as those who can’t. The report explains that sometimes electricity usage is high precisely for poverty-related reasons.
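Whether Takaful applies an explicit exclusion rule or simply weights car ownership heavily is not public, but a hard cutoff of the kind applicants describe could be as simple as this hypothetical check:

```python
# Illustrative only: how a single binary indicator can act as a hard
# cutoff. HRW reports car ownership appears disqualifying even when the
# owner cannot afford fuel; the actual mechanism inside Takaful is secret.
def is_eligible(indicators: dict) -> bool:
    if indicators.get("owns_car"):
        return False  # excluded regardless of income, fuel costs, or debt
    return True
```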

Beyond the technical problems with Takaful itself are the knock-on effects of digital means-testing. The report notes that many people in dire need of relief money lack the internet access even to apply for it, requiring them to find, or pay for, a ride to an internet café, where they face further fees and charges to get online. Takaful's rigid insistence that applicants' self-reported income exactly match their self-reported household expenses forces people to simply fudge the numbers so that their applications will even be processed, undermining the algorithm's illusion of objectivity.
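A sketch of the kind of rigid validation the report describes; the function name and exact rule are assumptions, but the effect is the one applicants report: honest answers fail, adjusted ones pass.

```python
# Hypothetical validation gate: applications are rejected unless
# self-reported income exactly equals self-reported expenses.
def validate_application(reported_income: float, reported_expenses: float) -> bool:
    # No tolerance for informal income, debt, or month-to-month variation.
    return reported_income == reported_expenses

# An honest applicant earning 220 JOD but spending 260 JOD (covered by
# debt or family support) fails; "fudging" expenses down to 220 passes.
assert not validate_application(220.0, 260.0)
assert validate_application(220.0, 220.0)
```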

The report, based on 70 interviews with Takaful applicants, Jordanian government workers, and World Bank personnel, emphasizes that the system is part of a broader World Bank push to promote algorithmically means-tested social benefits over universal programs across developing economies in the so-called Global South. The Jordanian National Aid Fund, which administers the system, declined to disclose the full list of indicators and the specific weights assigned, saying that these were for internal purposes only and "constantly changing."

The people who designed Takaful insist on its fairness and functionality, yet they refuse to let anyone look under the hood. While fantastical visions of "Terminator"-like artificial intelligence have come to dominate public fears around automated decision-making, technologists argue that civil society ought to focus on the real, current harms caused by systems like Takaful, not nightmare scenarios drawn from science fiction. So long as the workings of Takaful and its ilk remain government and corporate secrets, the extent of those harms will remain unknown.
