The first of the Sustainable Development Goals refers directly to the need to “eradicate poverty in all its forms worldwide”. According to United Nations data, 836 million people live in extreme poverty, and one in five people in developing countries survives on less than 1.25 dollars per day. How can technology help detect poverty around the world and so contribute to eradicating it?
We already know of many applications of recently introduced technologies, such as drones and Big Data, across widely varying aspects of sustainable development: unmanned aerial vehicles are used for everything from agricultural monitoring to fighting forest fires, while Big Data enables the gathering of all kinds of information about the weather, the economy, social welfare, employment and more.
Satellites for uncovering poverty in remote areas
One of the most recent successes in studying poverty with new technologies is the use of satellites to detect depressed, difficult-to-access areas so that governments can better target their efforts and resources, whether economic, material or human.
What kind of satellite-gathered data helps detect poverty? Images showing unsurfaced roads, extensive plantations and livestock operations, water resources, or the condition of housing roofs in towns, cities and the countryside… all of these are factors that help discern whether an area is prosperous or poor.
Researchers at Stanford University launched a project that uses satellites to gather all kinds of data. Using powerful algorithms, they created maps of developing countries highlighting places that were difficult to reach by land and urgently needed help.
Neal Jean, one of the authors of the study, commented: “Surveys are the traditional method for gathering poverty data, but reliable surveys are rarely carried out in most developing countries. Other data, collected systematically but for different purposes, can also be used to shed light on poverty and wealth.”
Millions of images at night and day
Compared with older studies that tried to measure local poverty from satellites, the Stanford project achieved a breakthrough by comparing data from night-time and daytime images. Studying several regions of Sub-Saharan Africa, the team combined images collected at different times of day and night to generate ‘impact maps’ that automatically differentiate between areas of poverty and extreme poverty.
The mapping technique draws on millions of high-resolution satellite images, analyzed by researchers using machine learning, a branch of artificial intelligence in which raw data and the objective of the study are fed into a computational model that then works out how to solve the problem without direct human intervention. “Basically, we provided the machine-learning system with daytime and night-time satellite imagery and asked it to make predictions on poverty,” explained Professor Stefano Ermon. “The system essentially learned how to solve the problem by comparing those two sets of images.”
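The two-step idea Ermon describes can be sketched in simplified form: night-time light intensity serves as a cheap proxy for economic activity, so a model first learns to predict it from daytime-image features, and that learned signal is then calibrated against the scarce ground-truth survey data. The sketch below is a hypothetical, heavily simplified stand-in for the Stanford pipeline (which used deep convolutional networks on real imagery); all data here is synthetic and the feature names are illustrative assumptions.

```python
# Hypothetical sketch of proxy-based learning: night lights stand in
# for wealth labels during training. All data below is synthetic.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# Synthetic "daytime image features" for 500 regions (e.g. road density,
# roof quality, vegetation) -- stand-ins for CNN-extracted features.
day_features = rng.normal(size=(500, 10))

# Night-time light intensity: assumed to correlate with economic activity.
true_weights = rng.normal(size=10)
night_lights = day_features @ true_weights + rng.normal(scale=0.1, size=500)

# Step 1: learn to predict night lights from daytime features.
proxy_model = Ridge(alpha=1.0).fit(day_features, night_lights)
proxy = proxy_model.predict(day_features)

# Step 2: calibrate the proxy against sparse ground-truth survey data
# (only 50 of the 500 regions are assumed to have been surveyed).
survey_idx = rng.choice(500, size=50, replace=False)
wealth = 2.0 * night_lights + rng.normal(scale=0.2, size=500)  # synthetic truth

wealth_model = Ridge(alpha=1.0).fit(proxy[survey_idx, None], wealth[survey_idx])
predicted = wealth_model.predict(proxy[:, None])

r = np.corrcoef(predicted, wealth)[0, 1]
print(f"correlation between predictions and wealth: {r:.2f}")
```

The point of the two steps is that night-light data is globally available while household surveys are not, so the abundant proxy does most of the training work and the scarce surveys are needed only for the final calibration.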
An everyday object in developed countries, such as a smartphone, can also become a great ally of sustainability in detecting poverty. A study published in Science showed how anonymized metrics from mobile phone use in developing countries – the pilot study took place in Rwanda – could shed light on social relations, movement patterns and the use and consumption of goods and services. The large quantity of accumulated data reveals a new picture of poverty in the areas studied.
Neal Jean believes “combining data from telephone records and satellite images” could help estimate poverty with a high degree of certainty in areas of the planet where the usual studies cannot be carried out, allowing aid to be delivered to those who need it most at any given moment.
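The combination Jean suggests can be illustrated with a toy experiment: fit the same simple model on satellite-derived features alone, phone-record features alone, and the two concatenated, and compare cross-validated accuracy. Everything below is a hedged sketch on synthetic data; the feature names are assumptions, not the variables used in the actual studies.

```python
# Hypothetical sketch: combining satellite and phone-record features
# to estimate a poverty index. All data is synthetic and illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 400

satellite = rng.normal(size=(n, 5))  # e.g. night lights, roof types, roads
phone = rng.normal(size=(n, 5))      # e.g. call volume, mobility, top-ups

# Synthetic poverty index influenced by both data sources.
poverty = satellite[:, 0] + phone[:, 0] + rng.normal(scale=0.5, size=n)

scores = {}
for name, X in [("satellite only", satellite),
                ("phone only", phone),
                ("combined", np.hstack([satellite, phone]))]:
    # 5-fold cross-validated R^2 for a plain linear model.
    scores[name] = cross_val_score(LinearRegression(), X, poverty, cv=5).mean()
    print(f"{name}: R^2 = {scores[name]:.2f}")
```

Because each data source carries independent signal in this toy setup, the combined feature set scores higher than either source alone, which is the intuition behind merging phone records with satellite imagery.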