What flying a drone above the Agung volcano in Bali teaches us about the computerisation of the earth

For the 100,000 or so people who had to leave their homes last month, and the equal number of travellers stuck on, or unable to get to, Bali, the eruption of the Agung volcano has been devastating.

But this has been a fascinating time for a scholar like me, who investigates the use of drones—unmanned and unwomanned aerial vehicles—in social justice, environmental activism, and crisis preparedness.

Amazon is developing drone delivery in the UK, and drone blood delivery is already happening in Rwanda. Here in Indonesia, people are using drones to monitor orangutan populations, map the growth and expansion of palm oil plantations, and gather information that might help us predict when volcanoes such as Agung might again erupt with devastating impact.

In Bali, I have the pleasure of working with a remarkable group of drone professionals, inventors, and hackers who work for Aeroterrascan, a drone company from Bandung, on the island of Java. As part of their corporate social responsibility, they have donated their time and technologies to the Balinese emergency and crisis response teams. It's been fascinating to participate in a project that flies remote sensing systems high in the air in order to better understand dangerous forces deep in the Earth.

I’ve been involved in two different drone volcano missions, and a third will begin in a few days. In the first, we used drones to create an extremely accurate 3D map of the volcano—accurate to within 20 cm. With this information, we could see whether the volcano was actually growing in size—key evidence that an eruption may be imminent.
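The logic of that comparison can be sketched in a few lines. This is a hypothetical illustration, not the team's actual pipeline: the elevation values, grid, and function name are invented, and the only number taken from the mission is the 20 cm accuracy figure. The idea is simply that two elevation maps from successive surveys are differenced, and only changes larger than the survey's accuracy count as real growth.

```python
# Hypothetical sketch (not Aeroterrascan's actual pipeline): comparing two
# digital elevation models (DEMs) from successive drone surveys to see
# whether the edifice is rising beyond the survey's 20 cm accuracy floor.
dem_before = [[2995.0, 3000.0],
              [3001.0, 3010.0]]  # metres above sea level (synthetic values)
dem_after  = [[2995.1, 3000.4],
              [3001.0, 3010.3]]

ACCURACY_M = 0.20  # survey accuracy: 20 cm; smaller changes are noise

def cells_showing_growth(before, after, noise_floor):
    """Count grid cells whose elevation rose by more than the noise floor."""
    count = 0
    for row_b, row_a in zip(before, after):
        for b, a in zip(row_b, row_a):
            if a - b > noise_floor:
                count += 1
    return count

growth = cells_showing_growth(dem_before, dem_after, ACCURACY_M)
print(f"cells rising beyond the 20 cm noise floor: {growth}")
```

In practice the grids are dense photogrammetric models with millions of cells, but the principle is the same: measured change only counts as evidence once it exceeds what the survey's accuracy can explain away.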

The second mission involved flying a carbon dioxide and sulphur dioxide sniffing sensor through the plume. An increase in these gases can tell us whether an eruption looms. The carbon dioxide readings were high, and that evidence informed the government's decision to raise the threat warning to the highest level.
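To make the reasoning concrete, here is a deliberately simplified sketch of how gas readings might map to an alert recommendation. The threshold numbers are invented for illustration; Indonesia's four volcano alert levels (Normal, Waspada, Siaga, Awas) are real, but actual decisions weigh many signals, not one sensor pass.

```python
# Hypothetical sketch: turning plume gas readings into an alert
# recommendation. Thresholds below are invented for illustration only.
ALERT_LEVELS = ["Normal", "Waspada", "Siaga", "Awas"]  # Indonesia's 4 levels

def recommend_alert(co2_ppm, so2_ppm):
    """Recommend an alert level from gas concentrations (illustrative)."""
    if co2_ppm > 5000 or so2_ppm > 50:
        return ALERT_LEVELS[3]  # Awas: the highest level
    if co2_ppm > 2000:
        return ALERT_LEVELS[2]  # Siaga
    if co2_ppm > 1000:
        return ALERT_LEVELS[1]  # Waspada
    return ALERT_LEVELS[0]

# A high CO2 reading pushes the recommendation to the top level.
print(recommend_alert(co2_ppm=6000, so2_ppm=10))
```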

In the forthcoming third mission, we will use drones to see whether anyone remains in the exclusion zone so that their evacuation can be facilitated.

What is interesting to me as an anthropologist is how scientists and engineers use technologies to better understand distant processes—in the atmosphere, deep below the earth, and at some point in the future. It has been a difficult task, flying a drone 3,000 metres to the summit of an erupting volcano. Several different groups have tried, and a few expensive drones have been lost—sacrifices to what Balinese Hindus consider a sacred mountain.

More philosophically, I am interested in better understanding the implications of having sensor systems flying about in the air, swimming under the seas, or positioned in volcanic craters—basically everywhere. These tools may help us evacuate people before a crisis, but they also entail transforming organic signals into computer code. We’ve long interpreted nature through technologies that augment our senses, particularly sight. Microscopes, telescopes, and binoculars have been great assets for chemistry, astronomy, and biology. But the sensorification of the elements is something different. This is the computerisation of the Earth, alluded to by Jennifer Gabrys and Benjamin Bratton. We’ve heard a lot about the internet of things, but this is the internet of nature. The present proliferation of drones is the latest step towards wiring everything on the planet—in this case, the air itself.

These flying sensors, it is hoped, will give Indonesian volcanologists what Stefan Helmreich, thinking through another realm of cosmic possibility, astrobiology, calls abduction, “an argument from the future” (2016: 15), with abductive reasoning being an inference from “premises that may or may not materialize in the future” (2016: 20). Abduction is something we humans, who have brought upon ourselves and all species long-term planetary calamities for immediate gain, are not very good at. This abductive reasoning depends on sensors that are very new—drone sensing is only a decade or so old, and even satellite imaging is barely 50—and capable of only a partial perspective. We are indeed a species still young in building multigenerational tools, concepts, and their applications.

There is something not fully comprehended—or, more ominously, not comprehensible—about how flying robots and self-driving cars equipped with remote sensing systems filter the world through big-data-crunching algorithms capable of generating and responding to their own artificial intelligence. These non-human others react to the world not as ecological, social, or geological processes but as functions and feature sets in databases. I am concerned with what this software-isation of nature will exclude. As these systems remake the world in their database images, what will the implications of those exclusions be for planetary sustainability and human autonomy?

In this future world, there may be less of a difference between engineering towards nature and the engineering of nature.