
Banning Killer Robots is 'No Longer an Option', We Must Now Focus on Control, Argues Futurist

Autonomous killing machines are becoming a growing reality worldwide, one which has sparked intense debate and prompted an increasing number of governments and civil society organisations to call for a ban on the use of the technology.
Sputnik

There is growing support worldwide for an outright ban on the use of fully autonomous weapons, also known as killer robots, according to a new report from Human Rights Watch (HRW). In its latest report, HRW details the publicly known positions of 49 countries on the development and use of killer robots, following a global campaign to ban the technology that was first launched by an international coalition of civil society and other groups in 2012.

Abishur Prakash is a geopolitical futurist at the Center for Innovating the Future (CIF), which focuses on the future of business and geopolitics. His latest book, The Age of Killer Robots, looks at the risks associated with societies handing power over to algorithms and machines. Mr Prakash tells Sputnik that it is too late to ban automated killing machines and that the focus now needs to be on controlling how they work.

Sputnik: What do you think about the push to ban automated killer robots (autonomous weapons), is this really necessary?

Abishur Prakash: This is an outdated approach. The reality is that the key "decision makers" — the US, Russia, China, India, Japan, Israel, and the United Kingdom — are all moving forward with autonomous weapons. For example, later this month, the US will pit an AI-piloted fighter jet against a manned fighter jet in a simulation to see which performs better. In Russia, researchers are developing systems that will allow soldiers to give voice commands to robots. The groundwork for killer robots is being laid down right now. Banning is no longer an option.

Now, the focus has to turn to control. How does the world control the behavior of killer robots? A lot of this I discuss in my new book, "The Age of Killer Robots," where issues like ethics and ideas like "central brains" are explored in-depth.

Sputnik: How realistic do you think this initiative is and what is your reaction to the fact that 30 countries have called for an outright ban?

Abishur Prakash: I can understand the fear around killer robots that is driving calls for bans. These are weapons that will make decisions without human input. These decisions could lead to wars and conflicts and loss of human life. The fact that 30 countries, including Mexico, Pakistan and Brazil, are backing a ban shows that this fear is shared by governments around the world.

Most surprising to me was that China was one of the 30 countries calling for a ban, considering Beijing is moving at light speed on killer robots. By the 2030s, China wants to have autonomous, armed submarines that can conduct entire missions on their own, including suicide attacks on enemy targets. And, today, China is exporting drones, dubbed "Blowfish," that are armed and carry AI that allows them to operate autonomously.

This means that some governments are operating along two fault lines when it comes to killer robots: what they say publicly and what they do privately.

And, the reality is, regardless of what a nation says, if its adversary has killer robots, it will too. It's a domino effect. If India has killer robots, so will Pakistan and China. If China has killer robots, so will the US and parts of Western Europe. If the US has killer robots, so will Russia. The list goes on.

No matter what the stance of a nation is on killer robots, geopolitics is going to push them in an entirely different direction.

Sputnik: What is at stake if this push to ban killer robots is unsuccessful?

Abishur Prakash: In short, a new status quo. From aerial drones to submarines to warships to tanks, every aspect of warfare will be managed by AI and robots. This takes humans out of the equation. It means that algorithms, from defense companies or tech firms, will be in charge of deciding whom to attack, how, and when. And humans will be dealing with the blowback of decisions that autonomous weapons make. For the first time, technology will be making defense decisions and humans will be playing catch-up.

In this status quo, chaos, volatility and uncertainty become the new normal. The world will wake up to a reality where there is uncertainty as to what decisions killer robots will make on a given day. How will investors react to geopolitical uncertainty? How will governments decide where to deploy their militaries? How will nations build relations in this environment?

There is no precedent or playbook to fall back on. This is an entirely new era. And, this is why governments must start to develop a new kind of foreign policy and defense strategy that adapts to the new geopolitical future that is emerging.

Sputnik: Any other comments about HRW's latest report?

Abishur Prakash: First, there are no real, concrete proposals as to what should take place if a ban is unsuccessful (or even how to implement a ban). The report echoes several years of consensus between countries that "action" has to be taken around killer robots. Yet what this action is (beyond a ban) remains unclear. This has to change, otherwise governments will start to view initiatives around killer robots as more of an "echo chamber" than a forum where concrete action is taken. This is dangerous. In the future, when real policy has to be passed, such as on the export of killer robots, governments may enter these discussions expecting them to fail.

Second, there remains almost no discussion as to the role that private technology companies are playing (and will increasingly play) in the development of killer robots. Technology companies are the future defense companies (another area that I discuss in my book). They are the ones that may build the brains and bodies for killer robots. They are as much of a stakeholder and voice as countries and yet they are being left out of the conversation. Without them, real, substantive policies cannot be designed, let alone passed.
