MOSCOW (Sputnik), Anastasia Levchenko – Lethal Autonomous Weapons Systems (LAWS), also known as "killer robots," pose a threat to world security mainly because the lack of human control over them violates the requirements of international law, a human rights watchdog told Sputnik on Wednesday.
"It is highly unlikely that killer robots would be able to comply with the basic requirements of international humanitarian law or international human rights law, many of which demand human judgment, not algorithms. Thus, they would pose grave dangers to civilian populations," Steve Goose, Director of the Arms Division at the Human Rights Watch (HRW) and Co-founder of the Campaign to Stop Killer Robots said.
According to Goose, killer robots could spread rapidly around the world and thus get into the hands of human rights abusers, destabilizing international security and setting off a robotic arms race.
"By taking the human soldier off the battlefield, they could increase the likelihood of armed attacks and of war, and shift the burden of conflict to civilians," Goose said.
"Not a single state this week has said that it is pursuing fully autonomous weapons, or that its military has a definite need for them. Some states, though, have said that the option should be left open to acquire such weapons, notably Israel and the United States," Goose told Sputnik commenting on the Geneva conference seeking to work out the technical and legal aspects of LAWS production.
Most of the conference participants believe that "killer robots" are inevitable, as it would be impossible to stop the emerging trend toward weapons autonomy from culminating in fully autonomous weaponry, according to Goose.
"And if even one nation acquires them, others will feel compelled to follow suit," the expert elaborated, stressing, however, that although LAWS is capable of outperforming humans in terms of speed, accuracy, endurance and other traits, it is not a sufficient enough reason to promote their spread.
"…most of those advantages would also be true of systems with some degree of autonomy, but where there are still humans in the loop, making the key determinations about what to target and when to fire, rather than delegating those functions to machines. Moreover, the many negative aspects of LAWS outweigh any potential advantages," Goose explained.
LAWS are defined as weapon systems designed and built to select and fire upon targets without human intervention.
The potential development of fully autonomous weapons has raised concerns over the incompatibility of such new technologies with the human ability to value life.
The Campaign to Stop Killer Robots seeks to ban or restrict the use of autonomous weapons over fears they could lead to unjustifiable injury and deaths.