The report by Human Rights Watch demands that a human always retain final control over machines that make life-and-death decisions, so that moral and legal responsibility is preserved.
Even the most skilled artificial intelligence would be unable to assess the ethical implications of killing a wounded combatant or to weigh the risk of civilian deaths against a military objective, the report argues.
Furthermore, mistakes could not be punished in the courts, as a robot cannot be put on trial.
The warning, issued to representatives of countries taking part in the Convention on Conventional Weapons this month, comes at a time when the technology is highly controversial. Advances in civilian applications such as self-driving cars have been mirrored by enhanced battlefield capabilities for drones and other robotic weapons systems.
At least 27 states support the use of autonomous fighting machines, which could be more precise and consistent than those controlled by humans. On the other hand, nine countries have demanded that such technology be banned.
The report highlights the risk of handing the decision-making process over to computers. “After releasing such weapons, humans cannot control where they go or whom they kill. This lack of control can lead to unintended victims, which underlies many of the objections to biological and chemical weapons,” it says.
Concern about weapons systems that operate without human control is not new: in the early 1900s a disarmament measure under the Hague Convention prohibited the use of mines that could explode when touched by a passing ship.