Robot rights violate human rights, experts warn EU

By Alice Cuddy

Leading experts in robotics and artificial intelligence have warned the European Commission that plans to grant robots legal status are “nonsensical and non-pragmatic” — and that doing so could breach human rights. 

In an open letter, more than 150 experts in robotics, artificial intelligence, law, medical science and ethics, warned the Commission against approving a proposal that envisions a special legal status of “electronic persons” for the most sophisticated, autonomous robots.

“Creating a legal status of electronic ‘person’ would be ideological and nonsensical and non-pragmatic,” the letter says.

The group said the proposal, which was approved in a resolution by the European Parliament last year, is based on a perception of robots “distorted by science fiction and a few recent sensational press announcements.”

“From an ethical and legal perspective, creating a legal personality for a robot is inappropriate,” they argued, explaining that doing so could breach human rights law.

The experts said Europe should create rules for robotics and artificial intelligence that foster innovation, but also consider the “societal, psychological and ethical impacts.”

“The benefit to all humanity should preside over the framework for EU civil law rules in robotics and artificial intelligence,” the letter says.

The European Parliament resolution, which stresses that robots must serve humanity and not be used to cause damage, is part of efforts by the bloc to prepare for major advances in technology.

“Humankind stands on the threshold of an era when ever more sophisticated robots, bots, androids and other manifestations of artificial intelligence seem poised to unleash a new industrial revolution, which is likely to leave no stratum of society untouched, it is vitally important for the legislature to consider all its implications,” it says.