If robots are going to steal human jobs and otherwise disrupt society, they should at the very least pay taxes.
That’s the takeaway from a draft report on robotics produced by the European Parliament, which warns that artificial intelligence and increased automation present legal and ethical challenges that could have dire consequences.
“Within the space of a few decades [artificial intelligence] could surpass human intellectual capacity in a manner which, if not prepared for, could pose a challenge to humanity’s capacity to control its own creation and … the survival of the species,” the draft states.
The report offers a series of recommendations to prepare Europe for this advanced breed of robot, which it says now “seem poised to unleash a new industrial revolution.”
The proposal suggests that robots should have to register with authorities, and says laws should be written to hold machines liable for any damage they cause, including the loss of jobs. Contact between humans and robots should be regulated, with a special emphasis “given to human safety, privacy, integrity, dignity and autonomy.”
If advanced robots start replacing human workers in large numbers, the report recommends the European Commission force their owners to pay taxes or contribute to social security. The establishment of a basic income, or guaranteed welfare program, is also suggested as a protection against human unemployment.
Should robots ever become self-aware, the report suggests that the moral code outlined by science fiction writer Isaac Asimov be observed. Asimov’s laws stipulate that a robot must never harm a human being, must obey orders given by humans unless those orders would cause such harm, and must protect its own existence so long as doing so does not conflict with the first two rules.
The draft report, which was written by Mady Delvaux, a member of the European Parliament from Luxembourg, could go before the full European Parliament for a vote later this year. Its approval would be largely symbolic, however, since EU legislation must originate with the European Commission. The Commission did not respond to a request for comment on Wednesday.
In April, the European Parliament’s legal affairs committee held a hearing to discuss the issue.
“Can a robot express intention? I think the answer is very simple when it comes to noncomplex algorithms, but when it gets more complex, I think we have a problem,” Pawel Kwiatkowski of Gessel Law Firm said during the hearing.