The European Parliament’s Legal Affairs Committee has urged the EU Commission to put forward rules for the fast-evolving field of robotics, settling issues such as compliance with ethical standards and liability for accidents involving driverless cars.
The MEPs’ report looks at robotics-related issues, such as liability, safety, and changes in the labor market. The Committee stresses that EU-wide rules are needed to fully exploit the economic potential of robotics and artificial intelligence (AI), and guarantee a standard level of safety and security. The report argues that the EU needs to take the lead on regulatory standards, so as not to be forced to follow those set by third states. MEPs urge the Commission to consider creating a new European agency for robotics and artificial intelligence to supply public authorities with technical, ethical and regulatory expertise. They also propose a voluntary ethical conduct code to regulate who would be accountable for the social, environmental, and human health impacts of robotics. The proposed Code of Ethical Conduct should recommend that designers include ‘kill’ switches, so that robots can be turned off in emergencies.
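The report does not prescribe how such a ‘kill’ switch should be built; purely as an illustration of the idea, the following minimal Python sketch shows one common pattern, in which a control loop checks an emergency-stop flag on every cycle and shuts the machine down safely once it is set. All names and details here are hypothetical and are not drawn from the report.

```python
import threading
import time

class RobotController:
    """Illustrative sketch of a control loop with an emergency 'kill' switch."""

    def __init__(self):
        # Flag that an operator or safety monitor can set at any time.
        self._kill = threading.Event()

    def kill(self):
        """Emergency stop: request an immediate halt of the control loop."""
        self._kill.set()

    def step(self):
        """Placeholder for one cycle of normal robot actuation."""
        pass

    def safe_shutdown(self):
        print("Kill switch engaged: actuators disabled.")

    def run(self):
        # Check the kill flag on every control cycle.
        while not self._kill.is_set():
            self.step()
            time.sleep(0.01)
        self.safe_shutdown()

# Usage: run the controller in a background thread, then trigger the kill switch.
controller = RobotController()
worker = threading.Thread(target=controller.run)
worker.start()
time.sleep(0.1)
controller.kill()
worker.join()
```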
MEPs note that harmonized rules are most urgently needed for self-driving cars. They call for an obligatory insurance scheme and a fund to ensure that victims are fully compensated in accidents caused by driverless cars. Section 15 of the report identifies the automotive sector as the one in most urgent need of European and global rules to ensure the cross-border development of automated vehicles, so as to fully exploit their economic potential and benefit from the positive effects of technological trends. The report emphasizes that fragmented regulatory approaches would hinder implementation and jeopardize European competitiveness. It also notes that, although the current private international law rules on traffic accidents applicable within the EU do not urgently need to be modified to accommodate autonomous vehicles, simplifying the current dual system for defining applicable law would improve legal certainty.
The Committee also raises the possibility of creating, in the long term, a specific legal status of ‘electronic persons’ for the most sophisticated autonomous robots, so as to clarify responsibility in cases of damage. The report suggests Isaac Asimov’s ‘Three Laws of Robotics’ as general principles that designers and producers of robotics should abide by. Devised by the science fiction author in his 1942 short story ‘Runaround’, the Laws are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm;
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law;
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
Rapporteur Mady Delvaux (Luxembourg) said: “A growing number of areas of our daily lives are increasingly affected by robotics. In order to address this reality and to ensure that robots are and will remain in the service of humans, we urgently need to create a robust European legal framework.”