The international law firm playing a significant role in the UK Autodrive consortium, the largest of three consortia launched to support the introduction of self-driving vehicles into the UK, has published new research concluding that driverless vehicles will need to be programmed with a clear and agreed set of rules for decision-making.
In its report, The Moral Algorithm, the Gowling WLG legal practice finds that concerns over the so-called ‘trolley problem’, in which a vehicle must choose which of several individuals to hit, may have been exaggerated, with most of the experts interviewed agreeing that autonomous vehicles (AVs) will never be programmed to make such distinctions.
Nevertheless, the paper argues that harmonized safety regulations will be needed for other decisions, such as when it is permissible for a car to break the rules of the road, or how much ‘assertiveness’ a vehicle should display when interacting with other road users.
The report concludes with a series of eight recommendations, including: the creation of an independent regulator to balance the legality, safety and commerciality issues surrounding autonomous vehicles; the development of a policy on how the ‘moral algorithm’ will operate in major safety situations; and a program of public education and consultation.

The ‘Moral Algorithm’ study took the form of interviews with industry specialists and representatives from the UK Autodrive consortium during September and October 2016, as well as desktop research and analysis of publicly available information.
Commenting on the outcome of the research, Stuart Young, a partner at Gowling WLG and leader of the firm’s automotive sector team, said, “It is important not to equate regulation with a burden. It can, in fact, facilitate new markets and important developments. Unless completely new legislation is implemented that accommodates new products in advance of their being produced, regulatory uncertainty is likely to impose huge additional risks on the companies producing them.”
Speaking about the dilemmas that could be posed once cars are required to make complex decisions, Tim Armitage, Arup’s UK Autodrive project director, said, “As with any complex new technology, AVs cannot be specifically programmed to respond to every possible scenario. This simply isn’t practical when a machine is expected to interact with humans, in a complex environment, on a day-to-day basis.
“AVs will drive to the speed limits and will not be distracted from the task of safe driving; they will make practical decisions based on their programming, but they cannot be expected to make moral decisions for which society provides no agreed guidance. To allow AVs to demonstrate their capacity for practical decision-making in complex environments, and to begin to establish public trust through contact, the first step is to allow testing in relatively simple and well-defined environments. Of course, regulation will need to keep up, so, echoing Stuart’s sentiments, it is vital that the legal industry acts now to help create a realistic and viable route to market for AVs.”