Users of autonomous cars should not be legally responsible for road safety, a legal watchdog in the UK has proposed.
They should be classified as "users-in-charge" rather than drivers and would be exempt from responsibility for infringements such as dangerous driving, exceeding the speed limit, or running a red light. Instead, the carmakers would be liable in these cases.
However, the user-in-charge would retain responsibility for duties such as carrying insurance, checking loads and ensuring that children wear seat belts.
The new guidelines were proposed jointly by the Law Commission of England and Wales and the Scottish Law Commission in a report released on Wednesday.
"The development of self-driving vehicles in the UK has the potential to revolutionise travel, making everyday journeys safer, easier and greener," said junior transport minister Trudy Harrison.
"However, we must ensure we have the right regulations in place, based upon safety and accountability, in order to build public confidence.
"That’s why the Department funded this independent report and I look forward to fully considering the recommendations and responding in due course".
Scottish Law Commissioner David Bartos said the proposed laws would ensure "safety and accountability while encouraging innovation and development".
Confusion over the definitions of self-driving and driver-assisted
The commission also said that there should be a clear distinction between self-driving and driver-assisted vehicles.
"The distinction between driver assistance and self-driving is crucial. Yet many drivers are currently confused about where the boundary lies. This can be dangerous," it wrote in the summary of the report.
The commission continued: "This problem is aggravated if marketing gives drivers the misleading impression that they do not need to monitor the road while driving - even though the technology is not good enough to be self-driving".
Electric vehicle maker Tesla, which has been a focus of such criticism, sells what it calls a "full self-driving" feature - a beta version of an advanced driver-assist system currently being tested by hundreds of Tesla users - even though human supervision is still required at all times.
Recent Tesla controversies
Tesla has also been involved in several controversial episodes recently. It is currently under investigation by the US National Highway Traffic Safety Administration (NHTSA) after a series of crashes in which its cars, operating on the Autopilot system, struck stationary emergency services vehicles.
California prosecutors also filed two counts of vehicular manslaughter last week against the driver of a Tesla on Autopilot who ran a red light, slammed into another car, and killed two people in 2019.
The driver is thought to be the first person in the US charged with a felony over a fatal crash involving a partially automated driving system.
The Autopilot system is widely available to Tesla drivers and is different from the full self-driving beta system, which is currently only being tested on a few hundred Tesla users.
The UK, Scottish and Welsh governments will now decide whether to accept the commission’s recommendations and introduce legislation to bring them into effect.