Technology Law Analysis
April 11, 2018
Are we prepared for the driverless future yet?
This article was originally published in the 10th April 2018 edition of
In the wake of the Tempe car crash, road-safety advocates have now called for rules around autonomous vehicles to be tightened rather than loosened.
A self-driving Uber vehicle recently failed to avoid hitting a 49-year-old woman on a street in Tempe, Arizona, in a first-of-its-kind incident. The fatality comes at a critical juncture for the nascent industry and has caused some to question the pace at which the technology is advancing.
Policymakers, law enforcement agencies and the judiciary around the world need to brainstorm on how best to regulate and assign liability in such scenarios, as these incidents raise novel and complex issues without legal precedent. Germany has already attempted to clarify the rules by enacting a set of regulations in May 2017 requiring a person to remain present in an autonomous vehicle at all times and assigning liability to the manufacturer in case of an accident in autonomous mode.
AI & ETHICS
Autonomous vehicles will no doubt drive themselves, and do much more. They may accurately arrive at your house, ready for your morning commute. They may also detect which groceries you require (from your fridge, or even your glucose monitor) and autonomously go fetch some, unless you have already subscribed to a quicker drone-enabled grocery delivery system.
The attempts to fully automate such an otherwise lethal technology have given not only inventors but also regulators, academics and journalists much to ponder. The question receiving by far the most prominent discussion is the “trolley problem”, a longstanding ethical paradox: a runaway trolley is hurtling down a railway track towards five people who are tied up and unable to move, and you have the option of pulling a lever that diverts the trolley onto another track, where it will kill a single individual instead of the original five. The problem is one of choice. Do nothing, and the trolley kills the five people on the main track. Pull the lever, and it kills the one person on the side track. This dilemma gets to the heart of some of the oldest debates in moral philosophy.
Machines, however, do not introspect. A self-driving vehicle executes a “decision” in milliseconds, with no ethical deliberation in play; rather, the decision results from a set of pre-existing preferences installed by its coders. Policymakers will need to think about the road-based equivalent of this track-based trolley problem and decide whether, and how, to code “societal values” into autonomous vehicles.
In August 2017, the German government made it illegal to programme an autonomous vehicle with demographic preferences when it is faced with the prospect of causing injury. The vehicle may only take the action that does the least harm to people, and humans take precedence over property. The idea is to take the issues of choice and ethical dilemma out of the equation altogether by having the vehicle mathematically determine which decision would cause the least human damage.
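Read as a decision procedure, the German rule boils down to three constraints: no demographic attributes may be considered, harm to people is minimised first, and harm to property is only a tie-breaker. The sketch below is purely illustrative, with hypothetical names (Outcome, choose_manoeuvre) and an invented harm scale; it is not how any manufacturer or regulator actually encodes such rules.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class Outcome:
    """Predicted result of one candidate manoeuvre (illustrative only)."""
    name: str
    expected_human_injuries: float   # probability-weighted injury count
    expected_property_damage: float  # arbitrary monetary units
    # Deliberately no age, gender or other demographic fields,
    # mirroring the rule that such preferences may not be coded.


def choose_manoeuvre(options: List[Outcome]) -> Outcome:
    # Humans take precedence over property: compare expected human harm
    # first, and use property damage only to break ties.
    return min(
        options,
        key=lambda o: (o.expected_human_injuries, o.expected_property_damage),
    )


if __name__ == "__main__":
    candidates = [
        Outcome("brake hard in lane",
                expected_human_injuries=0.10, expected_property_damage=5_000.0),
        Outcome("swerve onto verge",
                expected_human_injuries=0.02, expected_property_damage=20_000.0),
    ]
    best = choose_manoeuvre(candidates)
    print(f"Selected manoeuvre: {best.name}")  # -> swerve onto verge
```

The point of the sketch is simply that, once the rule is written this way, the “ethical dilemma” collapses into an ordering of predicted outcomes; all of the contested judgment lies in how those predictions and the harm scale are defined.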
ARE DRIVERLESS CARS SAFER?
The fundamental reason autonomous vehicles have the backing of all major industry players is that they hold the key to a better, safer and cleaner transportation ecosystem and a better human experience. While, statistically, driverless cars appear to be safer than ordinary cars, one must keep in mind that crash statistics for human-driven cars are compiled from all sorts of driving situations, on all types of roads and in all weather conditions. Much of the data on self-driving cars’ safety, by contrast, comes only from the sunny western states of the US, recorded on unidirectional, multi-lane highways. With time, data on fully automated systems will naturally expand to cover more roads, terrains and geographies. Until then, statistics on autonomous vehicles will need to be taken with a pinch of salt.
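The like-for-like problem can be shown with simple arithmetic. The figures below are invented purely to demonstrate the mechanics: a fleet driven mostly in easy conditions can post a lower headline crash rate even though, condition by condition, its advantage is smaller or even reversed.

```python
# Invented figures: (miles driven in millions, number of crashes), by condition.
human_driven = {
    "clear highway": (600.0, 900),
    "rain / complex urban": (400.0, 1400),
}
autonomous = {
    "clear highway": (9.5, 12),
    "rain / complex urban": (0.5, 2),
}


def headline_rate(fleet):
    """Crashes per million miles, pooling all conditions together."""
    miles = sum(m for m, _ in fleet.values())
    crashes = sum(c for _, c in fleet.values())
    return crashes / miles


print(f"Headline human rate:      {headline_rate(human_driven):.2f}")   # ~2.30
print(f"Headline autonomous rate: {headline_rate(autonomous):.2f}")     # ~1.40

# A condition-by-condition comparison tells a more nuanced story.
for condition in human_driven:
    h_miles, h_crashes = human_driven[condition]
    a_miles, a_crashes = autonomous[condition]
    print(f"{condition}: human {h_crashes / h_miles:.2f} "
          f"vs autonomous {a_crashes / a_miles:.2f} per million miles")
```

With these made-up numbers the autonomous fleet looks 40% safer overall, yet in the rain and complex urban slice it actually performs worse; the headline advantage comes largely from where the miles were driven, which is precisely why the raw statistics deserve caution.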
Many cities and states in the US permit the testing of autonomous vehicles on public roads, with varying degrees of licensing and regulation. Boston, for example, requires such vehicles to pass a driving test in a limited area before heading out into the wider city. California requires companies testing autonomous vehicles to provide annual safety reports. Arizona has been a particularly attractive environment for autonomous vehicle makers, as its streets are laid out in regular grids, the weather is reliably dry and warm, and its regulators have been unusually welcoming. Legislation currently in the works in Washington, DC, proposes to exempt autonomous vehicles from certain existing safety standards. However, in the wake of the Tempe car crash, road-safety advocates have now called for rules around autonomous vehicles to be tightened rather than loosened.
THE ROAD AHEAD
The extent to which the Tempe car crash will change attitudes towards autonomous vehicles, or influence the regulation of the industry, depends to a large extent on the findings of the various ongoing investigations. That said, while it is important that regulation does not discourage or obstruct technological advancement, such advancement should not come at the cost of public safety. It is true that self-driving cars do not get tired, angry, frustrated or drunk, but neither can they yet react to uncertain and ambiguous situations with the skill or anticipation of an attentive and experienced human driver. Perhaps, then, the two still need to work together until the technology is rendered seamless and foolproof.