The News
Saturday, October 5, 2024

U.S. Updates Self-Driving Car Guidelines 

In this May 13, 2015 file photo, Google's self-driving Lexus car drives along a street during a demonstration at the Google campus in Mountain View, California. Photo: AP/Tony Avelar, File
The new guidelines encourage companies to have processes in place for broad safety goals, such as making sure drivers are paying attention while using advanced assist systems

ANN ARBOR – The Trump administration is updating safety guidelines for self-driving cars in an attempt to clear barriers for automakers and tech companies that want to get test vehicles on the road.

U.S. Transportation Secretary Elaine Chao announced the new voluntary guidelines Tuesday during a visit to an autonomous vehicle testing facility at the University of Michigan.

The new guidelines update policies issued last fall by the Obama administration, which were also largely voluntary. Under Obama, automakers were asked to follow a 15-point safety assessment before putting test vehicles on the road. The new guidelines reduce that to a 12-point voluntary assessment, and no longer require automakers to consider ethical or privacy issues.

The guidelines also make clear that the federal government, not the states, determines whether autonomous vehicles are safe. That is the same guidance the Obama administration gave.

Chao emphasized that the guidelines aren’t meant to force automakers to use certain technology or meet stringent requirements; instead, they’re designed to clarify what autonomous vehicle developers should be considering before they put test cars on the road.

“This is a guidance document,” Chao said. “We want to make sure those who are involved understand how important safety is. We also want to ensure that the innovation and the creativity of our country remain.”

But critics say the voluntary nature of the guidelines gives the government no authority to prevent dangerous experimental vehicles.

“This isn’t a vision for safety,” said John M. Simpson, head of privacy for a nonprofit progressive group called Consumer Watchdog. “It’s a roadmap that allows manufacturers to do whatever they want, wherever and whenever they want, turning our roads into private laboratories for robot cars with no regard for our safety.”

Regulators and lawmakers have been struggling to keep up with the pace of self-driving technology. They are wary of burdening automakers and tech companies with regulations that would slow innovation, but they need to ensure that the vehicles are safely deployed. There are no fully self-driving vehicles for sale, but autonomous cars with backup drivers are being tested in numerous states, including California, Nevada and Pennsylvania.

Autonomous vehicle developers, including automakers and tech companies like Google and Uber, say autonomous vehicles could dramatically reduce crashes but complain that the patchwork of state laws passed in recent years could hamper their deployment. Early estimates indicate there were more than 40,000 traffic fatalities in the U.S. last year; the government says 94 percent of crashes involve human error.

But safety advocates say that experimental cars could get on public roads too soon, and accidents could undermine public acceptance of the technology.

The new guidelines encourage companies to have processes in place for broad safety goals, such as making sure drivers are paying attention while using advanced assist systems. The systems are expected to detect and respond to people and objects both in and out of the vehicle's travel path, "including pedestrians, bicyclists, animals, and objects that could affect safe operation of the vehicle," the guidelines say.

Chao said the guidelines will be updated again next year.

“The technology in this field is accelerating at a much faster pace than I think many people expected,” she said. “We want to make sure stakeholders who are developing this have the best information.”

Chao’s appearance came at a time of increased government focus on highly automated cars.

Earlier Tuesday, the National Transportation Safety Board was debating whether Tesla Inc.’s partially self-driving Autopilot system shared the blame for the 2016 death of a driver in Florida. The board ultimately said the driver’s inattention and a truck driver who made a left-hand turn in front of the Tesla were at fault for the crash, but it said automakers should incorporate safeguards that limit the use of automated vehicle control systems so drivers don’t rely on them too much.

Last week, the U.S. House voted to give the federal government the authority to exempt automakers from safety standards that don’t apply to the technology. If a company can prove it can make a safe vehicle with no steering wheel, for example, the federal government could approve that. The bill permits the deployment of up to 25,000 vehicles in its first year and 100,000 annually after that.

The Senate is now considering a similar bill.

DEE-ANN DURBIN