19 Dec, 2015 02:29

Self-driving cars get into accidents because of humans

Self-driving cars are programmed to follow the rules of the road, whereas human drivers often bend them. This discrepancy has contributed to a high accident rate for driverless vehicles, prompting California to demand that they have a steering wheel and a human driver at the ready.

Questions over the safety of self-driving cars, a rapidly developing technology led by the likes of Google, have pushed California to issue state rules for autonomous automobiles that may put the brakes on the driverless revolution. This week, regulators in the Golden State demanded that self-driving vehicles pass multiple safety tests and that users of driverless cars be trained and certified by the manufacturer. In addition, those users would receive a special designation on their driver's license.

"Given the potential risks associated with deployment of such a new technology, (California Department of Motor Vehicles) believes that manufacturers need to obtain more experience in testing driverless vehicles on public roads prior to making this technology available to the general public," the agency said of the new regulations.

The federal government, via the US Department of Transportation, has already issued unofficial guidance on self-driving cars, likewise requiring that a human be ready to take the wheel should the need arise.

The cautious approach taken by government regulators towards autonomous vehicles comes amid research showing self-driving cars are not ready for everyday use on US roads for a variety of reasons, including vulnerability to hacking and fuel usage. Furthermore, though rarely at fault, self-driving cars were found to be at least twice as likely as regular cars to get into an accident on public roadways, according to a recent study by the University of Michigan's Transportation Research Institute.

In nearly every case analyzed in the study, driverless cars were not to blame for collisions with conventional cars. Yet, paradoxically, self-driving cars could be considered the reason for the accidents, given that they are programmed to drive conservatively and strictly follow all traffic laws. The reality of driving often means human operators must bend the rules to avoid accidents or maintain the flow of traffic. Self-driving cars are not designed to push those boundaries, a limitation borne out by the collision data reported in the study.

“It’s a constant debate inside our group,” Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab, told Bloomberg, describing his group's deliberations over how self-driving cars should be programmed. “And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people.”

Rajkumar said this conundrum played out when his group offered test drives for members of Congress amid Washington, DC traffic. A self-driving Cadillac SRX had to merge into busy highway traffic, which meant crossing three busy lanes within 150 yards. The car had difficulty maneuvering in the situation, unsure whether it could trust other drivers to let it cross. A human operator had to take control to complete the merge.

“We end up being cautious,” Rajkumar said. “We don’t want to get into an accident because that would be front-page news. People expect more of autonomous cars.”

Meanwhile, given the state's size and influence, California's new rules may put a major pause on the timetable for public release of self-driving cars. The state said it wants to protect consumer safety while not stifling technological development that, ultimately, could save lives on the road.

“The primary focus of the deployment regulations is the safety of autonomous vehicles and the safety of the public who will share the road with these vehicles,” said California DMV Director Jean Shiomoto. “We want to get public input on these draft regulations before we initiate the formal regulatory rule making process.”

Key aspects of the state's initial guidance demand that:

- manufacturers be certified for "specific autonomous vehicle safety and performance requirements";
- all self-driving cars be verified for safety by both the manufacturer and an independent certifier;
- operators be certified and licensed to take control of an autonomous vehicle;
- manufacturers earn a three-year permit "which will require them to regularly report on the performance, safety, and usage of autonomous vehicles";
- manufacturers disclose to operators what information they are collecting beyond safety data;
- cars "be equipped with self-diagnostic capabilities that detect and respond to cyber-attacks or other unauthorized intrusions."

The rules were crafted after more than a year of consideration. They will require a lengthy review period before final regulations are set in 2016, the Associated Press reported.

Chris Urmson, an official with Google's self-driving car operations, slammed the new proposal as a step backwards that will likely inhibit innovation, particularly the state's requirement that self-driving cars always carry a licensed driver.

"This maintains the same old status quo and falls short on allowing this technology to reach its full potential, while excluding those who need to get around but cannot drive," Urmson wrote for Medium. "While we’re disappointed by this, we will continue to work with the DMV as they seek feedback in the coming months, in the hope that we can recapture the original spirit of the bill."

California Lieutenant Governor Gavin Newsom echoed these sentiments in a statement, saying the draft rules "may prove too onerous, create road blocks to innovation, and may ultimately drive the development of this promising industry to other states."

Google did not comment to AP about the state's new rules, but it has generally pushed for quicker public adoption of self-driving cars. Ron Medford, chief of Google's self-driving car operations – which is planned to be a standalone business within the Alphabet Inc. family in 2016 – has said "the bug" in self-driving cars is the human operator and that human control should be as limited as possible. Medford said in September that Google's technology is "close to working pretty damn well."

Google said its self-driving cars currently operate cautiously, but it wants them to become more "aggressive," like human drivers, so that they "naturally fit into the traffic flow, and other people understand what we’re doing and why we’re doing it," Dmitri Dolgov, top engineer of Google's self-driving car operations, told Bloomberg.

The company's autonomous cars have been involved in 17 accidents in the 2 million miles they have driven, according to the University of Michigan study. Accidents involving driverless cars in California must be reported to the authorities.
