11 Feb, 2016 05:11

Google car AI qualifies as a ‘driver,’ US regulator says

A US traffic regulator has said that the artificial intelligence controlling Alphabet Inc’s Google self-piloted car can be considered a driver just like a human.

In a recently revealed letter, the National Highway Traffic Safety Administration (NHTSA) stated that it “will interpret ‘driver’ in the context of Google’s described motor vehicle design as referring to the SDS [self-driving system] and not to any of the vehicle occupants.”

The acknowledgement is a big boost toward getting self-driving systems on the road. The director of Google’s self-driving car project said the agency’s decision “will have major impact” on the technology’s development, according to a November letter reviewed by Reuters on Wednesday.

This statement, however, is not an official announcement, and it does not mean that Google’s cars will be driving around freely anytime soon. The NHTSA warns that Google may face many other problems related to existing regulations.

“NHTSA would need to commence a rulemaking to consider how FMVSS [Federal Motor Vehicle Safety Standards] No. 135 might be amended in response to ‘changed circumstances’ in order to ensure that automated vehicle designs like Google’s… have a way to comply with the standard,” the letter reads.

Another obstacle the company’s cars will have to overcome is technical: ensuring the safety of passengers on the road and preventing the AI system from being hacked. The company’s autonomous cars have been involved in 17 accidents over the 2 million miles they have driven, a University of Michigan study found.

One of the issues with self-driving vehicles is that they are programmed to follow the rules of the road, while human drivers are often more careless. This discrepancy is said to explain the higher accident rate for driverless vehicles.

A study released last year by the University of Michigan’s Transportation Research Institute found that self-driving cars were at least twice as likely to get into an accident on public roadways as conventional cars.

Yet, in nearly every case considered in the study, driverless cars were not to blame for collisions with conventional cars.

Google said its self-driving cars currently operate cautiously, but it wants them to become more “aggressive,” like human drivers, so that they “naturally fit into the traffic flow, and other people understand what we’re doing and why we’re doing it,” Dmitry Dolgov, the top engineer of Google’s self-driving car operations, told Bloomberg.

Previously, the California Department of Motor Vehicles had issued draft regulations for self-driving vehicles, which Google criticized as a setback for the technology. Besides requiring that multiple safety tests be passed, the December 2015 proposal specifically called for a human driver with an “autonomous vehicle operator certificate,” who would be responsible for accidents or traffic law violations. It also stipulated that a conventional steering wheel and pedals be installed, even though these are redundant in autonomous vehicles.


In one incident last year, a Google self-driving car was pulled over by police near the company’s Mountain View, California headquarters because it was driving too slowly.

“As the officer approached the slow moving car he realized it was a Google Autonomous Vehicle,” the Mountain View Police Department (MVPD) said in a statement. “The officer stopped the car and made contact with the operators to learn more about how the car was choosing speeds along certain roadways and to educate the operators about impeding traffic per 22400(a) of the California Vehicle Code.”

Ninety-four percent of all accidents are caused by human error, and Google believes that self-driving vehicles can significantly reduce that number.
