New reports reveal self-driving cars struggling with software, highway overpasses, and even the sun

A group of "disengagement reports" from eight companies testing autonomous vehicles in California uncovers many different reasons why humans are still needed to grab the wheel.

General Motors vehicles overshooting stop signs. A Waymo self-driving car nearly sideswiping another vehicle while making a left turn. Nissan’s on-board operating system crashing and rebooting while in motion.

Those are just a few of the examples of the supplemental reports filed to the California Department of Motor Vehicles by a handful of companies developing autonomous vehicles. In those instances, human drivers were able to take control of the cars before collisions with other vehicles could occur, but the reports reflect just how far self-driving technology still has to go before it’s completely road-safe. Of the roughly 50 companies testing self-driving cars in the state, California asked eight to file supplements to their 2017 “disengagement reports” — documents in which the firms are required to disclose incidents in which their cars’ autonomous functions stop and a real driver takes the wheel.

The reports, which were first covered Tuesday by the San Jose Mercury News, reveal several nagging problems that recur across autonomous-vehicle developers. Several companies' cars recorded difficulties making turns, software glitches, and failures to identify stop signs and traffic signals.

The companies that filed supplemental reports include Baidu, Aptiv, GM Cruise, Nissan, Telenav, Waymo, and Zoox. In its new filing, Cruise admitted its cars sometimes failed to take into account the movements of cars in other lanes. But Cruise also said its autonomous cars sometimes failed because they could not anticipate errant driving by humans using the roads. Aptiv, an auto-parts maker formerly known as Delphi, also blamed some of its shortfalls on bad human operators. “For instance,” it wrote in its supplemental report, “another vehicle may have driven in the wrong direction down a one-way street.”

Nissan, which is adapting its own vehicle models for self-driving operation, reported that new lines of code being added to its cars’ operating systems sometimes led to a “software crash.” When that happened, the company said, the human inside the car took over while the system rebooted. Waymo, a division of Google parent company Alphabet, also reported having software bugaboos.

“There are thousands of checks on the vehicle continually running,” Waymo’s report stated. “If such diagnostics cause an error message the driver would get a message to take control of the vehicle.”

And Telenav, a satellite-navigation company that’s developing self-driving systems, reported a slew of problems. Its cars had trouble observing the “three-second rule” — the notion that a car should leave a gap behind the vehicle ahead of it at least as long as the distance it travels in three seconds. In one instance, a Telenav vehicle entered a busy highway traveling at about 63 miles per hour, indicating it should’ve left about 276 feet of space behind the leading vehicle. But when the cars were about 295 feet apart, the human riding in Telenav’s car noticed it wasn’t braking and would soon be following much closer than it should have been. (The operator was able to take control and tap the brakes.)
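The figures in the Telenav incident check out with a little unit conversion. A minimal sketch of that arithmetic, using the speed from the report (the function name and structure are illustrative, not anything from Telenav's actual software):

```python
# The "three-second rule": the minimum safe gap is the distance
# the car covers in three seconds at its current speed.

MPH_TO_FPS = 5280 / 3600  # one mile per hour in feet per second

def min_following_gap_ft(speed_mph: float, seconds: float = 3.0) -> float:
    """Distance in feet traveled in `seconds` at `speed_mph`."""
    return speed_mph * MPH_TO_FPS * seconds

# At 63 mph the car covers about 92.4 feet per second,
# so the three-second gap is roughly 277 feet -- matching
# the "about 276 feet" cited in the report.
print(round(min_following_gap_ft(63), 1))
```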

Telenav’s cars also experienced difficulty in measuring distances in parking garages and recorded incidents in which its cars stopped suddenly after mistaking overpasses for idling vehicles.

The supplemental reports come at a time when many states are reconsidering how much autonomous vehicle testing they allow on public roads. Arizona suspended road testing in March after one of Uber’s self-driving cars fatally struck a pedestrian crossing a street in Tempe. California, though, recently revised its regulations to allow more types of self-driving cars to be tested, though some of those — such as ones that do not have any steering wheel or pedals installed — are still theoretical.

Still, the eight companies’ reports show that even at their current level of development, self-driving cars still have a ways to go before they’ve mastered environments in which they have to deal with traffic, signs and signals, bikes and pedestrians and, in at least one case, the sun. Aptiv, in its supplemental filing, reported at least one instance of a car disengaging because of a “poor sun condition” — when the sun’s position in the sky oversaturated the car’s imaging system and caused it to misread the color of a traffic light.
