It’s report card time for the automakers and Silicon Valley denizens studying the tricky problem of making cars drive themselves, and everyone is passing.
The California DMV just released its annual slate of “disengagement reports,” documents provided by the 11 companies that received state permits to test autonomous vehicles by the end of 2015. The results, summarized below, reveal how often humans had to wrest control away from the computer, and why (sort of).
Although the reports are an imperfect measure of how the technology performs, they do reveal rapid progress toward the day when you are no longer needed behind the wheel. Google and General Motors are leading the class with cars capable of driving hundreds of miles at a stretch without trouble. But even those who don’t make the honor roll show impressive gains. Nissan’s robocars, for example, needed human intervention once every 247 miles, compared to once every 14 miles in 2015.
The reports, which cover December 2015 to November 2016, leave a lot to be desired—more on that in a moment—but do offer interesting insights. Google’s program, now called Waymo and gearing up for commercial applications, continues outpacing the competition. The company’s cars drove 636,000 miles with just 124 disengagements, a 19 percent drop from 2015. Its fleet logged nearly all those miles on the quiet, suburban streets of Mountain View and its environs, and most of the interventions followed hardware or software discrepancies, when, say, the car’s lidar and camera reported slightly different data.
Cruise, the startup leading GM’s autonomous driving efforts, did all its testing in San Francisco, where it ramped up from five miles in June 2015 to 400 in June 2016. By late last year, it was clocking hundreds of miles without a hitch.
Most of Delphi’s trouble came while changing lanes in heavy traffic. Ford’s two autonomous cars in California drive only on the highway, during the day, in fine weather and road conditions, which explains why it needed human help only three times in 590 miles. (It has a larger test fleet in Michigan, which doesn’t require any reporting.) The little testing Tesla reported (just 550 miles) was part of its preparation to launch a revamped Autopilot system. It gets most of its data from its customers, driving in the real world.
Like the state’s requirement that companies publicly report any crashes involving their robo-rides, the point of these reports is to create accountability for the new technology. In a field reliant upon complex software, artificial intelligence, and sophisticated hardware, disengagements provide a metric the average person can understand. They reveal how often the car screws up so badly that the human inside has to take over.
But, like crash reports (which mostly reveal that people cannot stop rear-ending Google cars), they don’t tell the whole story.
“Disengagements are not a scientific measure of the complexity and operating characteristics of these vehicles,” says Bryan Reimer, who studies autonomous driving at MIT. “They’re just one very interesting data point.”
What makes them interesting is that they provide a picture of how each autonomous program is developing. Companies fess up to how many miles they’ve covered on public roads, and a careful reader can extrapolate general info about where each company is testing, and in what conditions.
They’re unscientific because each disengagement involves all sorts of variables, which the reports log inconsistently. They don’t reveal the impact of weather, or where exactly these “problems” occurred. They don’t note if the cars are following detailed maps, or exploring an area for the first time. They don’t account for the proclivities of human operators, who likely have different thresholds for when they’ll take over. (Blame California law, which requires that the report include certain details but doesn’t specify how they should be presented, or in what context.)
Automation experts warn that these shortcomings, and the fact that each report offers its own unique blend of information, mean you can’t fairly use them to compare one company’s progress against another. A spike in disengagements might indicate a major problem—or reveal that a company is experimenting in more challenging situations.
These reports would be more helpful if they were more rigorous, says Raj Rajkumar, who studies autonomous technology at Carnegie Mellon University. Qualitative comments would add context. For example, an automaker could explain that its disengagements are up because it is exploring city driving for the first time. Even consistent formatting and defined terms would help. The 11 reports come in 11 formats, and teem with vague language like “technology evaluation management” (Mercedes-Benz), “follower output invalid” (Tesla), and “planned test to address planning” (Cruise/General Motors).
However imperfect, the reports are better than nothing, especially because other states with self-driving regulations—including Nevada, Michigan, and Florida—don’t demand any public disclosures. (That’s why no one knows much about Uber’s testing.) California’s law may be flawed, but it keeps the public, and public officials, in the loop, and should help build trust as these systems move toward commercial use.
Plus, these are early days, says Huei Peng, who directs the University of Michigan’s Mobility Transformation Center. California could update its requirements, and other states following its lead could improve on what it has started. Until then, here’s a look at the hard, squidgy numbers.