The problem, says Bigelow, is that it's apples and oranges: "One company's low number of disengagements may occur during testing on empty highways, while another company's high number may have occurred during testing in busy urban areas."
Also, comparisons are misleading "because some companies place more value on testing in real-world scenarios while others put more emphasis on simulation, and sometimes engineers might be purposely disengaging to validate their systems."
Here's a look at the latest disengagement reports, which cover late 2016 through late 2017. There are 20 companies listed, though some offered no new data. Honda, for example, did not test any vehicles on California roads in 2017, so it had no disengagements to report.
The reports are difficult to parse for several reasons. Different companies are testing different models of autonomous vehicles and over different time periods. So it's hard to compare one outfit with another. Baidu, for example, tested four models. It had a total of 48 disengagements during the 1,971 miles those four cars drove between October 2016 and November 2017.
But compare that with, say, Alphabet's Waymo, which reported 63 incidents over the same time period. While Baidu's cars logged 1,971 miles, Waymo's vehicles covered a whopping 352,545 miles in autonomous mode. How in the world can you accurately compare these two operators' performance?
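One crude way to put those two figures on the same footing is to normalize by miles driven, say, disengagements per 1,000 autonomous miles. The sketch below does just that with the numbers quoted above; note that, as Bigelow points out, even a normalized rate still ignores differences in where and how the miles were driven, so this is illustrative arithmetic, not a fair ranking.

```python
# Rough comparison of disengagement rates using the figures cited above
# (2016-2017 California DMV reports). Per-1,000-mile rates are a
# simplistic normalization: they ignore test conditions entirely.
reports = {
    "Baidu": {"disengagements": 48, "miles": 1971},
    "Waymo": {"disengagements": 63, "miles": 352545},
}

for company, r in reports.items():
    rate = r["disengagements"] / r["miles"] * 1000
    print(f"{company}: {rate:.2f} disengagements per 1,000 miles")
```

By this measure Baidu's rate works out to roughly 24 disengagements per 1,000 miles and Waymo's to under 0.2, a two-orders-of-magnitude gap that the raw incident counts (48 vs. 63) completely obscure.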
For the casual reader, the reports are perhaps most interesting for the reasons given behind the disengagements. These details offer an intriguing look at this cutting-edge technology and at the sorts of challenges the engineers behind it are facing.
Consider these recurring issues experienced by many of these companies' vehicles:
--Disengage for a recklessly behaving road user
--Disengage for hardware discrepancy
--Disengage for unwanted maneuver of the vehicle