One of my favorite topics, and technologies, in today’s wireless world is autonomous vehicles. If one listens to Elon Musk, Tesla Motors Inc.’s CEO (which, btw, is changing its name to just Tesla – guess they want to be more than just a car company), self-driving vehicles have arrived. And, even though a driver (passenger?) was killed in one of their cars a while ago, Tesla said it was just an isolated case and that the software has been updated to be more effective.
I have long argued that a truly self-driving automobile requires more than just an array of onboard sensors and the ability to assess its surroundings from them. It has to be a two-way street – the vehicle needs not just to look at what is going on around it, but also to receive input from the elements around it. And here is one good reason why we can never rely on the vehicle alone being smart.
IEEE presented a very eye-opening perspective on self-driving car technology in a recent disengagement report, which details every time a human safety driver had to quickly take control of the car, either due to a hardware or software failure or because the driver suspected a problem. The report, based on self-driving cars tested in California, revealed 2,578 instances where there was a malfunction in a self-driving car’s technology.
The report includes nine companies across which the disengagements occurred (see below).
Google, to no one’s surprise, has the most total miles logged. It also has the fewest disengagements per mile. At first glance, two disengagements per 10K miles doesn’t seem bad. While sources vary some, the average Californian drives almost 14K miles per year, so a malfunction would occur every four or so months, giving a per-mile reliability of roughly 99.98 percent – about three nines (2 per 10K miles is 0.0002 malfunctions per mile; 1 − 0.0002 = 0.9998). But at the other end of the scale, Bosch logged about 1.4 disengagements per mile. The second-place contender came in at 0.00157 per mile, nearly eight times as often as Google, and the same math works out to only about 99.84 percent. And if one looks at it from a worst-case scenario (which, of course, isn’t the case here), where each disengagement could potentially cause a death, then a few nines really isn’t good enough, especially because there isn’t any backup.
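The back-of-envelope math above can be sketched in a few lines. The figures here are the article’s illustrative numbers (2 disengagements per 10K miles for the leader, 0.00157 per mile for the runner-up, ~14K miles driven per year), not exact values from the underlying report:

```python
# Back-of-envelope reliability math for the disengagement figures quoted above.
# All rates are illustrative numbers from the article, not exact report values.

MILES_PER_YEAR = 14_000  # rough annual mileage for a California driver


def per_mile_reliability(rate_per_mile: float) -> float:
    """Per-mile 'reliability' as a percentage, given disengagements per mile."""
    return 100 * (1 - rate_per_mile)


leader = 2 / 10_000    # ~2 disengagements per 10K miles -> 0.0002 per mile
runner_up = 0.00157    # second-best rate, per mile

print(f"Leader: {per_mile_reliability(leader):.2f}% per mile")      # ~99.98%
print(f"Runner-up: {per_mile_reliability(runner_up):.2f}% per mile")
print(f"Leader disengagements/year: {MILES_PER_YEAR * leader:.1f}")  # ~2.8
```

Note the unit trap: a rate of 0.0002 per mile must be converted to a fraction before taking the complement; subtracting it straight from 100 would overstate the reliability by two orders of magnitude.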
By backup I mean backup as in computing. And there, people would laugh at five nines. Nowadays, computer chips run for years at GHz speeds and rarely hiccup. That is, like, two gazillion gazillion gazillion nines. And even when a failure occurs, there is usually a backup (at least among the smart set). Yet even computer errors are fatal from time to time.
But here is the best part. Just like in government, disengagements are defined by whoever does the defining.
For instance, Google’s Waymo (its new moniker for its self-driving car effort) does not count every single time the driver grabs the wheel to take over from the robotic chauffeur. That, according to Google, happens many thousands of times annually. Google only reports disengagements where the car would have done something unsafe. It calculates that if its drivers had taken no action at all, nine disengagements in 2016 would have led to the car hitting an obstacle or another road user, potentially ending in a fatality.
And leave it to Tesla to file the most mysterious disengagement report. In 2015, the company reported no disengagements at all, suggesting that it either carried out no public testing in California or that its cars were flawless. Its 2016 report shows 182 disengagements in just 550 miles of autonomous driving.
However, all but a handful of those disengagements happened in just four cars over the course of a single long weekend in October, possibly during the filming of a promotional video. Who knows what the real Tesla numbers are? But so far, Tesla is the only company with a fatality – even if the driver should have been more aware.
And not all states require a testing permit, which is what produces these somewhat standardized reports. States like Arizona and Pennsylvania do not require companies to report disengagements or failures at all.
I’m sticking to my guns. Autonomous vehicles require as foolproof an infrastructure as possible. Yes, there will always be some “disengagements.” But until both the vehicle and the infrastructure become smart and joined at the hip, limitations will be required. When the reliability numbers hit gazillions of nines, then MAYBE I’ll take a nap behind the wheel.