Most, if not all, of us have by now heard of the fatal Tesla crash a couple of weeks ago in Texas. This is not the first time Tesla has been in the news over its self-driving technology, nor the first time it has had fatalities. So, I thought I would take the liberty of using a shocking headline, as much of the reporting media has done. However, before I go any further, know that I am putting blame on both Tesla and the occupants.
If one does a data search around Tesla accidents, one will find that Tesla vehicles were involved in 50 fatal crashes in 2019. Of those, the National Highway Traffic Safety Administration (NHTSA) is investigating 28, with 14 focusing on Autopilot. And, of course, this one is also being investigated as an Autopilot failure, at least if one believes the sensationalist headlines around the accident.
But these headlines miss the point. Whether or not Tesla’s Autopilot technology was fully engaged, or even available, is secondary to the poor judgment of the individuals involved. I have a hard time understanding how anyone is foolish or ignorant enough to vacate the driver’s seat while the vehicle is in motion, regardless of how advanced the vehicle is.
It is not as though information about Tesla’s self-driving mode is sketchy. Tesla makes it very plain that someone should always be ready to take the controls and that the vehicle should never be trusted to run without supervision. The Tesla owner’s manual describes extensively when, where, and how the features should be used. It is made painfully clear that the driver must be fully attentive and holding the steering wheel, remaining “mindful of road conditions and surrounding traffic.”
It states further that the features should not be used on city streets or near construction zones, bicyclists, or pedestrians. It is even on the website: “The currently enabled features require active driver supervision and do not make the vehicle autonomous.” This one is on the individuals, not Tesla, for the most part.
However, Tesla, and Elon Musk in particular, are not lily-white in their honesty about the Autopilot mode either. All else aside, there should be a failsafe that prevents the car from moving if there is no one in the driver’s seat, period. And if someone vacates the seat while the car is in any self-driving mode, it should immediately pull over and shut off. And while we are at it, notify the authorities as well.
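To make that concrete, here is a minimal sketch of the kind of occupancy failsafe I am describing. Every name in it (driver_seat_occupied, self_driving_engaged, and so on) is invented for illustration; this is emphatically not Tesla’s software, just the decision logic I would want to see somewhere inside it.

```python
from enum import Enum, auto

class Action(Enum):
    PROCEED = auto()             # no intervention needed
    INHIBIT_MOTION = auto()      # refuse to move from a standstill
    PULL_OVER_AND_STOP = auto()  # safe stop, shut off, notify authorities

def occupancy_failsafe(driver_seat_occupied: bool,
                       vehicle_moving: bool,
                       self_driving_engaged: bool) -> Action:
    """Evaluate once per control cycle against the seat-sensor input.
    All inputs are hypothetical sensor readings, not a real vehicle API."""
    # Rule 1: the car never starts moving with no one in the driver's seat.
    if not driver_seat_occupied and not vehicle_moving:
        return Action.INHIBIT_MOTION
    # Rule 2: vacating the seat in any self-driving mode triggers an
    # immediate pull-over and shutdown (and, per the column, a call to
    # the authorities).
    if not driver_seat_occupied and self_driving_engaged:
        return Action.PULL_OVER_AND_STOP
    return Action.PROCEED

# Example: occupants climb out of the driver's seat with Autopilot on.
assert occupancy_failsafe(False, True, True) is Action.PULL_OVER_AND_STOP
```

A production system would obviously need more than this, debounced seat sensors and hands-on-wheel checks among other things, but even a crude gate like this would have kept the Texas car parked.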
What makes this murky is that Musk has been, shall we say, rather robust in his preaching of Tesla’s Full Self-Driving Capability (FSD) mode. For example, he was recently quoted as saying, “I think autopilot’s getting good enough that you will not need to drive most of the time unless you really want to.” Another remark along the same lines was Musk’s promise, back in 2019, that Tesla would have one million robotaxis on the road by the end of 2020. But in the fall of 2020, the company noted that full self-driving would “remain largely unchanged in the future,” and that FSD would remain an “advanced driver-assistance feature” rather than an autonomous one.
So, there is that still-yawning gap between Tesla’s marketing of its technology and the technology’s true capabilities. And, like most marketing hyperbole, Tesla’s FSD does not live up to the hype. In fact, Tesla’s associate general counsel, Eric Williams, said in a statement to the California Department of Motor Vehicles that the features amount to only Level 2 autonomy, meaning the driver must supervise at all times.
For the longest time, I have been skeptical of claims that fully autonomous vehicles can exist today anywhere other than contained areas. This incident just adds more evidence that we are not as far along the road as much of the hype claims. In fact, the most advanced autonomous vehicles out there are Level 3, maybe Level 3.5, on a SAE scale where Level 5 is full autonomy.
This accident points out some concerning issues that seem to be systemic. The first is how this vehicle could even get on the road without someone in the driver’s seat, with or without the full self-driving package (which, by the way, adds $10,000 to the vehicle’s price).
The second is how anyone can buy one of these vehicles without being fully vetted in just what FSD is capable of. The premise that purchasers and potential drivers are going to read the manual before they drive is presumptuous. The Autopilot mode cannot be left to the driver to decipher.
The third is that, as with 5G, the autonomous vehicle industry is hyping the platform rather than being entirely honest about what it can do and where we are in its development.
There certainly is evidence that vehicles with various levels of autonomy are safer than those without. Tesla’s marketing claims that public data suggests its vehicles are safer than the average vehicle. In fact, just before the fatal Texas crash, Musk said that a Tesla with Autopilot engaged is nearly 10 times less likely to crash than the average vehicle, a comparison drawn against crash data measured by the feds.
However, back to the marketing perspective. While the data may be accurate, the conditions under which Tesla gathered it skew the comparison. The Tesla figure covers highway driving only, which is where Autopilot is typically engaged; the federal figure covers all kinds of driving conditions. So, Tesla picks and chooses certain conditions from the whole data set to make its numbers seem more meaningful than they really are.
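To see how that kind of cherry-picking inflates a safety claim, here is some back-of-the-envelope arithmetic. The crash rates and mileage split below are invented purely for illustration; they are not Tesla’s or NHTSA’s numbers.

```python
# Illustrative arithmetic only: the rates and mileage split below are
# made up to show the selection effect, not Tesla's or NHTSA's figures.

# Assume highway miles are inherently much safer per mile than city miles.
city_rate = 4.0      # crashes per million city miles (made up)
highway_rate = 0.5   # crashes per million highway miles (made up)

# A federal-style, all-roads average blends both kinds of driving.
city_share, highway_share = 0.4, 0.6
all_roads_rate = city_share * city_rate + highway_share * highway_rate

# A driver-assist system engaged only on highways inherits the low
# highway rate even if it adds no safety benefit whatsoever.
assist_rate = highway_rate

print(f"all-roads average: {all_roads_rate:.2f} crashes per million miles")
print(f"highway-only system: {assist_rate:.2f} crashes per million miles")
print(f"apparent advantage: {all_roads_rate / assist_rate:.1f}x 'safer'")
```

Under these made-up numbers, the highway-only system looks nearly four times “safer” while contributing nothing at all; an honest comparison would weigh highway miles against highway miles.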
Back to the Texas accident for a moment. One of my concerns is that, according to Musk, Autopilot was not engaged at the time of the accident; nor did the vehicle have the FSD package. So how was it that the occupants were able to vacate the driver’s seat while the vehicle continued to move? Other questions arise as well: did the vehicle continue to drive itself, or did it immediately run off the road? And what was going on inside the cabin? Were the occupants partying or just testing the car’s limits? It will be interesting to see the forensic report once the investigation is completed.
That brings me back to my earlier question: what were these occupants thinking? It also brings me back to the point of how the vehicle could even operate without a driver in the driver’s seat. So, the real issue becomes that any type of self-driving vehicle needs to babysit its occupants, not only for their own safety but for that of other drivers and vehicles as well. In this case, fortunately, the occupants were the only victims. One cannot ignore the recklessness of some drivers, and it needs to be dealt with in such a way that when this kind of behavior occurs, the vehicle isolates it and prevents any peripheral damage.
That opens up a whole new Pandora’s box, with the primary concern being the monitoring of occupants, which tangentially puts a bullseye right on the most pressing issue beyond safety: privacy and security.
I will tantalize my readers with this in closing. On April 14, 2021, the FCC issued an Order waiving its Section 15.255 technical and service rules for unlicensed operation in the 57-71 GHz band, permitting six equipment manufacturers to operate radar-based vehicle cabin monitors. Stay tuned; we will drill down on that in my next column.