Tesla is one of those companies that likes to push the envelope. That, in itself, is not a bad thing unless life safety is involved. Another characteristic of the Tesla community is that its users also like to push the envelope. That became evident in the infamous case of the 2016 Tesla crash that killed the driver, when the NTSB ruled that Joshua Brown was over-reliant on the vehicle’s Autopilot. There was also a similar incident involving a Tesla in China.
The NTSB found Tesla blameless, except to note that “system safeguards were lacking.” That highlights overconfidence on the part of both the driver and Tesla.
To have truly Level Five autonomous vehicles, with performance equal to that of a human driver, the data cannot flow just one way. In other words, the vehicle must be connected to the infrastructure, not just rely on sensors, regardless of how sophisticated they may be. Tesla, however, claims that Level Five is possible with its current core technologies.
I have doubted that was possible and now GM agrees with me – yay! Scott Miller, director of autonomous vehicle integration at GM, said recently, “The level of technology and knowing what it takes to do the mission, to say you can be a full Level Five with just cameras and radars is not physically possible.”
The engineers at Tesla, as opposed to the other players, seem to think the company does not even need LIDAR (light detection and ranging), which uses lasers to sense surroundings. To power the latest generation of Tesla’s Autopilot and self-driving technology, they are comfortable with a platform called Tesla Vision – an end-to-end computer vision system built with NVIDIA’s CUDA, a parallel computing platform (http://www.electronicdesign.com/automotive/nvidia-girds-computer-system-fully-autonomous-cars).
While CUDA is an advanced platform capable of fast I/O and interpretation of visual data, it is still just a camera back end. Tesla has since placed more emphasis on radar and less on computer vision.
GM is not the only automaker to advise caution about the hype around self-driving cars. One of the big sticking points is that, while computers have made gigantic strides in image recognition, the algorithms needed to turn image recognition into visual knowledge – the ability to understand not only objects, but their behavior as well – are still rather immature.
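To make that distinction concrete, here is a toy sketch (purely illustrative, and in no way any automaker’s actual pipeline): recognizing an object in a single frame is a labeling problem, while understanding behavior requires at least tracking that object across frames and reasoning about its motion – shown here with a naive constant-velocity guess.

```python
# Toy illustration of image recognition vs. "visual knowledge."
# A per-frame recognizer can only label what it sees; anticipating
# behavior needs state across frames. All names here are hypothetical.

def recognize(frame):
    """Stand-in for a per-frame classifier: returns a label only."""
    return frame["label"]

def predict_next_position(track):
    """Constant-velocity guess from the last two observed positions."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    return (2 * x1 - x0, 2 * y1 - y0)

# Two frames of a pedestrian moving to the right.
frames = [
    {"label": "pedestrian", "pos": (0.0, 0.0)},
    {"label": "pedestrian", "pos": (1.0, 0.0)},
]

labels = [recognize(f) for f in frames]   # recognition: what is it?
track = [f["pos"] for f in frames]
forecast = predict_next_position(track)   # behavior: where is it going?

print(labels)    # ['pedestrian', 'pedestrian']
print(forecast)  # (2.0, 0.0)
```

Even this trivial motion model needs memory across frames – exactly what a frame-by-frame recognizer lacks – and real behavior (a pedestrian stepping off a curb, a car signaling a turn) is far harder than extrapolating a straight line.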
That, as Paul Harvey used to say, is the rest of the story.
It is good to see cool heads starting to discuss the reality of autonomous vehicles, much like the reality checks coming in 5G.

Ernest Worthman is Executive Editor/Applied Wireless Technology. His 20-plus years of editorial experience include being the Editorial Director of Wireless Design and Development and Fiber Optic Technology, the Editor of RF Design, the Technical Editor of Communications Magazine, Cellular Business, and Global Communications, and a Contributing Technical Editor to Mobile Radio Technology and Satellite Communications, as well as computer-related periodicals such as Windows NT. His technical writing practice client list includes RF Industries, GLOBALFOUNDRIES, Agilent Technologies, Advanced Linear Devices, Ceitec SA, and others. Before becoming exclusive to publishing, he was a computer consultant and regularly taught courses and seminars in applications software, hardware technology, operating systems, and electronics. Ernest’s client list has included Lucent Technologies, Jones Intercable, Qwest, the City and County of Denver, TCI, Sandia National Labs, Goldman Sachs, and other businesses. His credentials include a BS in Electronic Engineering Technology and an AAS in Electronic Digital Technology. He has held a Colorado Post-Secondary/Adult teaching credential, been a member of IBM’s Software Developers Assistance Program and Independent Vendor League and a Microsoft Solutions Provider Partner, and is a life member of the IEEE. He has been certified as an IBM Certified OS/2 consultant and trainer, a WordPerfect Corporation Developer/Consultant, and a Lotus Development Corporation Developer/Consultant. He was also a first-class FCC technician in the early days of radio. Ernest Worthman may be contacted at: email@example.com.