April 21, 2022
Tesla has increasingly been under the microscope over the effectiveness of its self-driving technology. And now an American billionaire is running for Senate with the sole intention of getting the automaker’s Full Self Driving (FSD) beta software banned.
News of Dan O’Dowd’s bid follows a report highlighting how the National Highway Traffic Safety Administration has launched two formal defect investigations into Tesla’s Autopilot software.
O’Dowd, the California-based CEO of Green Hills Software and founder of The Dawn Project, is campaigning to “make computers safe for humanity.” His chief target is FSD, optional technology that costs $12,000 in the U.S. and is designed to complement Tesla’s standard-fit Autopilot driver-assistance system. Despite the name, “Full Self Driving” does not yet deliver fully autonomous driving capability.
“The first danger I am tackling is @ElonMusk’s reckless deployment of @Tesla’s Full Self-Driving Cars on our roads,” O’Dowd said in a tweet published earlier this week.
The post was accompanied by a link to his campaign website and a video that showcased several clips of FSD making errors that almost resulted in crashes, with a voiceover stating: “Often the Tesla doesn’t know what it wants to do.” The video finishes with the claim that FSD is “unsafe at any speed.”
Tesla CEO Elon Musk recently said that more than 100,000 people are involved in the FSD “beta program,” but not only is O’Dowd unhappy with the tech, he disputes the terminology too, tweeting: “Any engineer knows that rollout of @Tesla FSD to all FSD customers is not a *beta* release – it is a full product release.”
The latest version of FSD is designed to automate driving on highways as well as in urban environments, but is still only at Level 2, as defined by the Society of Automotive Engineers, and requires the driver to be ready to take control at all times. Musk has previously claimed that Tesla will achieve Level 4 autonomy in 2022.
At the same time as O’Dowd is highlighting FSD’s alleged shortcomings, Bloomberg revealed that the NHTSA is conducting two separate probes into Tesla’s Autopilot tech. The report highlighted a fatal 2019 accident in Indiana involving a Tesla running on Autopilot, one of a dozen crashes in the past four years in which one of the company’s vehicles collided with a first-responder vehicle. It also quoted Jennifer Homendy, chair of the National Transportation Safety Board, as saying: “We essentially have the Wild West on our roads right now. It’s a disaster waiting to happen.”
The statistics seem to back up this concern: of 31 NHTSA investigations into crashes involving driver-assistance systems, 24 involved Teslas. Critics point out that, unlike General Motors and Ford, Tesla does not use cameras behind the steering wheel to monitor whether drivers are paying attention, nor does it restrict the use of its tech to highways that the company has mapped out and tested.