July 20, 2022
The judgment, by the Munich I District Court, followed an analysis of the vehicle in a specially commissioned report which found that it could not be relied upon to recognize or identify obstacles in the road – in particular the narrowing of a road in a construction zone – and suffered from issues of phantom braking, where the brakes were activated unnecessarily.
The German report described the sudden braking as a “massive hazard” in urban traffic and said it could cause rear-end collisions.
According to the German outlet Der Spiegel, which broke the news of the court ruling, Tesla’s lawyers claimed that Autopilot was not intended for use in city traffic. However, this argument was not accepted by the court, which stated that it was not feasible to expect Tesla drivers to have to manually switch the feature on and off depending on the driving scenario, as this could prove a distraction.
The Munich judgment could set a difficult precedent for Tesla, coming after months of increased scrutiny of Autopilot, with much of the debate centering around Tesla drivers’ potential over-reliance on it.
Autopilot is classed as a Level 2 automated system by the Society of Automotive Engineers, meaning a human driver must monitor its operation at all times. Autopilot comes standard on all Teslas, while more advanced technology, in the form of Enhanced Autopilot and Full Self-Driving, is available for purchase.
Despite the potentially confusing branding, the Tesla website makes clear: “Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment. While these features are designed to become more capable over time, the currently enabled features do not make the vehicle autonomous.”
But in the United States, in particular, concern over the tech and how it is being used has been growing.
The National Highway Traffic Safety Administration (NHTSA) has escalated an investigation into several collisions between Teslas running Autopilot and first-responder vehicles stopped for emergencies on or beside the road. If the probe advances further, a recall could be ordered covering around 830,000 Teslas built between 2014 and 2021.
Additionally, a number of fatal crashes involving Autopilot-equipped Teslas are being investigated by the agency.
The concerns in Germany over phantom braking have been mirrored in the U.S., where more than 750 owners have complained about the problem. The NHTSA has written to Tesla seeking information on the issue.