The company continues to focus on cameras for its driver-assistance tech

Graham Hope

October 6, 2022

Image: The interior of a Tesla Model X, showing the large touchscreen and dashboard (Getty Images)

Tesla is removing ultrasonic sensors (USS) from new versions of some of its vehicles, as it continues to focus on cameras for its Autopilot driver-assistance tech.

Starting this month, all Model 3 and Model Y vehicles bound for North America, Europe, the Middle East and Taiwan will no longer include the 12 sensors found on the front and rear bumpers. The Model S and Model X will follow in 2023.

The news came in a support update on the Tesla website.

The sensors, which measure distance by timing the echoes of ultrasonic pulses, are mainly used to detect nearby objects, most notably in parking applications.

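As a rough illustration of the underlying principle (a generic sketch of how ultrasonic ranging works, not Tesla’s implementation), the distance to an object follows directly from the round-trip time of a pulse:

```python
# Illustrative ultrasonic ranging: distance from pulse round-trip time.
# A generic sketch of the physics behind parking sensors, not Tesla's code.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an object, given the pulse's round-trip time in seconds."""
    # The pulse travels out and back, so halve the total path length.
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

# An echo returning after 2 ms implies an object about 0.34 m away,
# comfortably within a typical parking sensor's range of a few meters.
print(f"{echo_distance_m(0.002):.2f} m")  # -> 0.34 m
```
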
Tesla’s decision is further evidence of its intention to follow its own path in developing its advanced driver-assistance systems (ADAS), and it runs contrary to much of the rest of the industry. Increasingly, automakers are relying on a wider array of sensors, including cameras, radar and lidar, to deliver automated functionality.

Tesla is instead focusing on what it terms Tesla Vision, an approach it began rolling out in early 2021 when it confirmed it was removing radar from Model 3 and Model Y vehicles in North America. It has since ditched radar on other models and in other markets as well.

Tesla Vision is based on the company’s belief that roads are designed for humans who navigate them using a vision-based system, and that autonomous cars should learn to drive in a similar fashion, by employing cameras and artificial neural nets.

CEO Elon Musk gave an insight into this thinking in February when he tweeted: “LIDAR is a seductive local maximum. However, the road system was designed to work with biological neural nets & eyes, so a general solution to self-driving necessarily will require silicon neural nets & cameras. Real-world AI.”

As well as removing the ultrasonic sensors, Tesla says it has launched a “vision-based occupancy network” to replace the inputs the sensors generated. This is currently being used in Full Self-Driving (FSD) Beta – an optional, slightly more advanced version of the company’s ADAS tech – and, the company says, will endow Autopilot with superior long-range visibility and the ability to identify and differentiate between objects.

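To make the idea concrete, here is a hypothetical toy sketch of what an occupancy grid represents: a 3D grid of cells around the vehicle, each with a predicted probability of being occupied. The random values stand in for a real network’s camera-derived outputs; nothing here reflects Tesla’s actual architecture.

```python
import numpy as np

# Hypothetical toy model of an occupancy grid: every cell in a 3D volume
# around the car gets a probability of being occupied. Random numbers
# stand in for a real network's camera-derived predictions; this is an
# assumption-laden illustration, not Tesla's architecture.

GRID_SHAPE = (200, 200, 16)   # cells along x, y, z around the vehicle
CELL_SIZE_M = 0.4             # hypothetical cell edge length in meters

rng = np.random.default_rng(0)
occupancy = rng.random(GRID_SHAPE)  # stand-in for per-cell model outputs

# Treat high-probability cells as obstacles the planner must avoid.
obstacles = np.argwhere(occupancy > 0.99)
print(f"{len(obstacles)} of {occupancy.size} cells flagged as occupied")
```
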
In the short term, though, the removal of the sensors also leaves some features “temporarily limited or inactive”.

The affected functions are Park Assist, which alerts drivers to surrounding objects at speeds below 5 mph; Autopark, which automatically maneuvers into parallel or perpendicular parking spaces; Summon, which moves the vehicle forward or in reverse under manual control via the Tesla app; and Smart Summon, which navigates the vehicle to a chosen location via the Tesla app.

All will eventually be restored via over-the-air software updates.

The news comes at a time of increased scrutiny for Tesla’s driver-assistance tech. Recent months have seen class action lawsuits filed over the capabilities of Autopilot and FSD, a series of National Highway Traffic Safety Administration investigations into fatal crashes, and complaints of false advertising.

About the Author(s)

Graham Hope

Graham Hope has worked in automotive journalism in the U.K. for 26 years, including spells as editor of Auto Express, a leading consumer news website and weekly magazine, and the respected buying guide CarBuyer.
