Tesla has recently recalled 362,758 vehicles, admitting that its Full Self-Driving (FSD) beta software is to blame for several crashes. The models involved in the recall include the 2016-2023 Model X and Model S, the 2017-2023 Model 3, and the 2020-2023 Model Y equipped with the FSD beta. Multiple videos show Teslas running the FSD beta failing to respond correctly and getting into accidents.

Tesla officials say the accidents most often occurred at intersections. The software apparently fails to acknowledge stop signs, rolling through them instead of coming to a complete stop. Teslas with the FSD beta also mishandle steady yellow lights, approaching intersections without caution, and have crossed double yellow lines into lanes of oncoming traffic.

This is not the first problem to arise from the electric car kingpin, and it has led to a great deal of criticism and concern from politicians and the public.

Related: The Brutal Truth About Owning a Tesla

Elon Musk Doesn't Agree With The Word "Recall"

This is a substantial recall: of the roughly 400,000 Tesla vehicles equipped with self-driving technology, more than 360,000 have been recalled. That scale is hard to ignore. However, Elon Reeve Musk, the CEO of Tesla, claims the term "recall" is misleading.

According to Musk, the solution is as simple as a software update, which the company will release free of charge over the air (OTA). Bryant Walker Smith, a law professor at the University of South Carolina, suggests the more accurate term may be a "virtual recall". The situation is leading us into a new realm of law.

There is very little legal precedent for cases involving self-driving cars and the ramifications of their faults. Alongside the rise of deepfakes and ChatGPT, FSD technology is another facet of this legal frontier, one that will need more time and scrutiny to ensure its safety.

Restrictions To Tesla's Self-Driving Feature


Because the Full Self-Driving beta is new, access to it requires a reasonable amount of vetting. It's reserved for drivers who have an active FSD driver assistance subscription and a high driver safety score, which Tesla determines through careful monitoring.

Even with a high driver safety score, Elon Musk recommends that drivers exercise caution. Those using the FSD beta should always be ready to brake and take control of the steering in case the system fails, since it may not give adequate warning time or may fail altogether. The recall notes also stated that the driver assumes full responsibility for any accident that occurs while the feature is active.

The issue is that FSD is yet another promise Elon Musk has yet to deliver on, which has drawn criticism over how Musk has handled promises in the past. It also raises the question of whether FSD is viable, and whether it will make or break Tesla's future.

The popular electric car company has several uncertain prospects, including the new Cybertruck, which still doesn't have a firm release date. The self-driving feature has been an enticing selling point for the company, and this recall has raised multiple doubts about its functionality.

Related: Why It's Worth Waiting for the Updated 2023 Tesla Model 3

Tesla Auto Pilot Accidents Are Not New


These are not the first cases of Tesla's automation causing problems. Back in 2018, when the Autopilot function was in its early stages, both Autopilot and driver error were blamed for an accident in which a Tesla traveling at 31 mph crashed into a parked firetruck after the car ahead of it merged out of the lane, revealing the truck. Warnings were given, but the driver didn't react in time; he was apparently eating a bagel and drinking coffee at the moment of impact.

Another fatal accident occurred when a Tesla Model X hit a highway median, killing the driver, Apple engineer Walter Huang. The crash was especially severe because the Tesla accelerated before impact, and the resulting force breached the battery and caused a fire. Apparently, there were multiple warnings to brake and steer away before impact, none of which were heeded in time.

Last year, over 50,000 Teslas with this FSD technology were recalled because they didn't properly stop at stop signs. In fact, the NHTSA (National Highway Traffic Safety Administration) has opened three dozen investigations into Tesla crashes, with 19 deaths reportedly linked to driver assistance system malfunctions.

After all is said and done, Tesla executives point out that billions of miles have been driven on the FSD beta without incident, and they maintain that the system still causes fewer accidents than driver error alone.

Even after the recall, Tesla has not taken much of a hit. Its stock fell 5.7% and bounced right back up within a day, which suggests there may be more financial faith in the company than one would expect. Meanwhile, the recalled Teslas are still on the road, and the company has given no date for when they will be fixed.