On Monday, Elon Musk admitted that version 9.2 of Tesla’s Full Self-Driving (FSD) Beta was “not great.” He also said that the Autopilot/AI team was working hard to improve it as quickly as possible, and that the company was looking at a single software stack that would work on both highways and city streets. However, this would require a huge retraining of the neural network (NN).
Musk’s tweet comes as U.S. safety regulators investigate several Tesla crashes that may have occurred while the electric vehicles were on Autopilot. In the U.S., Tesla sells the FSD package for $10,000, or $199 per month. The package is marketed as an upgrade to Autopilot, a suite of advanced driver-assistance features.
The FSD Beta features are said to be available only to those who previously bought FSD and to Tesla employees. The feature reportedly can perform the following functions without driver input:
- automatically change lanes
- navigate on a highway
- drive into a parking spot
The electric vehicle giant also said that the FSD Beta version should be able to automatically steer on city streets later this year. Another possible feature touted by Musk would give the car the ability to reason about why a lead vehicle is moving slowly, such as a hold-up in traffic.
Tesla, like other manufacturers, always states that drivers using self-driving systems should be ready to intervene at all times. Had this guidance always been followed, perhaps there would not have been so many accidents involving Teslas with the feature engaged. U.S. safety regulators have opened 11 investigations into accidents in which the system reportedly failed to spot parked emergency vehicles. These accidents resulted in one death and left at least 17 people injured.
According to CNBC, the driver-assistance system does not make Tesla electric vehicles safe to use without an attentive driver behind the wheel.