Tesla fires worker who reviewed its full self-driving feature on YouTube

San Francisco, March 16 (IANS) Elon Musk-run Tesla has fired an employee who reviewed the electric car-maker’s full self-driving (FSD) beta software on his YouTube channel.

John Bernal posted the video that showed his Tesla hitting a bollard on his YouTube channel AI Addict.

As reported by CNBC, Bernal said that prior to his dismissal, he was told verbally by his managers that he “broke Tesla policy” and that his YouTube channel was a “conflict of interest”.

However, his written separation notice did not specify a reason for his dismissal, reports The Verge.

The video had more than 250,000 views and was shared widely on social networks like Twitter.

Bernal said that after posting the video, “A manager from my Autopilot team tried to dissuade me from posting any negative or critical content in the future that involved FSD Beta. They held a video conference with me but never put anything in writing.”

Tesla’s social media policy for employees does not forbid criticism of the company’s products in public, but says that the company “relies on the common sense and good judgment of its employees to engage in responsible social media activity”.

Bernal says that after being fired, his access to the FSD Beta software was revoked.

Meanwhile, US senators have rejected Elon Musk-run Tesla’s claim that its Autopilot and FSD features are safe for driving, calling the company’s response “more evasion and deflection from Tesla”.

Rohan Patel, Senior Director of Public Policy at Tesla, wrote in a letter to US Senators Richard Blumenthal (D-CT) and Ed Markey (D-MA) that Tesla’s Autopilot and FSD capability features “enhance the ability of our customers to drive safer than the average driver in the US”.

Patel was responding to the Senators, who had raised “significant concerns” about Autopilot and FSD and urged federal regulators to crack down on Tesla to prevent further misuse of the company’s advanced driver-assist features.

The FSD beta mode recently resulted in a Tesla Model Y crashing in Los Angeles.

No one was injured in the crash, but the vehicle was reportedly “severely damaged”.

The crash was reported to the National Highway Traffic Safety Administration (NHTSA), which has multiple and overlapping investigations into Tesla’s autopilot system.

Tesla’s FSD Beta aims to let Tesla vehicles virtually drive themselves, on both highways and city streets, once a destination is entered in the navigation system. However, it is still considered a Level 2 driver-assist system because it requires driver supervision at all times.

The driver remains responsible for the vehicle, and needs to keep their hands on the steering wheel and be ready to take control.

There have been several Tesla Autopilot-related crashes, which are currently under investigation by the NHTSA.
