Tesla requires Full Self Driving testers to allow video collection in case of a crash


With its latest FSD ("Full Self-Driving") release, Tesla is asking drivers to consent to the collection of video from a car's exterior and interior cameras in the event of an accident or "serious safety risk." This marks the first time Tesla will attach footage to a specific vehicle and driver, according to an Electrek report.

Tesla has gathered video footage as part of FSD before, but it was only used to train and improve its AI self-driving systems. Under the new agreement, however, Tesla will be able to associate video with specific vehicles. "By enabling FSD Beta, I consent to Tesla's collection of VIN-associated image data from the vehicle's external cameras and Cabin Camera in the occurrence of a serious safety risk or a safety event like a collision," the agreement reads.


As Electrek notes, the language could indicate that Tesla wants to ensure it has evidence in case its FSD system is blamed for an accident. The footage could also be used to detect and fix serious issues more quickly.

FSD 10.3 was released more widely than previous betas, but it was quickly pulled back due to issues like unwarranted Forward Collision Warnings and unexpected autobraking. At the time, CEO Elon Musk tweeted that such issues are "to be expected with beta software," adding that "it is impossible to test all hardware configs in all conditions with internal QA, hence public tests."

However, other drivers on public roads are unwitting beta testers, too. The National Highway Traffic Safety Administration is currently investigating a driver's complaint that FSD led to a November 3rd collision in Brea, California. The owner alleged that it caused his Model Y to enter the wrong lane and hit another car, causing considerable damage to both.

Tesla is releasing the new beta to even more users with Driver Safety Scores of 98 and up — previously, beta releases were limited to drivers with perfect 100 scores. Tesla charges drivers $199 per month for the feature, or $10,000 upfront, but has failed to meet promised deadlines for autonomous driving. Currently, FSD is considered a Level 2 system — far from the Level 4 required to truly be "full self-driving."

Editor's note: This article originally appeared on Engadget.
