Both the National Highway Traffic Safety Administration—the government agency that oversees car safety—and the National Transportation Safety Board—an independent agency that investigates noteworthy incidents—have dispatched teams to Texas to study the crash. “We are actively engaged with local law enforcement and Tesla to learn more about the details of the crash and will take appropriate steps when we have more information,” NHTSA said in a statement. It will likely be weeks, if not months, before results of an investigation are released.

Still, the incident highlights the yawning gap between Tesla’s marketing of its technology and the true capabilities spelled out in its in-car dialog boxes and owners’ manuals.


A cottage industry of videos has sprung up on platforms like YouTube and TikTok in which people try to “fool” Autopilot into driving without an attentive driver in the front seat; some show people “sleeping” in the back seat or behind the wheel. Tesla owners have even demonstrated that, once the driver’s seat belt is buckled, a car in Autopilot mode can be prompted to travel for a few seconds with no one behind the wheel.

Tesla—and Musk in particular—has a mixed history of public statements about Full Self-Driving and Autopilot. A Tesla on Autopilot issues visible and audible warnings if its sensors do not detect the pressure of the driver’s hands on the wheel every 30 or so seconds, and it will come to a stop if it doesn’t sense hands for a minute. Yet during a 60 Minutes appearance in 2018, Musk sat behind the wheel of a moving Model 3, leaned back, and put his hands in his lap. “Now you’re not driving at all,” the correspondent said with surprise.

This month, Musk told the podcaster Joe Rogan, “I think Autopilot’s getting good enough that you won’t need to drive most of the time unless you really want to.” The CEO has also repeatedly given optimistic assessments of his company’s progress in autonomous driving. In 2019 he promised Tesla would have 1 million robotaxis on the road by the end of 2020. But in the fall of 2020, company representatives wrote to the California Department of Motor Vehicles to assure the agency that Full Self-Driving would “remain largely unchanged in the future,” and that FSD would remain an “advanced driver-assistance feature” rather than an autonomous one.

Thus far, FSD has been released only to 1,000 or so participants in the company’s beta testing program. “Still be careful, but it’s getting mature,” Musk tweeted last month to FSD Beta testers.

At least three people have died in crashes involving Autopilot. After an investigation into a fatal 2018 crash in Mountain View, California, the NTSB urged the federal government and Tesla to ensure that drivers can engage Tesla’s automated features only in conditions where they can be used safely. It also recommended that Tesla install a more robust monitoring system to make sure drivers are paying attention to the road. General Motors, for example, allows its comparable Super Cruise feature to operate only on pre-mapped roads, and a driver-facing camera detects whether drivers’ eyes are pointed toward the road.

A spokesperson for NHTSA says the agency has opened investigations into 28 crashes involving Tesla vehicles.

Data released by Tesla suggests its vehicles are safer than the average US car. On Saturday, just hours before the fatal Texas crash, Musk tweeted that Teslas with Autopilot engaged have nearly one-tenth the crash rate of the average vehicle, as measured against federal data. But experts point out that the comparison isn’t quite apt. Autopilot is only supposed to be used on highways, while the federal data captures all kinds of driving conditions. And Teslas are heavy luxury cars, which makes them inherently safer in a crash.

