Elon Musk has pledged that the work of his so-called Department of Government Efficiency, or DOGE, would be "maximally transparent." DOGE's website is proof of that, the Tesla and SpaceX CEO, and now White House adviser, has repeatedly said. There, the group maintains a list of slashed grants and budgets, a running tally of its work.
But in recent weeks, The New York Times reported that DOGE has not only posted major mistakes to the website (crediting DOGE, for example, with saving $8 billion when the canceled contract was for $8 million and had already paid out $2.5 million) but also worked to obfuscate those mistakes after the fact, deleting identifying details about DOGE's cuts from the website, and later even from its code, that had made the claims easy for the public to verify and track.
For road-safety researchers who have been following Musk for years, the modus operandi feels familiar. DOGE "put out some numbers, they didn't smell good, they switched things around," alleges Noah Goodall, an independent transportation researcher. "That screamed Tesla. You get the feeling they're not really interested in the truth."
For nearly a decade, Goodall and others have been tracking Tesla's public releases on its Autopilot and Full Self-Driving features, advanced driver-assistance systems designed to make driving less stressful and safer. Over the years, researchers claim, Tesla has released safety statistics without proper context; promoted numbers that are impossible for outside experts to verify; touted favorable safety statistics that were later proved misleading; and even changed already-released safety statistics retroactively. The numbers have been so inconsistent that Tesla Full Self-Driving fans have taken to crowdsourcing performance data themselves.
Instead of public data releases, "what we have is these little snippets that, when researchers look into them in context, seem really suspicious," alleges Bryant Walker Smith, a law professor and engineer who studies autonomous vehicles at the University of South Carolina.
Government-Aided Whoopsie
Tesla's first and most public number mix-up came in 2016, when it released its first Autopilot safety figures after the first known death of a driver using the feature. Immediately, researchers noted that while the numbers seemed to show that drivers using Autopilot were much less likely to crash than other Americans on the road, the figures lacked critical context.
At the time, Autopilot combined adaptive cruise control, which maintains a set distance between the Tesla and the vehicle in front of it, and steering assistance, which keeps the car centered between lane markings. But the comparison didn't control for the type of car (luxury vehicles, the only kind Tesla made at the time, are less likely to crash than others), the person driving the car (Tesla owners were more likely to be affluent and older, and thus less likely to crash), or the types of roads where Teslas were driving (Autopilot operated only on divided highways, but crashes are more likely to occur on rural roads, especially connector and local ones).
The confusion didn't stop there. In response to the fatal Autopilot crash, Tesla did hand over some safety numbers to the National Highway Traffic Safety Administration, the nation's road safety regulator. Using those figures, the NHTSA published a report indicating that Autopilot led to a 40 percent reduction in crashes. Tesla promoted the favorable statistic, even citing it when, in 2018, another person died while using Autopilot.