Before jumping into today’s Data Sheet, I wanted to highlight Microsoft’s announcement that it plans to cut 10,000 jobs, or roughly 5% of its workforce, joining the ever-growing list of tech giants right-sizing their headcount.
At this point, there’s not much new to say: Companies over-hired in the throes of the pandemic, they now want to cut costs in the face of an uncertain economy, and the whole thing stinks for everyone involved. Let’s just hope we’re nearing the end of the worst of it.
Now onto the rest of Data Sheet.
A little over six years ago, Tesla released and heavily promoted a video that suggested the company’s cars had reached long-awaited autonomy.
The nearly four-minute reel kicked off with a message foretelling footage of a car “driving itself.” Then, viewers saw a bird’s-eye view of a self-driving Tesla flawlessly navigating side streets, a highway, and a parking lot as the Rolling Stones’ “Paint It, Black” played in the background. Although a man sat in the front seat of the vehicle, he never took control of the steering wheel, let alone put his hands at 10-and-2.
The video heralded the arrival of a landmark achievement by Tesla, with CEO Elon Musk tweeting that “Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot.”
But according to testimony last year by a high-ranking Tesla executive, unearthed Tuesday by Reuters, the ad was misleading at best. In a July 2022 deposition, Tesla Autopilot software director Ashok Elluswamy said the car made mistakes and crashed into a fence on test runs along the path ultimately depicted in the video. Tesla and Musk never mentioned these foibles in the promotional video or public comments.
“The intent of the video was not to accurately portray what was available for customers in 2016,” Elluswamy said at the deposition. “It was to portray what was possible to build into the system.”
The video crystallized what has now become a common complaint about Tesla and self-driving: that Musk and other company executives exaggerated the company’s technical capabilities and withheld information that would have dampened enthusiasm about the technology.
For years, Tesla has largely avoided any legal repercussions for its actions. But 2023 should prove a watershed year in determining whether Tesla’s deeds amounted to legally protected speech, careless embellishment, violations of civil and criminal law, or something in between.
On the civil court side, Tesla is expected in February or March to face its first trial in a lawsuit seeking to hold the company liable for the performance of its self-driving technology, dubbed Autopilot, in the lead-up to a fatal crash.
The case stems from a deadly collision in Florida involving a Tesla Model 3 driven by 50-year-old Jeremy Banner, who slammed into a tractor-trailer while Autopilot was engaged. Crash investigators faulted Banner for inattention while driving, though federal officials said Autopilot failed to warn Banner about an imminent danger.
Should the two sides spurn a settlement and take the case to trial, a verdict would test whether jurors believe Tesla’s marketing gave drivers a false sense of security on the road. A trial also could expose previously undisclosed information held by Tesla about its self-driving technology capabilities and the ethics of its advertising.
“A big part of the significance of the case is that it actually is being conducted in a public forum,” Michael Brooks, then-chief counsel at the Center for Auto Safety, a consumer advocacy group, told Bloomberg in September.
Another Autopilot trial stemming from a fatal crash is tentatively scheduled for late March. The family of Walter Huang alleges that Tesla’s self-driving technology was fundamentally unsafe at the time of the 38-year-old’s single-car crash in Mountain View, Calif. Federal investigators said Huang’s Tesla did not detect his hands on the wheel at impact, and cellular data suggests he might have been playing a game on his iPhone. (Elluswamy’s testimony about the 2016 Tesla video came during a deposition related to the Huang family’s lawsuit.)
On the criminal front, Tesla and its executives should learn more this year about a criminal probe opened in 2021 by the U.S. Department of Justice. Reuters, citing sources familiar with the matter, reported that investigators are examining whether Tesla officials “misled consumers, investors and regulators by making unsupported claims about its driver assistance technology's capabilities.” The inquiry followed more than a dozen crashes, some of them fatal, in which Autopilot was engaged.
Finally, regulatory inquiries launched by federal highway safety officials and the California Department of Motor Vehicles should kick into overdrive this year. Both agencies are reviewing the safety of Tesla’s Autopilot-equipped cars, with California officials taking a particular look at whether the electric automaker engaged in false advertising related to self-driving technology. While it’s unclear whether either probe will result in regulatory action in 2023, both inquiries could produce new information released through legal filings or media leaks.
Tesla officials have steadfastly maintained that drivers are responsible for remaining attentive while Autopilot is engaged, an argument that generally absolves the company of legal culpability when crashes occur. A splendid New York Times Magazine article published Tuesday examined this argument, as well as Tesla’s ethical approach to balancing the short-term risks and long-term benefits of putting Autopilot on the road.
After years of public debate over whether Tesla’s approach is well-founded or facile, the legal system might finally provide an answer soon.
A correction of note to yesterday’s Data Sheet. I mistakenly wrote that Google spent more than $500 billion to acquire DeepMind. That obviously should have been $500 million. My apologies for the error, and thanks to readers who quickly noted the mistake.
Want to send thoughts or suggestions to Data Sheet? Drop me a line here.
Jacob Carpenter