TechRadar
Leon Poultney

Tesla’s Autopilot system is coming under fire again for its safety record – here’s why


Despite recent court rulings in Tesla's favor over its Autopilot semi-autonomous driving functionality, a new report has unearthed deeper issues with driver irresponsibility and the technology's use in scenarios it was never designed for.

In previous court proceedings, the jury agreed that ultimate responsibility for Tesla’s Autopilot and Full Self-Driving Beta technology rested with the human behind the wheel.

However, The Washington Post recently obtained exclusive footage of a 2019 crash in which a Tesla ran through a T intersection at around 70mph and plowed into a parked vehicle, killing one of its occupants and gravely injuring another.

According to The Post, in police body-camera footage obtained by the outlet, the shaken driver says he was "driving on cruise" and took his eyes off the road when he dropped his phone.

Although the most obvious issue at play is driver inattentiveness, The Post is also keen to point out that this incident is one of a handful of fatal or serious crashes involving Tesla Autopilot in road scenarios where it is not intended to be used.

According to research carried out by The Post, Tesla has acknowledged that Autosteer, Autopilot's key feature, is "intended for use on controlled-access highways" with "a center divider, clear lane markings, and no cross traffic". This has been stated in manuals, legal documents and even in communication with federal regulators.

In fact, Autosteer is "particularly unlikely to operate as intended" when "visibility is poor (heavy rain, snow, fog, etc) or weather conditions are interfering with sensor operation," according to Tesla’s website.

Hills, curved or "excessively rough" roads, bright light conditions and toll booths can also degrade the technology's performance.

Even though the company has the technical ability to limit Autopilot’s availability by geography, it has taken few definitive steps to restrict use of the software, The Post claims.

Whistleblower pours more scorn on Tesla’s tech


In addition to The Post’s independent findings, a former Tesla employee recently told the BBC that he doesn’t believe the Autopilot hardware or software is ready.

Lukasz Krupski leaked data, including customer complaints about Tesla's braking and self-driving software, to German newspaper Handelsblatt in May. His concerns were ignored by his employer, he says, so he has since turned to the press.

According to the BBC, Mr Krupski said he had found evidence in company data suggesting that requirements relating to the safe operation of vehicles with a certain level of autonomous or assistive-driving technology had not been followed.

"It affects all of us because we are essentially experiments in public roads. So even if you don't have a Tesla, your children still walk in the footpath," he told the BBC.

According to reporting by The Post, the US National Highway Traffic Safety Administration (NHTSA) has yet to act on Tesla’s Autopilot failings, despite strongly worded advice from its peer agency, the National Transportation Safety Board (NTSB).

In an interview with The Post earlier this year, NTSB chair Jennifer Homendy said: "If the manufacturer isn’t going to take safety seriously, it is up to the federal government to make sure that they are standing up for others to ensure safety".

"Safety does not seem to be the priority when it comes to Tesla," she added.
