Tesla’s Autopilot mode has been linked to 17 fatal crashes in the US.

A recent analysis of National Highway Traffic Safety Administration (NHTSA) data conducted by The Washington Post has uncovered alarming information about the safety of Tesla vehicles operating in Autopilot mode. 

The investigation revealed that there have been 736 crashes involving Teslas in Autopilot mode since 2019, a significantly higher number than previously reported. 

The crashes, 17 of them fatal and five involving serious injuries, highlight the risks associated with the increasing use of Tesla's driver-assistance technology and the growing presence of these vehicles on American roads.

One such incident occurred in North Carolina's Halifax County, where a Tesla Model Y allegedly in Autopilot mode struck 17-year-old Tillman Mitchell, who had just disembarked from a school bus. Mitchell suffered severe injuries, including a fractured neck and a broken leg. 

While he survived, the incident left him with memory problems and difficulty walking.

The data revealed distinct patterns in Tesla's fatal crashes, with four involving motorcycles and one involving an emergency vehicle. 

Experts suggest that Tesla CEO Elon Musk's decisions, such as expanding the availability of Autopilot features and removing radar sensors from the vehicles, may have contributed to the rise in incidents.

Despite the rising number of crashes and the concerns raised by safety experts, Tesla has not commented on the matter.

NHTSA, on the other hand, says that its investigations into crashes involving driver-assistance technology do not imply that the technology itself was the cause.

The analysis raises questions about the safety and effectiveness of Tesla's Autopilot and Full Self-Driving systems, especially as the company aggressively rolls out the latter feature.

Experts highlight the need to understand whether the surge in crashes is due to the technology itself or the increased usage of Autopilot.

Critics argue that the Autopilot name can be misleading, since drivers are still required to remain fully engaged and in control of the vehicle at all times.

US Transportation Secretary Pete Buttigieg has expressed concerns about the feature, stating that it is inappropriate to call it Autopilot when it requires constant attention from the driver.

NHTSA continues to investigate these incidents and has opened multiple probes into Tesla's driver-assistance software, focusing on issues such as phantom braking and crashes involving parked emergency vehicles.