Tesla and NHTSA Under Fire: New Investigation Alleges Cover-Up of Autopilot Crashes
A new investigation into Tesla’s contentious Autopilot alleges that the National Highway Traffic Safety Administration (NHTSA) and the US automaker have been withholding information about incidents involving the system. The Wall Street Journal’s review of more than 200 such crashes found that Autopilot struggles to navigate obstacles and that, in many cases, vehicles veered off the road while the system was engaged.
According to the WSJ report, Tesla and the NHTSA have withheld crucial details about the recorded incidents, including the dates of the crashes and the crash narratives. Elon Musk, CEO of the electric vehicle giant, has maintained that this data is proprietary company information and has declined to disclose further details. The NHTSA has said it cannot release specifics about these crashes because of its legal obligation to protect individual privacy under US federal law.
A recent report from the Wall Street Journal has brought to light several incidents related to Tesla’s controversial Autopilot system. The investigation delved into crash data, federal filings, and police records associated with over 200 Autopilot-related accidents. Here are the key findings:
1. Obstacle Struggles and Veering Off the Road
In these accidents, Tesla’s Autopilot system reportedly struggled with obstacle avoidance, and in several instances vehicles veered off the road while the system was engaged.
2. Hidden Information
The report alleges that both Tesla and the National Highway Traffic Safety Administration (NHTSA) withheld crucial information regarding these crashes. Specific details, including crash narratives and incident dates, were kept confidential. Tesla cited this data as proprietary business information, while the NHTSA invoked privacy protection under federal law.
3. Avoidable Crashes
A discernible pattern of avoidable crashes emerged from the data. Some Tesla vehicles collided with emergency vehicles displaying flashing lights, while others ran off the road at T-junctions. However, the report does not definitively attribute these collisions solely to the way Autopilot uses cameras and onboard software to detect obstacles.
4. Expert Access Required
Fully understanding what the cameras record and how the onboard software processes those images requires expert access to the vehicle’s computer. Tesla emphasizes that even when Autopilot is engaged, drivers must keep their hands on the wheel and be ready to take control.
5. Broader Context
Federal authorities have linked Tesla’s Autopilot to hundreds of collisions, including crashes that resulted in fatalities and serious injuries. The NHTSA previously investigated Autopilot’s safety and compelled Tesla to update its driver-monitoring software.
In summary, this investigation raises critical questions about transparency, safety, and the responsibilities of both Tesla and regulatory bodies. As the debate continues, Autopilot technology remains under intense scrutiny.
Remember, safety should always be the top priority when developing and using autonomous driving systems. Stay informed and drive responsibly!