National Transportation Safety Board Chairwoman Jennifer Homendy has called on Tesla to change the design of its advanced driver assistance system to ensure it cannot be misused by drivers, according to a letter sent to the company’s CEO Elon Musk.

The letter, which TechCrunch has viewed, expressed concern that Tesla has yet to implement two safety recommendations that the NTSB issued more than four years ago. The urgency to address those safety recommendations has increased now that Tesla is rolling out more automated driving functions through its so-called “Full Self-Driving” software beta.

“Our crash investigations involving your company’s vehicles have clearly shown that the potential for misuse requires a system design change to ensure safety,” Homendy wrote. Tesla did not respond to a request for comment.

The NTSB can only make recommendations and does not have the authority to enforce existing laws or make policy.

While Homendy did thank Tesla for cooperating with NTSB investigators following the various crashes and incidents that the agency has examined, she spent the bulk of the letter addressing her deep concerns about Tesla’s failure to “implement critical NTSB safety recommendations.”

One excerpt:

You have stated that “safety is always the primary design requirement for a Tesla.” Now that statement is undercut by the announcement that Tesla drivers can request access to “Full Self-Driving Beta technology,” operational on both highways and city streets, without first addressing the very design shortcomings that allowed the fatal Williston, Delray Beach, and Mountain View crashes to occur.

If you are serious about putting safety front and center in Tesla vehicle design, I invite you to complete action on the safety recommendations we issued to you four years ago.

The NTSB has long advocated for implementation of myriad technologies to prevent tragedies and injuries and save lives on our nation’s roads, but it’s crucial that such technology is implemented with the safety of all road users foremost in mind. I look forward to receiving an update on our safety recommendations.

In 2017, the NTSB issued two safety recommendations to the automaker based on its investigation of a fatal crash in which Joshua Brown was killed when his Tesla Model S sedan struck a tractor-trailer that crossed his path. Tesla’s advanced driver assistance system, known as Autopilot, was engaged at the time. The agency found that Brown was using Autopilot on roadways the system was not designed to handle and that he went for extended periods without his hands on the wheel. Autopilot is not a hands-free system.

The NTSB determined that Tesla’s Autopilot system did not effectively monitor and respond to the driver’s interaction with the steering wheel to ensure driver engagement. The agency recommended that Tesla establish safeguards limiting Autopilot to the conditions for which it was designed and develop ways to more “effectively sense the driver’s level of engagement and alert the driver when engagement is lacking while automated vehicle control systems are in use.”

Tesla has argued that operational design domain limits are not applicable to so-called Level 2 driver assist systems, such as Autopilot, because the driver determines the acceptable operating environment. Homendy pushed back on that argument in her letter to Musk, noting that the agency’s crash investigations have “clearly shown that the potential for misuse requires a system design change to ensure safety.”

Homendy also noted that the agency sent five other automakers with vehicles that have Level 2 driving automation systems a recommendation to develop ways to monitor driver engagement and alert the driver when it is lacking. Those five manufacturers responded to the NTSB and described the actions they planned to take, or were already taking, to better monitor a driver’s level of engagement, according to Homendy’s letter to Musk.

“Tesla is the only manufacturer that did not officially respond to us about the recommendation,” she wrote.