• Norah (pup/it/she)@lemmy.blahaj.zone
    3 days ago

    There’s more. Two years prior, the NHTSA had flagged something strange – something suspicious. In a separate report, it documented 16 cases in which Tesla vehicles crashed into stationary emergency vehicles. In each, autopilot disengaged “less than one second before impact” – far too little time for the driver to react. Critics warn that this behaviour could allow Tesla to argue in court that autopilot was not active at the moment of impact, potentially dodging responsibility.

    Gonna add another reply here, finally got back and finished the article. Am I the only one who doesn’t find this particularly suspicious? Hanlon’s razor, right: never attribute to malice that which is adequately explained by stupidity. I think FSD is just terrible, and other reports say it disengages and alerts the driver when it encounters situations it can’t handle. Surely any judge worth their salt would come down hard if Tesla tried to shirk responsibility that way, or it wouldn’t make a difference anyway, since drivers are supposed to stay ready to take control at all times?