• @jet@hackertalks.com
    link
    fedilink
    English
    8
    11 months ago

    This is a discussion forum, so I’m discussing. I’m not citing sources as you have twice noted.

    Like it or not, people are going to associate any Tesla crash with a failure of Elon Musk’s assisted-driving system. Even if we look at a very sensible market participant like Waymo, any Waymo vehicle incident will be associated with the self-driving nature of the car. This is normal for any novel technology: all the downsides get associated with the novelty.

    It is certainly my hope that, statistically, issues arising from automated driving will turn out to be less likely than issues arising from human driving, especially intoxicated driving… Until we get to the point where everyone knows that, we’re going to have media coverage that fixates on the downsides.

    • @Aurenkin@sh.itjust.works
      link
      fedilink
      English
      0
      11 months ago

      On that we can absolutely agree, and I think scrutiny is definitely warranted with any new technology, especially one with such a huge profit motive. My issue in this case was with the original claim that the system intentionally disengages at the last minute for the purpose of avoiding liability for any crash. Big call.

      Anyway, I was probably overly sarcastic and flippant, which doesn’t help my point, so sorry for venting my frustrations like that. Hopefully these technologies get the scrutiny they deserve without hysteria every time there’s a crash that ‘possibly’ involved Autopilot.

      • @jet@hackertalks.com
        link
        fedilink
        English
        1
        edit-2
        11 months ago

        I don’t think that’s the main reason Autopilot hands over control when it’s about to crash, but I do think it’s a factor that was part of the design.

        I think a lawyer was definitely consulted during the design of the assisted-driving-to-human-driver handoff. Can I cite sources? No. It’s just sensible: if you were designing a system that involved life-and-death decisions, you would have lawyers involved, and any good lawyer would help you limit your liability by moving the decision-making to the human when something was about to go wrong.

        https://www.youtube.com/watch?v=ZBvIWFq-fGc Are drivers like this ready to take over in an emergency in less than a second? No. Elon Musk does his system no favors by calling it Full Self-Driving, or whatever the term is, which is misleading. Driver assistance should be assistance, but the more you take the driver out of the loop, the more they get distracted and the less they are in the right context to jump in. That’s human nature. So there’s a balance to be found between automated hands-off driving and humans being responsible, and I don’t think Tesla has found it.
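        To put a rough number on why sub-second takeovers are a problem, here is a back-of-the-envelope sketch of how far a car travels during the driver’s reaction time. The speed and reaction-time figures are illustrative assumptions, not measured values from any study.

        ```python
        # Back-of-the-envelope: distance covered while a distracted driver takes over.
        # The inputs below are illustrative assumptions, not measured data.

        def distance_during_takeover(speed_kmh: float, reaction_s: float) -> float:
            """Metres travelled at constant speed during the driver's reaction time."""
            return speed_kmh / 3.6 * reaction_s  # km/h -> m/s, then multiply by seconds

        # At ~110 km/h, a one-second takeover means roughly 30 m of blind travel.
        print(round(distance_during_takeover(110, 1.0), 1))  # → 30.6
        ```

        Even with generous assumptions, the car covers several car lengths before the human can act, which is the crux of the handover-balance problem.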

        And I 100% believe lawyers are involved to limit liability, at least so that statements can be made that the self-driving system was not at fault for the crash because it was not engaged at the time of the crash. I 100% believe that was a factor in their handover logic. I can’t prove it, but the preponderance of evidence, the public behavior of certain market leaders, and my history with corporations do not make this a big leap of faith.
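        The attribution point above can be sketched in code. A naive check asks only whether the system was engaged at the exact moment of impact; a lookback rule, like the one NHTSA’s crash-reporting order uses (engagement within roughly 30 seconds of the crash), also catches last-second disengagements. The function names and the event log are hypothetical, purely for illustration.

        ```python
        # Hypothetical sketch: why "not engaged at the moment of impact" can mislead.
        # Events are (t, engaged) pairs; t is seconds relative to impact (negative = before).

        def engaged_at_impact(events):
            """Naive attribution: engagement state of the last sample at or before impact."""
            before = [e for e in events if e[0] <= 0.0]
            return before[-1][1] if before else False

        def engaged_within(events, window=30.0):
            """Lookback attribution: engaged at any point inside the window before impact."""
            return any(engaged for t, engaged in events if -window <= t <= 0.0)

        # Hypothetical log: the system disengages about one second before the crash.
        log = [(-10.0, True), (-1.0, False), (0.0, False)]
        print(engaged_at_impact(log))  # → False ("not engaged at the time of the crash")
        print(engaged_within(log))     # → True  (engaged within the lookback window)
        ```

        The same log supports both statements, which is exactly why a bare “it was not engaged at the time of the crash” tells you very little on its own.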