A 2025 Tesla Model 3 in Full Self-Driving mode drives off a rural road, clips a tree, loses a tire, flips over, and comes to rest on its roof. Luckily, the driver is alive and well, able to post about it on social media.

I just don’t see how this technology could possibly be ready to power an autonomous taxi service by the end of next week.

  • Buffalox@lemmy.world · 9 hours ago

    For many years the “supervised” wasn’t included; AFAIK Tesla was forced to add it.
    And in this case “supervised” isn’t even enough, because the car made an abrupt, unexpected maneuver instead of asking the driver to take over in time to react.

    • FreedomAdvocate@lemmy.net.au · 5 hours ago

      The driver isn’t supposed to wait for the car to tell them to take over lol. The driver is supposed to take over when necessary.

      • Buffalox@lemmy.world · 18 minutes ago

        No. If you look at Waymo as an example, they are actually autonomous, and they stop to ask for assistance in situations they are “unsure” how to handle.

        But even if your claim were true, in what way was this a situation where the driver could deem it necessary to take over? The road ahead was clear, with nothing in view to indicate any kind of problem, when the car made a sudden abrupt left that caused it to roll upside down.

      • SkyezOpen@lemmy.world · 4 hours ago

        The attention required to prevent these types of sudden crashes negates the purpose of FSD entirely.