• @skymtf@lemmy.blahaj.zone
    103 · 10 months ago

    I feel like the NTSB needs to draft a minimum spec for self-driving cars, plus a testing course that includes some of the worst circumstances, before they get approved. I feel like all self-driving cars should have to have lidar and other sensors. Computer vision really isn’t working out.

    • @echo64@lemmy.world
      55 · 10 months ago

      You build a benchmark and Tesla will train on that benchmark. It says nothing about real-world use, but it gets them signed off.

      But yes western society is currently in a hellscape of refusing to do even basic regulation of any new technology so it’ll probably be a good 20 years of murder robots on the streets before anything gets written down.

      • @FoxBJK@midwest.social
        38 · 10 months ago

        By “western society” do you mean the US? Because the EU doesn’t seem to have any qualms about regulating new technologies. That seems to be a uniquely American thing.

        • @DarthBueller@lemmy.world
          6 · 10 months ago

          Which somehow means that Europeans suddenly have headlights that make sense, while we’re over here dying from aftermarket HIDs that should be treated the way the VA Highway Patrol treats radar detectors (rip ’em out and smash them with a sledgehammer on the side of the road).

      • ayaya
        12 · 10 months ago (edited)

        To be fair we already have giant metal murder boxes zooming around on the streets. If AI kills even a single person everyone flips out even though over 40,000 people die every year in the US from car accidents. And that is just the deaths, not including injuries. Yet I don’t really see anyone calling for more regulations on driving tests for humans.

        People want AI to somehow be perfect when in reality as long as AI is even 1% better than humans that’s saving over 400 lives per year. AI doesn’t get sleepy, distracted, drunk, etc. so it probably already is at least 1% better in most situations. Humans are horrible drivers.
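        The 1% arithmetic above is easy to check (a back-of-envelope sketch using the commenter’s ~40,000 deaths/year figure, not an official statistic):

```python
# Back-of-envelope sketch of the claim above, using the commenter's
# figure of ~40,000 US road deaths per year (not an official statistic).
ANNUAL_US_ROAD_DEATHS = 40_000

def lives_saved(relative_improvement: float,
                deaths: int = ANNUAL_US_ROAD_DEATHS) -> float:
    """Deaths avoided per year if AI drivers cause proportionally fewer fatalities."""
    return deaths * relative_improvement

print(lives_saved(0.01))  # 1% better than humans -> 400.0 lives per year
```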

      • originalucifer
        10 · 10 months ago

        But yes western society is currently in a hellscape of refusing to do even basic regulation

        US regulations are only written in blood or money. The United States was built on the backs of slaves, and then wage slaves; there are literal graveyards filled with workers.

        I’m not disagreeing with you, I just found this comically at odds with history… i.e., it’s always been a regulation hellscape.

      • @NeoNachtwaechter@lemmy.world
        5 · 10 months ago

        But yes western society is currently in a hellscape of refusing to do even basic regulation

        Only the US.

        We Europeans have been scratching our heads for a very long time now: why do they let these guys do whatever they want?

        • @echo64@lemmy.world
          3 · 10 months ago

          Not really. The EU does more than most Western nations, but things generally get regulated ten years too late, and only to a tiny degree compared to what society actually needs. So again: better, but massively lax compared to the need and to other periods.

    • @nxfsi@lemmy.world
      33 · 10 months ago

      I don’t think mandating lidar specifically by name is right, seeing as computer vision is definitely a software problem. Instead they should mandate some method to detect objects in any light condition + a performance standard, which in practice during certification could mean lidar. Regulations should be as minimal and specific as possible.
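      An ends-based rule like that could look something like the following in a certification harness (a hypothetical sketch; the condition names and thresholds are invented, not drawn from any real regulation):

```python
# Hypothetical sketch of a technology-neutral certification check:
# the regulator mandates a minimum object-detection rate per light
# condition, not a specific sensor. All names and thresholds below
# are invented for illustration.
MIN_DETECTION_RATE = {"daylight": 0.999, "night": 0.995, "fog": 0.99}

def passes_certification(measured: dict[str, float]) -> bool:
    """True only if the vehicle meets the minimum rate in every condition.

    Missing conditions count as a detection rate of 0.0, i.e. untested
    conditions fail certification.
    """
    return all(
        measured.get(condition, 0.0) >= threshold
        for condition, threshold in MIN_DETECTION_RATE.items()
    )

print(passes_certification({"daylight": 0.9995, "night": 0.997, "fog": 0.992}))
print(passes_certification({"daylight": 0.9995, "night": 0.98}))
```

      In practice a lidar-equipped car might be the only way to hit the night and fog thresholds, but the rule itself never names the sensor.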

      • @GenderNeutralBro
        25 · 10 months ago

        Good point. Mandate the ends rather than the means. If they get better functionality with some new tech in a few years, we don’t want outdated regulations holding the industry back.

      • @NeoNachtwaechter@lemmy.world
        9 · 10 months ago (edited)

        computer vision is definitely a software problem.

        No, it isn’t.

        If it were only software, don’t you think Tesla should be the best of them all, being the pure software shop they are?

        But it is a real world problem. Recognizing real objects in real world conditions like weather, natural and artificial lights, temperatures (want some ice on your camera?), winds & storms, all kinds of unforeseen circumstances, other bad drivers, police and firemen…

        And that’s why that pure software shop is so bad at it, while all the real carmakers shrug… they are used to it since forever.

        • @zurohki@aussie.zone
          3 · 10 months ago

          You can be the best in the world and still not be good enough.

          Driving a car around using a dozen cameras pointing in every direction isn’t something that’s fundamentally impossible. We just can’t do it yet.

    • @SuperSleuth@lemm.ee
      12 · 10 months ago

      Should a self-driving car face more rigorous tests than actual human drivers? Honest question.

        • stopthatgirl7
          8 · 10 months ago

          Yes, because when a person is driving you usually know exactly who is legally to blame in an accident. With self-driving, if the car hits and kills someone, who do you charge? There’s no single person you can hold responsible when something goes wrong, the way you can with a human driver.

      • @FoxBJK@midwest.social
        15 · 10 months ago

        Human drivers should be facing more rigorous testing regardless. It’s horrifically easy to get a license… and then they never test you again for the rest of your life. That’s just insane when you think about it. My test was in 2002. Feels like I should have to retake it at some point.

        • @TenderfootGungi@lemmy.world
          4 · 10 months ago

          And take them away for bad driving. But we don’t, because our entire transportation infrastructure, outside of a few cities (namely NY), is built around everyone driving a car.

      • @IphtashuFitz@lemmy.world
        13 · 10 months ago

        Yes. A human brain can handle edge cases it’s never encountered before. Can a self driving car?

        • Ever stop at a red light only to have a police officer wave you through?

        • Ever encounter a car driving the wrong way down a one way street?

        • Ever come across a flooded out stretch of road? (if the road has no lines and the water is still it can be very deceptive looking)

        These are a tiny number of things I’ve encountered over the past few years. I’m sure plenty of other drivers can provide other good examples. I’d want to know how a self driving car would handle itself in situations like these.

      • snooggums
        5 · 10 months ago

        Yes, because each person must learn on their own and has limited experience relative to the general public as a whole.

        Self driving cars can ‘learn’ from all self driving cars and don’t get tired, forget, or anything like that. While they shouldn’t be held to perfection, they should absolutely be held to a higher standard than a human.

      • @NeoNachtwaechter@lemmy.world
        4 · 10 months ago

        Should a self-driving car face more rigorous tests than actual human drivers? Honest question

        First: none of these automated cars would pass a German driver’s license test. By far.

        Second: of course you cannot compare tests for humans with tests for machines.

      • @nxfsi@lemmy.world
        -1 · 10 months ago

        Only Tesla self driving cars need to have more rigorous tests. Other brands are fine as it is because they have lidar.

        • @IphtashuFitz@lemmy.world
          6 · 10 months ago

          LiDAR isn’t some sort of magic eye. The self driving system is only as good as the software that takes the inputs from cameras, LiDAR, etc., processes them, and ensures safe operation of the car.
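          To illustrate the point (a toy sketch with invented names, not any real system’s code): the fusion software, not the sensor, decides what the car ultimately reacts to, so a bug in that layer throws away even perfect LiDAR returns.

```python
# Toy illustration: the sensor only supplies inputs; the fusion logic
# decides what the car "sees". All names here are invented.
from dataclasses import dataclass

@dataclass
class Detection:
    obj: str
    confidence: float

def fuse(camera: list[Detection], lidar: list[Detection],
         threshold: float = 0.5) -> set[str]:
    """Naive fusion: keep any object either sensor reports above threshold.

    A bug here (say, a threshold set too high) silently discards valid
    LiDAR returns -- better hardware cannot compensate for it.
    """
    return {d.obj for d in camera + lidar if d.confidence >= threshold}

print(sorted(fuse([Detection("pedestrian", 0.9)],
                  [Detection("fire truck", 0.6), Detection("debris", 0.3)])))
# ['fire truck', 'pedestrian']
```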

          • @nxfsi@lemmy.world
            1 · 10 months ago

            Finally someone who actually uses critical thinking instead of being an anti-Elon bandwagoner.

        • @skymtf@lemmy.blahaj.zone
          2 · 10 months ago

          I feel like all of them do. Have you seen wayze nearly getting Black people killed because it didn’t stop for a cop? And it can’t recognize construction zones.

        • @sky@codesink.io
          -1 · 10 months ago

          Five LiDAR sensors haven’t stopped Cruise from running into a bus, multiple cars, and a fire truck. Maybe self-driving is a myth?

          Maybe we should just build buses and trains and pay people good salaries to operate them??

    • @Cheers@sh.itjust.works
      5 · 10 months ago

      Throw in some potholes, a child pedestrian crossing the street, etc., and they’d even come out with a powerful marketing ad.

    • @tony@lemmy.hoyle.me.uk
      3 · 10 months ago

      Pretty much what the UNECE did… there are standards for these things. Tesla doesn’t meet them, which is why FSD ‘beta’ is still ‘seeking regulatory approval’ in the rest of the world.