• Thorny_Thicket@sopuli.xyz

    With Tesla, the complaint is that the statistics are almost all highway miles, so they don't represent the most challenging conditions, namely city driving. Cruise then drives exclusively in a city, and that isn't good enough either. The AV sceptics are really hard to please…

    You’ll always be able to find individual incidents where these systems fail. They’re never going to be foolproof, and the more of them there are on the road, the more news like this you’re going to see. If we reported human-caused crashes with the same enthusiasm, they would be all you’d hear about from then on, and letting humans drive would seem like the most scandalous thing imaginable.

      • Thorny_Thicket@sopuli.xyz

        Humans get into accidents all the time. Is that not unacceptable as well?

        I feel like people apply standards to self-driving cars that they don’t apply to human-driven ones. It’s unreasonable to expect a self-driving system never to fail, and it’s just as unreasonable to imagine you can let it practice in simulation until it’s perfect. This is what happens when you narrowly focus on one aspect of self-driving cars (individual accidents): you miss the big picture.

          • Thorny_Thicket@sopuli.xyz

            A Tesla on FSD could easily pass the driving test that’s required of humans, so that’s a nonsensical standard. Most people with a fresh license are horribly incompetent drivers.

                • sky@codesink.io

                  Have you used it? It’s not very good. It tries to run red lights, makes random swerves and inputs, and generally drives like someone on sedatives.

                  They’ve had to inject a ton of map data to try to make up for the horrendously low-resolution cameras, but “HD MaPs ArE a CrUtCh”, right?

                  No radar or lidar means the sun can blind it easily, and there’s a blind spot in front of the car where cameras cannot see.

                  Is what they’ve made impressive? Sure, but it’s nowhere near safe enough to be on public roads in customers’ cars. At all.

          • abhibeckert@beehaw.org

            I don’t expect them to never fail; I just want to know when they fail and how badly.

            “Over 6.1 million miles (21 months of driving) in Arizona, Waymo’s vehicles were involved in 47 collisions and near-misses, none of which resulted in injuries”

            How many human drivers have done millions of miles of driving before they were allowed to drive unsupervised? Your assertion that these systems are untested is just wrong.

            “These crashes included rear-enders, vehicle swipes, and even one incident when a Waymo vehicle was T-boned at an intersection by another car at nearly 40 mph. The company said that no one was seriously injured and “nearly all” of the collisions were the fault of the other driver.”

            According to insurance companies, human-driven cars average 1.24 injuries per million miles travelled. So if Waymo were only “as good as a typical human driver”, there would have been several injuries over those 6.1 million miles. They had zero serious injuries.
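
            A quick back-of-the-envelope sketch in Python, using only the figures quoted above, of how many injuries the human baseline would predict over the same mileage:

                # Rough check: expected injuries at the human rate over Waymo's mileage
                human_injury_rate = 1.24    # injuries per million miles (insurance figure above)
                waymo_million_miles = 6.1   # Waymo's Arizona driving, in millions of miles

                expected_injuries = human_injury_rate * waymo_million_miles
                print(f"Expected injuries at human rates: {expected_injuries:.1f}")
                # roughly 7.6 expected injuries, versus zero serious injuries reported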

            The data (at least from reputable companies like Waymo) is absolutely available and in excruciating detail. Go look it up.

      • anlumo@feddit.de

        As a software developer, I can tell you that’s not how testing works. QA always tries to come up with weird edge cases to test, but once the software is out in the wild with thousands (or more) of real-world users, there’s always going to be something nobody ever tried to test.

        For example, there was a crash where an unmarked truck, exactly the same color as the sky, was sitting 90° sideways across the highway. That’s just something you wouldn’t think of under lab conditions.

          • abhibeckert@beehaw.org

            And a thing blocking the road isn’t exactly unforeseen either.

            Tesla’s system intentionally assumes “a thing blocking the road” is a sensor error.

            They have said that if they didn’t do that, about every hour or so you’d drive past a building and the car would slam on the brakes and stop in the middle of the road for no reason (and then, probably, a car would crash into you from behind).

            The good sensors used by companies like Waymo don’t have that problem. They are very accurate.