What an odd thing to say…

  • qarbone@lemmy.world · 18 points · 19 hours ago

    This puts a spin on the article (which, admittedly, could have its own spin) that smells disingenuous.

    She wasn’t saying “yeah, those bozos will be fine when our shoddy bots run down grannies on the crosswalk” in a mask-off moment. The article was saying Waymo expects someone will be fatally struck by one of their vehicles eventually, but that society will have accepted (Waymo’s) driverless cars enough by then that it won’t break the company. “They’ll see Waymo is so much safer than normal drivers even if it still does cause some accidents.” type shit.

    It’s still wishful corpo-speak but there’s no reason to mislead.

    Edit: I understand that it is the headline of the article itself but we should do better than regurgitating and echoing clickbait titles.

    • kennedy@lemmy.dbzer0.com · 4 points · 13 hours ago

      Mainstream “journalism” is about rage-baiting for engagement. Any time an article has an inflammatory title about what someone said, 95% of the time they’re being misquoted. In these hundreds of comments, yours is the only one I’ve seen mentioning that. No one questions anything anymore; if it’s about something they don’t like, then it must be true. Even though the Futurism article directly links the article it’s talking about, with the full quote and context of what the CEO was saying. I’m not a fan of Waymo (and certainly not Google’s evil ways), but facts seem to be a distant, ancient theory these days. Pitchforks first, think later.

      Idk if the author chose that title or if it was Futurism itself, but a more accurate description would have been something like “our cars are safe, but we are also prepared/preparing for when something bad happens”. That doesn’t get clicks tho.

  • yogurt@lemmy.world · 33 points · 1 day ago

    Instead of running a red light or hitting a pole, self-driving cars drive full speed under a trailer and decapitate everybody, or someone falls against the car, it detects an accident, decides to pull over, and slooowly runs over the person and drags them down the street, ignoring all the screaming. The kind of accidents society is desensitized to are the ones they taught the car how to avoid; the fucked-up shit where somebody gets hydraulically pressed to death in slow motion while 15 people film it on their phones is what Waymo is going to do.

    • veni_vedi_veni@lemmy.world · 9 points · 1 day ago

      At least with the running-over-a-pedestrian scenario, I would think the passengers have a manual way to interrupt the program logic and stop.

      Also, you’d best believe truck decapitations happened a lot without self-driving, enough to mandate that trailers have those underride guards below their unloading doors.

        • LifeInMultipleChoice@lemmy.world · 9 points · 1 day ago

          Yeah, they had to change things. The person was hit by a human driver, flung into the self-driving car’s path, and the human driver drove off. The self-driving car didn’t know what to do and basically dragged the body to the side of the road. None of these incidents involved a Waymo vehicle, though. Waymo has had to shoulder the shit that Tesla and other companies have put out, with GM, as you said, making that “mistake”.

      • Nindelofocho@lemmy.world · 8 points · 1 day ago

        I really don’t know why there aren’t big E-stop buttons like on every other large piece of equipment that can severely harm you.

        • AxExRx@lemmy.world · 6 points · 1 day ago

          I’m assuming they wanted to avoid having people get hit from behind when stopped in the middle of the road, hence the whole auto-pull-over thing.

          But yeah, they should still have a kill switch. Maybe make it activate the slow-and-pull-over protocol above a certain speed, or a dead stop if operating at a slow speed?
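
          A minimal sketch of that two-mode kill-switch idea, with an assumed speed threshold and made-up function and maneuver names; none of this reflects Waymo’s actual software:

          ```python
          # Hypothetical sketch of a passenger-facing kill switch, as described above:
          # below a low-speed threshold, stop dead; above it, hand off to the existing
          # slow-down-and-pull-over routine. All names and numbers are invented.

          LOW_SPEED_THRESHOLD_MPS = 4.0  # ~14 km/h; assumed cutoff for an immediate stop

          def kill_switch_maneuver(current_speed_mps: float) -> str:
              """Return which maneuver the kill switch should request at this speed."""
              if current_speed_mps <= LOW_SPEED_THRESHOLD_MPS:
                  # Slow enough that stopping in place is safer than continuing to move.
                  return "emergency_stop"
              # At higher speeds a dead stop in a travel lane risks a rear-end collision,
              # so fall back to the pull-over protocol instead.
              return "slow_and_pull_over"

          if __name__ == "__main__":
              for speed in (2.0, 15.0):
                  print(f"{speed:>5.1f} m/s -> {kill_switch_maneuver(speed)}")
          ```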

    • Tollana1234567@lemmy.today · 5 points · 21 hours ago

      Developing the fast rail system, at least in California, was blocked by Musk and the GOP (Elaine Chao, Mitch McConnell’s wife, in Trump’s first term). Cali never tried again.

    • skuzz@discuss.tchncs.de · 2 points · 21 hours ago

      We will make the most complex, convoluted contrivances before laying down steel and locomotives. That’s the part I always liked about the I, Robot movie: no, they didn’t have public transport; everyone just had self-driving cars on roads controlled by a centralized AI.

  • verdi@feddit.org · 29 points, 1 down · 1 day ago

    I think society is ready and eager for CEOs to be hunted like animals, as the UnitedHealthcare case showed.

  • mech@feddit.org · 195 points, 1 down · 2 days ago

    one passed a stopped school bus that was unloading kids in Atlanta. That’s a violation that normally garners a $1,000 fine and a court hearing, but nothing was issued to the company.

    “These cars don’t have a driver, so we’re really going to have to rethink who’s responsible,” said Georgia state Representative Clint Crowe to Atlanta news station, KGW8.

    No? The company has a mailing address. Send them the notice and court summons, just like you would for the owner of a regular vehicle.

    • Miles O’Brien@startrek.website · 95 points · 2 days ago

      When it’s time for money: COMPANIES ARE PEOPLE TOO!

      When it’s time for punishment: but you can’t hold a company responsible, it’s not just one person.

    • Tinidril@midwest.social · 17 points, 1 down · 2 days ago

      Probably a waste of time until you review how the law was written. Odds are it just doesn’t apply. It’s a job for lawmakers at this point, not a judge.

      Now, if it hits a kid before the law gets written, a judge would preside over a civil case. There might even be a civil case against the legislature, depending on how that works in the jurisdiction.

      • mech@feddit.org · 23 points, 1 down · 2 days ago

        I can only speak for German law: when a car breaking a traffic law is identified (by number plate), the registered owner gets sent a letter notifying them and ordering them to identify the driver.
        If the owner can’t or won’t name the driver, the owner has to pay the fine. The law assumes that if you let someone drive your car, you must know who it was, or at least be able to help the feds with their investigation. Or the car was stolen, in which case it was on you to report the theft immediately.
        It does get trickier in a criminal case, because in Waymo’s case it’s difficult to determine who is personally responsible. This is where new laws are required. One possibility would be to look to the data privacy laws: there, every affected company has to appoint someone responsible for data privacy. In case of a violation, that person is personally responsible and can be punished, including with prison time, if they haven’t done their due diligence.

        So for self-driving cars, every company would need to have a “traffic safety director” who is legally required to be in the loop for all decisions regarding traffic safety, has to report any legal violations to their superiors and the public, and is personally responsible for ongoing gross violations. (It’s a very well-paying job.)

      • mech@feddit.org · 25 points · 2 days ago

        Per infraction. That’ll put a cost on violating traffic laws and incentivize them to fix their software in order to cut costs.
        And if you can prove intent (they were aware of a dangerous bug but chose not to fix it), then ground the fleet until it’s fixed and/or punish whoever’s ultimately responsible, personally.

        • Miles O’Brien@startrek.website · 8 points · 2 days ago

          I propose taking that $1k, measuring it against the average income of anyone who makes under $1M, and using that percentage to fine the company appropriately.

          Example: a $1k fine for someone who makes $10k/yr is 10% of their yearly income, whereas for a company that makes $10,000,000,000/yr, it’s only 0.00001%.
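
          A minimal sketch of that scaling rule, using only the figures from the example above (the function name is made up):

          ```python
          # Income-proportional fine, per the example above: find what share of a
          # person's yearly income a $1,000 fine represents, then charge the company
          # that same share of its yearly revenue. Figures are the ones quoted above.

          def proportional_fine(flat_fine: float, person_income: float, company_revenue: float) -> float:
              """Scale a flat fine by the burden it puts on an ordinary person."""
              burden = flat_fine / person_income   # 1,000 / 10,000 = 10%
              return burden * company_revenue

          if __name__ == "__main__":
              fine = proportional_fine(1_000, 10_000, 10_000_000_000)
              print(f"${fine:,.0f}")  # 10% of $10B -> $1,000,000,000
          ```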

          • mech@feddit.org · 6 points · 2 days ago

            It would make sense to scale it to what one car makes the company, since you’re fining them for a violation done by one car.
            With your suggestion, it would be a lot easier and cheaper for the state to simply ban Waymo, since that would be the result.

            • railway692@piefed.zip · 1 point · 14 hours ago

              No, keep it scaled to the company.

              I’m tired of state leadership taking bribes from big businesses instead of fines and taxes.

              Too many have been taking the “easier and cheaper” route for too long, and that’s a big part of why we’re in the capitalist hellscape we’re currently in.

          • Tollana1234567@lemmy.today · 1 point · 21 hours ago

            “Fleet” also refers to ships and, weirdly enough, cars. I would call it an armada if suddenly hundreds show up in one place.

        • Cocodapuf@lemmy.world · 6 points · 2 days ago

          Sure, but if “just the cost of doing business” becomes their official policy on this variety of traffic incident, they could end up paying $1,000 a dozen times a day. That adds up pretty quickly.

          • chaosCruiser@futurology.today · 3 points · 2 days ago

          Weigh that against how much it costs to develop, test, and deploy a fix. If you get fines like that 10 times every day, you could have spent all that money on developer wages and the problem would have been fixed in a month or two. If it’s only one ticket a month, it’s cheaper to leave it as it is.
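
            A rough back-of-the-envelope version of that trade-off; the fine amount and frequencies come from the thread, while the monthly developer cost is an assumed placeholder:

            ```python
            # Compare the monthly bill from per-infraction fines against an assumed
            # monthly cost of paying developers to fix the underlying bug.

            FINE_PER_INFRACTION = 1_000           # dollars, from the thread's example
            ASSUMED_FIX_COST_PER_MONTH = 200_000  # dollars; placeholder for a small dev team

            def monthly_fine_bill(fines_per_day: float) -> float:
                """Total fines paid over a 30-day month at the given daily rate."""
                return fines_per_day * FINE_PER_INFRACTION * 30

            if __name__ == "__main__":
                for rate in (10.0, 1 / 30):  # ten fines a day vs. one ticket a month
                    bill = monthly_fine_bill(rate)
                    verdict = ("cheaper to fix the software"
                               if bill > ASSUMED_FIX_COST_PER_MONTH
                               else "cheaper to keep paying fines")
                    print(f"{rate:6.2f} fines/day -> ${bill:>9,.0f}/month in fines -> {verdict}")
            ```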

        • P03 Locke@lemmy.dbzer0.com · 2 points · 1 day ago

          Funny how most laws are set up to punish the poor, by setting static monetary fines that rich people and corporations can scoff at.

        • webghost0101@sopuli.xyz · 3 points, 1 down · 2 days ago

          If they are truly so much safer, one could say they are held to much higher standards for driving safety.

          If they are held to much higher standards, then when they do fail, it’s reasonable for the fine to be multiplied a few times.

          There are some current examples where higher commercial standards lead to bigger penalties.

          Bar owners can be criminally charged for over-serving alcohol to drunk clients. Citizen hosts don’t face that same legal responsibility.

          Similar with financial advisors vs. your crypto uncle.

          • mech@feddit.org · 1 point · 2 days ago

            If they are held to much higher standards, then when they do fail, it’s reasonable for the fine to be multiplied a few times.

            I was only fined once in my life for speeding (going 5 km/h over the limit on a downhill), since I always respect the speed limit.
            Would it make sense to multiply my fine by the average number of violations other people commit in their lives?

            • webghost0101@sopuli.xyz · 2 points · 1 day ago

              That’s not what I said.

              Neither do you, as a private citizen, qualify to be held to a higher standard like the real-world examples I gave.

  • pedz@lemmy.ca · 67 points, 5 down · 2 days ago

    Not that odd. Deaths by car are easily accepted by society. They are “accidents” and a “necessary evil” for society to function.

    There’s around a million people dying from cars every year and we just shrug and normalize them. Human or not, we just have to have cars and “accidents” are just that.

    According to the World Health Organization (WHO), road traffic injuries caused an estimated 1.35 million deaths worldwide in 2016. That works out to roughly one person killed every 23 seconds on average (1.35 million deaths spread over the ~31.5 million seconds in a year).

    Nobody cares about cars killing people and animals. So she’s probably right.

    • AwesomeLowlander@sh.itjust.works · 25 points, 4 down · 2 days ago

      More so when you take her actual statement in context: that they’re actually reducing deaths by being safer. The comments on Lemmy are turning out to be just as biased and ungrounded in reality as they were on Reddit.

      Waymo robotaxis are so safe that, according to the company’s data, its driverless vehicles are involved in 91 percent fewer crashes compared to human-operated vehicles.

      And yet the company is bracing for the first time when a Waymo does kill somebody — a moment its CEO says society will accept, in exchange for access to its relatively safer driverless cars.

      • pedz@lemmy.ca · 16 points · 2 days ago

        However, I’m pretty sure that a standard transit system, not made up of single cars that can only transport one or two people at a time and spy on them, is also much safer.

      • P03 Locke@lemmy.dbzer0.com · 3 points, 4 down · 1 day ago

        Waymo robotaxis are so safe that, according to the company’s data, its driverless vehicles are involved in 91 percent fewer crashes compared to human-operated vehicles.

        Wow, you think the “company’s data” is a trustworthy source? Where are your critical thinking skills?

          • P03 Locke@lemmy.dbzer0.com · 7 points, 3 down · 1 day ago

            If the data is falsified that’d be illegal.

            Oh no! It would be illegal!

            And what would be the punishment if it was found out that they falsified their data? A fine that could amount to hundreds of thousands of dollars? On top of their tens of millions of dollars of profits?

            Do you have a reason to think otherwise?

            Yes, they are directly incentivized to either push their data in a biased direction or outright falsify their numbers, in order to facilitate the marketing strategy of these taxis being a “safe” technology, and increase their profit margin.

            Fuck… have we learned nothing from the tobacco industry?!

    • P03 Locke@lemmy.dbzer0.com · 10 points · 1 day ago

      There’s around a million people dying from cars every year and we just shrug and normalize them. Human or not, we just have to have cars and “accidents” are just that.

      The difference is accountability. If a human kills another human because of a car accident, they are liable, even criminally liable, given the right circumstances. If a driverless car kills another human because of a car accident, you’re presented with a lose-lose scenario, depending on the legal implementation:

      1. If the car manufacturer says that somebody must be behind the wheel, even though the car is doing all of the driving, the person is suddenly liable for the accident. They are expected to just sit there and watch for a potential accident, but what the AI model will do is undefined. Is the model going to stop in front of that pedestrian as expected? How long do they wait to see before they take back control? It’s not like cruise control, a feature that only controls part of the car, where they know exactly how it behaves and when to take back control. It’s the equivalent of asking a person to watch a panel with a single red light for an hour, and push a button as fast as possible when it blinks for a half-second.

      2. If the model is truly driverless (like these taxis), then NOBODY is liable for the accident. The company behind it might get sued, or might end up in a class-action lawsuit, but there is no criminal liability, and none of these lawsuits will result in enough financial impact to facilitate change. The companies have no incentive to fix their software, and will continue to parrot this shitty line about how it’s somehow better than humans at driving, despite these easily hackable scenarios and zero accountability.

      Humans have an incentive to not kill people, since nobody wants to have that on their conscience, and nobody wants to go to prison over it.

      Corporations don’t. In fact, they have an incentive to kill people over profits, if the choice presents itself!

    • Buffalox@lemmy.world · 9 points · 1 day ago

      Nobody cares about cars killing people and animals.

      I think that’s overstating it a bit. Of course many care, and we have people who are responsible for setting safety standards.
      Just because accidents are unavoidable doesn’t mean we aren’t trying to minimize them and avoid fatalities.

      Mandatory seat belts are an example of this. Beyond that, there are actual scientific studies into road safety, and even city-wide implementations of their findings. At least in Europe there are, but I’m guessing the USA has them too.

      Just because traffic accidents happen, and we obviously need “traffic” to be able to move around, doesn’t mean nobody cares.

      As an anecdotal example, here in Denmark the speed limit was increased from 110 to 130 km/h on our equivalent of the Autobahn, which may seem like accepting more accidents for convenience or efficiency. But in reality it was to divert more traffic to the safer “Autobahn” and actually reduce the number of accidents on smaller roads.

      Traffic safety is as much about psychology as it is about making safer systems.

      PPS:
      Regarding the animals: we have just had warnings about deer here, and some places have small tunnels made for frogs.
      And there are warning signs where deer tend to cross in almost any country that has them.

    • But_my_mom_says_im_cool@lemmy.world · 7 points, 2 down · 2 days ago

      Self-driving cars will have far fewer accidents and deaths than human-driven cars. But the idea of being killed by human error is acceptable to us, while the idea of a machine fucking up and killing us is terrifying, even if one self-driving accident will create algorithms to avoid that same incident on all cars, whereas human error can happen over and over in the same situation.