At some point we have to accept vehicular deaths given how car-centric our society is and how distracted and unsafe a lot of drivers have become.
Normal taxi drivers kill people.
Normal truck drivers kill people.
Normal home-to-work drivers kill people.
If a robotic taxi can lower the taxi category of accidents by 91% across the board, including death rates, then that’s a positive improvement to society any way you slice it. Not saying it isn’t a horrifying dystopian world we’re potentially building, but at the moment, given the numbers, it would be 91% safer in that category.
The ultimate solution is to shift towards more public transit options in general, and away from individual vehicular transport. Not only is it a massive burden to the environment, but it’s a massive cost burden to the individuals and society as a whole.
Fine, but if I kill someone with a car, I would expect consequences. If a driver’s car kills somebody, the driver is held responsible. They pay, and if they are at fault, it is a criminal matter.
I suspect this CEO isn’t saying ‘the public is ready for our cars to kill someone, and we don’t suffer any consequences, because they seem to be cool with some of them being killed so that we can profit.’
Mind you, look at how many people Musk’s cars are killing (including quite a few burned alive) and he suffers no consequences so fuck it. Just keep your capitalist, corporatist, and car culture over there, thanks.
If a robotic taxi can lower the taxi category of accidents by 91% across the board, including death rates, then that’s a positive improvement to society any way you slice it. Not saying it isn’t a horrifying dystopian world we’re potentially building, but at the moment, given the numbers, it would be 91% safer in that category.
You need to prove this number. Looking at the behavior of current driverless cars, the software is still shit, and nothing has reached Level 5 Autonomous Driving. There are too many edge cases, and conflicting behavior points. Navigating a world of humans driving in different ways with complex urban and rural streets is a very very messy affair.
Hell, nobody in the space can even answer this simple question correctly: If the speed limit is 55 MPH on the highway, and everybody is going 65 MPH, and we know that the delta of speed is what kills people in highway car accidents, what speed does the driverless car use?
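To make that dilemma concrete, here is a toy sketch of one possible policy — purely illustrative, not anything any vendor actually ships: track the prevailing flow of traffic (to minimize the speed delta) but cap how far over the posted limit the car will go. The function name and the 10 mph tolerance are made up for illustration.

```python
def target_speed(posted_limit_mph, traffic_speeds_mph, max_over_mph=10):
    """Toy policy: match the prevailing flow to minimize the speed delta,
    but cap how far above the posted limit the car is willing to go."""
    # Median of observed surrounding traffic speeds.
    prevailing = sorted(traffic_speeds_mph)[len(traffic_speeds_mph) // 2]
    return min(prevailing, posted_limit_mph + max_over_mph)

# 55 mph limit, surrounding traffic running about 65 mph:
print(target_speed(55, [63, 65, 64, 66, 65]))  # 65
```

Any real answer also has to handle the legal exposure of programming a car to exceed the posted limit, which is exactly why nobody in the space wants to answer on the record.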
i think people are much worse drivers than you think they are… you just hear about every self driving accident because it’s newsworthy right now
apparently
Self-driving cars are more than twice as likely to be involved in an accident compared to human-driven cars, but some studies suggest they are considerably less injurious (and fatal) than human-operated vehicle crashes.
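As a back-of-envelope check on how those two findings interact, multiply crash frequency by injury severity. All figures below are hypothetical, chosen only to show that a doubled crash rate can still net out to fewer injuries if the crashes are much less severe.

```python
def injuries_per_million_miles(crashes_per_million_miles, injuries_per_crash):
    # Expected injuries = crash frequency x average injuries per crash.
    return crashes_per_million_miles * injuries_per_crash

# Hypothetical figures, for illustration only:
human_driven = injuries_per_million_miles(4.0, 0.25)  # 1.0 expected injuries
self_driving = injuries_per_million_miles(8.0, 0.05)  # 0.4 expected injuries
print(human_driven > self_driving)  # True: twice the crashes, fewer injuries
```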
If a robotic taxi can lower the taxi category of accidents by 91% across the board, including death rates, then that’s a positive improvement to society any way you slice it.
The “if” in this sentence is a load-bearing word.
With today’s crew running the policy, I don’t think anyone will prevent corporations from unleashing completely unsafe robotic taxis on the public that’ll perform far worse than regular ones. I really wish people would stop making this argument to the corporations’ benefit until we have some data backing it up.
I get that there’s a theoretical possibility that still imperfect robotic taxis could outperform humans, but that’s just theoretical.
With the way corporate accountability is handled (i.e., corporations aren’t held accountable) nowadays, I just don’t see robotic taxis as much more than an accountability sink and at this point I’d prefer taking regular taxis because at least there is someone to fucking hold accountable when things go wrong.
Except there’s a difference between a machine killing somebody because it was programmed to and a person killing somebody by accident. One of those things has people making decisions who are not going to be held responsible.
The other problem is that it creates openings for malicious actors: if your government (or even Saudi Arabia, or Israel), for instance, wanted to kill a political dissident, they could add a self-erasing line of code to a car to run over a specific person.
This is why self-driving laws need to be explicit about how they’re approaching this; otherwise you’re inviting in a lot of suspicious behavior by amoral companies. There need to be safeguards on who has access to self-driving code and how.
I would say that the source code for any self-driving or autonomous machine on a public street should be held by insurance companies or a third party that performs regular validation checks on vehicle code (which could be read and validated at charging stations or gas stations), and it should only be edited by publicly licensed software engineers whose licenses can be revoked for bad behavior.
Anything less is inviting a series of predictable public safety fiascos.
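That third-party validation idea could work like a simple integrity attestation: the checker holds digests of approved builds, each tied to a licensed engineer, and a vehicle passes only if its installed image matches one. A minimal sketch; everything below (registry contents, names, the build string) is hypothetical.

```python
import hashlib
from typing import Optional

# Hypothetical registry held by the insurer or third-party validator:
# digest of each approved build -> license ID of the engineer who signed off.
APPROVED_BUILDS = {
    hashlib.sha256(b"driving-stack v4.2").hexdigest(): "engineer-license-1138",
}

def validate_vehicle_code(installed_image: bytes) -> Optional[str]:
    """Return the responsible engineer's license ID if the installed image
    matches an approved build; return None if the vehicle fails the check."""
    digest = hashlib.sha256(installed_image).hexdigest()
    return APPROVED_BUILDS.get(digest)

print(validate_vehicle_code(b"driving-stack v4.2"))            # engineer-license-1138
print(validate_vehicle_code(b"driving-stack v4.2, tampered"))  # None
```

A digest check like this catches tampering after the fact, including the self-erasing-code scenario, because any change to the installed image changes the hash; it doesn’t by itself verify that the approved build is safe, which is what the licensing-and-review half of the proposal is for.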
Fine, but if I kill someone with a car I would expect consequences. If their car kills somebody, they are held responsible. They pay, and if they are at fault, it is a criminal matter.
I suspect this CEO isn’t saying ‘the public is ready for our cars to kill someone, and we don’t suffer any consequences, because they seem to be cool with some of them being killed so that we can profit.’
Mind you, look at how many people Musk’s cars are killing (including quite a few burned alive) and he suffers no consequences so fuck it. Just keep your capitalist, corporatist, and car culture over there, thanks.
I agree, the consequences should be severe.
With that said, airlines kill people, and all it chiefly results in is a fine disbursed to the families.
Yeah, and I remember personally calling out how outrageous Boeing’s policy of outsourcing at least some of the 737 Max’s software development to $9-an-hour engineers was, given the obvious and unacceptable risk it carried: https://www.industryweek.com/supply-chain/article/22027840/boeings-737-max-software-outsourced-to-9-an-hour-engineers
A couple of executives resigned - no doubt with very nice golden parachutes. I guess it might go to court eventually. What a shitshow.
You need to prove this number. Looking at the behavior of current driverless cars, the software is still shit, and nothing has reached Level 5 Autonomous Driving. There are too many edge cases, and conflicting behavior points. Navigating a world of humans driving in different ways with complex urban and rural streets is a very very messy affair.
Hell, nobody in the space can even answer this simple question correctly: If the speed limit is 55 MPH on the highway, and everybody is going 65 MPH, and we know that the delta of speed is what kills people in highway car accidents, what speed does the driverless car use?
(Hint: the correct answer is not 55.)
i think people are much worse drivers than you think they are… you just hear about every self driving accident because it’s newsworthy right now
apparently
https://financebuzz.com/self-driving-car-statistics-2025
not a primary source, but their data seems to be from the NHTSA
yeah… very much a public health attitude
Watch “Upload” on Prime. It’s literally about this.
The “if” in this sentence is a load-bearing word.
With today’s crew running the policy, I don’t think anyone will prevent corporations from unleashing completely unsafe robotic taxis on the public that’ll perform far worse than regular ones. I really wish people would stop making this argument to the corporations’ benefit until we have some data backing it up.
I get that there’s a theoretical possibility that still imperfect robotic taxis could outperform humans, but that’s just theoretical.
With the way corporate accountability is handled (i.e., corporations aren’t held accountable) nowadays, I just don’t see robotic taxis as much more than an accountability sink and at this point I’d prefer taking regular taxis because at least there is someone to fucking hold accountable when things go wrong.
also, who’s getting injured the most? pedestrians, or occupants?
if the net rate of injuries increases among a vulnerable group, that is not okay