
I often see logic along the lines of "if autonomous vehicles are safer than people, then let's deploy them" and that logic straight-up does not fly in the real world. In the real world, among the techno-pessimists, that is an outrageously low bar. Most accidents and deaths caused by human drivers occur because the driver was doing something illegal, which means the bar we hold even humans to is higher than "safer than the average driver" -- like, we wouldn't allow someone to be an Uber or Lyft or truck driver if they were candid and said "I'm going to text on my phone, and drive drunk and sleepy and distracted as often as the average motorist".

Also, I feel like there's a lot of talking past one another in these conversations, because one person will say "Let's see an autonomous truck shipping hazmat to Pittsburgh in February with freeway lanes shut down" and another person might say "that's a rare instance", but I really don't feel like society will accept anything other than trucks / vehicles that are able to operate under all conditions, with greater safety than the safest human driver. We tolerate human failures, but to use them as the benchmark for autonomous systems would be perceived as unethical: autonomous systems are deliberately designed, so any failures would be seen as intentional oversights and errors, and no one at Waymo or Tesla or wherever is ever going to be charged with vehicular manslaughter for an autonomous vehicle error. We'd demand a way higher standard because these companies don't really have any skin in the game, except for financial penalties, which we now understand are not a deterrent for anything. My observations are only moderately related, but I'm anticipating the same well-trod talking points coming up and want to address them.




I agree that human-equivalent safety is the minimum bar. E.g., we can all agree that if your system is below human safety, it is definitely unacceptable.

That’s not the argument being presented though. For example Waymo claims to exceed human performance by a large margin: https://waymo.com/blog/2023/12/waymo-significantly-outperfor...

(Again, one may disagree about the methodology or the conclusions of the study. Just want to point out it’s not the argument being presented.)


But the issue is that human safety is long-tailed. E.g., more than a third of all fatal crashes involve DUI. I'm guessing that if you take out high-risk behaviours, the rate drops by an order of magnitude. What we care about is not being better than the average driver (an average dragged down by high-risk individuals), but better than the median driver.


I dunno. If you got all the DUI drivers to automatically kick over to self-driving, then a ton of the most dangerous drivers would be much less dangerous.


I think better automated collision avoidance, as well as automated "you're driving erratically, take a rest at a safe ___location, we can help get you there", are clear wins. But forced autopilot is definitely not there yet and would require a lot of improvement over the average driver (because I don't want to increase my personal risk in order to decrease the risk from high-risk individuals).


Yes, but this misses the point: autopilot being better than a drunk driver but worse than the median driver isn't enough unless people only use it when they were going to drive drunk, which is kind of a political nonstarter. Otherwise the overwhelmingly sober usage of the feature makes it a net loss in driver quality weighted by miles driven.


The problem is that it is better than average human performance but it needs to be better than specific human performance for any human to be incentivized to switch.

For now, what you could do is demand that anybody with a DUI or similar item on their record only be allowed in vehicles with self-driving, if the manufacturer is willing to assume liability. And if that doesn't happen, then they might as well take a regular cab.


Yeah, I should clarify that my response is in anticipation of the comment sections I often see on Hacker News about self-driving cars, rather than arguments from the self-driving car companies themselves. Waymo won't say "we're slightly better than people, let us on your roadways", but I feel like every time I see something about self-driving on Hacker News there's a handful of commenters taking the hyper-utilitarian viewpoint of "they're better than people, we all need to let the robots do the driving for us", which will never convince anyone outside these comment sections.


Also, isn't Waymo only operating in Phoenix and California, where the weather is always nice?

What happens when it snows?


They don't operate in snow now, but everybody was like "This is great, but it doesn't work in the rain!" this time last year. I've now taken many flawless rides in outright downpours in SF. It might take a year or two, but I have no doubt snow will be a non-issue in the near future.


Sensing tech has to improve to get through the noise. As far as I'm aware, you can't see a yellow line or white line on a highway if it's covered in dirty snow.

We could also rework our roadways to include better sensing design and tech (passive or active!), but we are a ways out still from willingness to pay for that.


Do what humans do and drive without lines while trying to stay to one side of the road. This doesn't seem like a major problem for self-driving cars in snow. More like: how do you deal with other drivers who are RWD on summer tires and fishtailing all over the place?


If the sensors are up for it... Mountain roads covered in snow pack are a white blur, especially in certain light. The best bet for actually knowing where the road is might be previous tracks. I've dipped a wheel in a ditch more than once, not from carelessness, but from a snow drift obscuring the road... a road that I drive 500 times a year.


To clarify, I'm not saying it's impossible to have a functional vehicle in dangerous conditions, simply that it's an area loaded with edge cases.


I don’t drive at all when it snows as a human. Our whole city basically shuts down then, and we are pretty far north as far as cities go.

Even in places that expect snow, cars are moving much more slowly and cautiously than normal. It almost seems like self driving cars would do much better in those situations given the speeds and caution involved.


> I don’t drive at all when it snows as a human.

That's the smart move. I've been quite astounded at family members without a driving license pushing family members with driving licenses to drive when they thought it wasn't safe. If it snows and it isn't 911 level urgent then simply stay put.


What is the point you're trying to make? Waymo isn't making claims about their driving performance in snow. And they're not comparing their performance to a basket of human performance across all weather conditions; if you read the linked blog post, they're specifically comparing to human driving performance in the regions of California and Arizona they're testing in.


Honestly, I never want a self-driving vehicle of any sort. I like driving, and if my going on about 8 years since even a speeding ticket is anything to go by, I'm pretty good at it. Driving is genuinely a leisure activity for me (only complicated by other, shit drivers) both in the usual way, and in the track-day way. So you could say I'm a car guy for sure.

I am all for autonomous vehicles for other people. And you can call that elitist and I frankly couldn't even argue with you, but god damn, so many people have utterly no business being in control of a car. I travel quite a bit both for work and leisure, almost always by motorway, and some of the shit I see is just mind-bending. So much inattentive driving, so much pointless risk-taking, as they say it's time for Wisconsin's favorite game: Why's that car being driven badly? Old, stupid, drunk, or all three?

And like, there's not really an answer for a lot of this. As much as I love them, cars suck on a society scale. They're basically a poor tax as even the most low-income people in my neck of the woods must have them to get around, there's a bus network but it's shit and it has very low ridership, and that's all my city has apart from taxis. And, old people use them for the same reason, even if they're damn well aware that they shouldn't really be driving anymore, if they don't have someone to get their groceries n such, what are they supposed to do?

So yeah, self-driving cars are awesome. In theory. But then I see video of someone using the lane-guidance on a Tesla where the car just sees... who knows, something, and suddenly jerks to the right, right at pedestrians. Or we get the stories of autonomous cars just shutting down and refusing to move, even for emergency vehicles. Still others, they end up parked on someone's legs, and the stupid support system can't allow a remote operator to move the vehicle off said leg.

At this point, as much as I hope we get it, I also kind of recognize that truly autonomous cars simply might not be viable yet. I think a much better idea is to just de-carify our societies. I don't want them gone entirely, but people need more options than JUST buying or leasing these enormous, polluting, dangerous machines that they do not want to learn to operate well. And like, I don't even judge people for that necessarily? I love them, to be clear, but that's something that's true for me and not necessarily everyone else, and if you don't care about something, you're not going to put your best effort into it. It becomes a chore. A necessary step to doing what you actually want to do, and as a result, people don't try and they just suck at it.

Edit: I'd much prefer and think it's more practical to create cities where cars are an option, not a requirement. If you WANT one, you're welcome to have one. And maybe it drives itself, or maybe you drive it. With those options being on the table, we can then raise the bar for driving tests which we badly need to do, there needs to be a higher skill floor for driving and more stringent requirements, especially as people age. Maybe once you get to a certain point in your life, as most of us probably will, you just can't have a manual car anymore. Then you can buy one that drives itself, or just join people on a robust public transit system.

This doesn't need to be hard, but it's made hard because tons of entities involved have a financial stake in keeping us dependent on cars. I think that's why self-driving cars are getting so much traction and funding, because then they get to sell you a new, more expensive car, instead of actually solving the problems of mass transit. And that just sucks.


> but I really don't feel like society will accept anything other than trucks / vehicles that are able to operate under all conditions

Trucks are different because they're just for transportation of goods. As challenging weather arrives they can just wait at the nearest rest stop, pull over on the shoulder, etc.

Your shipping will be delayed, but outside of that I don't think society will care. Totally acceptable as long as weather delays only last as long as a storm does.

That's different from rideshare where people do expect to be able to call a Waymo even if it's snowing lightly.


Society is literally willing to let truck drivers die to maintain shipping times. See the responses during the height of COVID-19, or even just the demands placed on every day operators cross-referenced with their average health and mortality rates. Our treatment of truck drivers is already subhuman, I don’t think we’re likely to extend a lot of understanding to the robots.


That's a bit of hyperbole. Lots of jobs take their toll on your health. That's nothing special about truck drivers, and it doesn't mean we "let them die".

And you're ignoring the fact that not requiring a driver is significantly cheaper. And when it's speed versus cost, cost generally wins. So yes, society is absolutely going to be fine with trucks pausing in bad weather, because it means cheaper shipping.

(And stuff that is urgent and requires next-day delivery is sent across the country by plane anyways, not by truck.)


On truckers dying on the job: you'd probably agree that there are arrangements that could save more truckers' lives but are not applied, because of cost and lack of social pressure. For instance, driver fatigue is said to be the primary cause of death among truckers, and that's obviously an issue we could throw money at to at least alleviate (but it wouldn't always be cheaper; that's the point).

And to your point, yes, that same logic would apply to a ton of high-risk occupations. There are building construction techniques that cost more but carry less risk, but we won't apply them where there's no pressure to do so.

Are we letting them die, then? I don't see the hyperbole in saying yes; as a society we're not willing to put up the money to reduce these deaths. I see it as a part of life, and I don't think we'll change much even as autonomous vehicles come (if they kill 1 in 10,000 people but cost way less than now, we'll swallow the pill collectively).


Driver fatigue is taken way more seriously than it used to be and measures and laws have been put in place to minimize it by forcing maximum driving and minimum resting periods.

So, ignoring the fact that society doesn't just get together every year and decide to let people die, I'd say it's absolute cynical hyperbole to say society is "letting truckers die".


"Taken way more seriously" is a step on the spectrum. Where did we set the bar? At the point where the cost versus the number of deaths kinda makes sense.

I think it's poisonous to ignore that tradeoff and wash our hands of the whole issue. It's like eating meat while ignoring where it came from. That doesn't mean we all take extreme positions and become vegetarians, but it's not something we can completely ignore by calling it hyperbole or cynicism to recognize as a fact.


It's hyperbole and cynical to say we are "letting truckers die" and also that we "ignore that tradeoff and wash our hands of the whole issue".

Neither is true in any useful, productive sense. It's pure cynicism, and myopic.


For the same reason, we'll likely be absolutely fine with them driving a bit slower as well once there's no driver to pay by the hour - that'll save money not just on the automation but on fuel cost as well.


That might be more a result of capitalist motivations, i.e., funnelling money to the shareholders / owners and C-suite and away from workers. Which is much more direct than someone buying Legos or a stove.

We can either legislate that or they need an effective union.


"That might be more a result of capitalist motivations"

People wanting to get more stuff faster is a universal motivation not limited (in theory or in practice) to capitalism. To the extent that anyone is exploited, that's just selfishness, again a universal motivation.


Can you provide a few examples of people expressing a general desire for "more stuff faster" before 1500 and outside Western Europe?


This is a bizarre framing to support the argument that people don't like faster delivery of goods.

But in the spirit of good faith I will give some:

- Movement of fish/meat to markets before it spoils

- Delivery of military messages around the battlefield. Also supply trains for military expeditions

- Projects like the Grand Canal in China (https://en.wikipedia.org/wiki/Grand_Canal_(China)). Pretty much anywhere in history where people opted for using water transport of goods over land transport was done for speed/cost reasons.


I mean the copper ingot guy was sort of complaining that he hadn’t gotten enough (good) copper ingots and that his couriers had had to make multiple trips and had come back empty handed each time. This isn’t exactly “more stuff faster,” but it is pretty close.



The free (gratis) supply of bread, as a daily life-preserving staple, to a portion of the population doesn't appear to match the spirit of "more stuff faster".

Surely basic food supply is "the bare minimum of stuff", and the rate is fixed to 'each day' (and the delivery to grain stores is 'each harvest in the supplying region')? Yes, it's more as populations grow and centralise, but that appears to be occurring in the context of the question, no?


My point was not about the existence of a bread ration; it was about the drive for ever-bigger boats to haul ever more capacity.


>Trucks are different because they're just for transportation of goods. As challenging weather arrives they can just wait at the nearest rest stop, pull over on the shoulder, etc.

Trucks? Waiting? In the "just in time" culture of inventory management, where buffers are low and delays are a nightmare?


The same problem exists with manned and autonomous vehicles here though, it seems. Only for autonomous vehicles you don't have to worry about the safety of personnel. They might need digging out come the end of the storm?


There are way more trucks on the road than will fit in parking at rest areas, and simply parking on the shoulder is a hazard in itself, which is why trucks are required to set out warning triangles, etc. when they have to do it.


> There are way more trucks on the road than will fit in parking at rest areas

Not the ones that are taking multi-day trips. By definition, we've already made room for them since truckers generally sleep at night all at the same time.

This isn't about trucks on a 3-hour haul -- those just won't head out at all until the forecast is good.

And parking on the shoulder is a worst-case scenario if something suddenly happens that wasn't in the weather forecast for the next couple of hours.


Plus: AI shipping lanes, with humans for the last mile and other issues.


The bar for commercial driving should definitely be higher than for “civilian”.

But I think the bar for civilian is also woefully out of date now. Giant trucks should require a commercial license. Make it easier to get than a delivery truck license, but weed out the soccer moms and bring back the station wagon. If you’re a professional tree trimmer, general contractor, or a forester, renewing your license is on the clock and not a big deal.


Pretty much done in Europe.

First of all, for vehicles with a total mass above 3500 kg you need an extra license, and a separate one if you have a trailer. You get it once and keep it until a certain age, but that's often good enough.

And commercial operations in general also need more licensing, which has to be renewed after a certain time.


This is a great idea and the first I've heard of it. Set a maximum weight of 5,000 lbs for an ordinary license. Minivans will come in just under the line. Small SUVs like the CRV or RAV4 will be allowed, as will small trucks like the Ford Maverick. Indeed, most of those vehicles are under 4,000 lbs. Even some larger SUVs like the Honda Pilot or (just barely) the Toyota 4runner get under the bar. Even a low-specced F-150. But nothing bigger than that.

Truth is, I'd prefer the bar at 4,000 lbs, which would limit us to Camrys and CRVs, but 5,000 would really allow just about any reasonable vehicle.

And big heavy EVs with massive acceleration are just too powerful for somebody who's got a Starbucks in one hand, a cell phone in the other, their knee on the wheel, and shouting kids in the back seat. "Pedal misapplication" will go tragic really fast.

5,000 lb limit.

Sincerely,

A Pedestrian


I actually think we should also include a hood height limit.

I would be for requiring a CDL for any vehicle tall enough that an average pedestrian standing on the street cannot see over the hood.

I'm over 6 feet tall and I regularly have trouble doing this as a pedestrian in the USA. I can only imagine how it is to be smaller.


Absolutely. Those iconic 70’s car profiles will never be seen again because they suck pedestrians under the car and kill them. Trucks have some loopholes for this.

Even the Mini Cooper had to get taller about 10 years ago so the hood angle wouldn't flip people head first into the windshield, and for side impact ratings. Those two are why the Mini is so huge now. (I parked my midsized sedan next to a Countryman the other day, and when I came back I staggered and rolled my eyes, because it is 20% bigger than my car in height and almost in width. Wtf.)


It would also help to make it expensive to maintain/own vehicles over a certain weight. California charges weight fees, but only for pickups; no car or SUV pays those weight fees.


A distracted driver can kill pedestrians just as easily in a 3999lb vehicle.


All other things being equal, that's true. All other things are not equal though. The 4000lb+ vehicles often have worse visibility, so you have to pay more attention to make sure a kid didn't wander in front of your truck while you were waiting to make a right turn.


>A distracted driver can kill pedestrians just as easily in a 3999lb vehicle.

No they can't - a lighter vehicle has a slightly harder time killing people. It's still incredibly easy, but the pedestrian fatality rates do go down the lighter the vehicle is.


Haven't we proven that the sense of invulnerability leads to worse outcomes as well?


So it's OK to have more deaths so long as we can punish someone with no expectation that doing so will fix the problem? I suspect you are right, but I don't think it reflects well on us.


I think a way to think about it is that we (at a societal level) wouldn't accept a desk calculator to be sold as a consumer product if it did correct arithmetic almost all of the time but then would spit out a close but wrong answer 1 time out of 1000, even if that would be significantly better than the average person doing arithmetic. Our expectation would be that it could (and should) perform flawlessly. If our technological progress was such that the only calculator we could ever hope to produce would still have that kind of error 1 time in 1000, would it be unethical to prevent that from being sold? That's hard to say!

One of the heuristics built into us (because we're mortal beings living in a competitive, historically resource-poor environment) is that we trust the devil we know more than the devil we don't, and so unless there's a strongly compelling reason to trust autonomous driving devices a lot more than humans, there will be some inertia against using them, even if the calculations say it will save X number of lives. I mean, inherent in that calculation is a level of uncertainty, and people don't necessarily trust that number, because they don't have a reason to trust it, because they haven't really seen enough to trust it. Why take a company's word for it that it's safer, when they have a financial incentive to do some creative stuff to back their marketing pitch? I would say that if you feel it doesn't reflect well on us, it's because it hasn't been thought about enough.


I think intuitively the desk calculator analogy makes sense, but from a technology perspective it does not. A calculator has a limited number of operations and operating modes, and you can feasibly test all of them, quickly, easily, and in an automated fashion.

An autonomous vehicle has an uncountable number of operating modes, and it is not feasible (perhaps not even possible) to test it in all possible conditions and states. Even if you could, doing so for (say) every single software change would take years each time.

Maybe that does mean that this is a fool's errand, and we just shouldn't be building autonomous cars, at least not until we have AGI that can think and act faster and with better judgment than a human.

I personally do think that "better record than a human driver" should be sufficient (perhaps with some significant, TBD margin; 0.1% better is probably not enough), but I accept and agree with your top-level comment that that sort of thing won't fly in the real world. The bar is really more like: the self-driving car has to avoid making the specific kinds of mistakes and illegal/unsafe acts that a human driver would make (all while not creating new classes of mistakes that a human driver would not make), and, on top of that, be better than a human driver in situations where a crash would not be deemed that human driver's fault.


I don't disagree that a calculator and a car are different things, but the important thing for the analogy is that wider society expects the same thing of them. When John Q Public thinks about self-driving cars, he thinks to himself "How hard can it be? Drive the speed limit, stop at stop signs and traffic signals, go when lights turn green, stop at obstacles and stay between the lane dividing markers. If these vehicles can't do that then what good are they? Why risk letting those things on the road if I can't even trust them to do that?" Obviously you and I both know that base capabilities like "recognizing a stop sign" required decades of research to become reliable, but like, back in the 1960s, before serious work had started on the problem, even PhD-holding computer scientists thought object recognition was basically a done deal.

What I want to keep reminding the software developers with consequentialist ethics is that the entire rest of the world operates under a totally different mental framework than the techno-utopian one. In that mental model, a self driving car is an appliance and it will be judged by the standard of all appliances, which is that any catastrophic failures that happen during their normal operation are evidence of a defective, untrustworthy product. The problem they're trying to solve and its complexity has little bearing in that judgment. What incentive does anyone have to be generous and forgiving about potentially fatal errors when you're not taking the consequentialist viewpoint for granted? That's the thrust of my observation.


I have a desk calculator with a worse error rate than that because of a button that sticks. I still use it.


I was thinking about this, and one way of looking at it is what you said. We already allow the sale of spaceships which have a non-zero failure rate, so we don't necessarily need zero-failure-rate self-driving.

But a problem with cars is: usually your malfunctioning calculator just harms you, but a malfunctioning car will affect people who didn't agree to your choice of driving a non-zero-failure-rate self-driving car.

So it is not a personal choice anymore.


The recent gen AI tools might prove that wrong: Need help with your homework? We have software that will only sometimes give you facts that are incorrect. The rest of the time it does great!


A respectable track record and a margin for error are fine things to insist on, but that was not the argument being presented or objected to.


> we wouldn't allow someone to be an Uber or Lyft or truck driver if they were candid and said "I'm going to text on my phone, and be drunk driving and sleepy and distracted as often as the average motorist"

Half the taxi and Uber drivers in New York are perpetually on a phone call, and frequently interacting with their phones.


I'll take your word for it (not being an American and only having visited NYC once), but even then most of the rules of the road are… not perfectly enforced. I suspect if the rules were perfectly enforced, the only humans allowed to drive would be those who actually don't.

I suspect — no, I hope — that anyone who admits in advance that they intend to break the rules, won't get a license.


I mean, just look at how popular automated enforcement like speed and red light cameras is.


Not sure which side you're arguing. Does the US even have automated speed traps? Every time I hear about them, there's opposition and it doesn't happen. I think the most recent one I heard about in the US, the plan was to issue a ticket unless the speed is at least 11mph over the limit, which seems... both fine but also kinda silly?

Red light cameras were even a big fight to get implemented back when they were a new idea, though they're relatively uncontroversial these days.

To me, the difference is that the speed limit is intentionally poorly enforced, because law enforcement knows that speed limits in many places are set inconsistently and at unreasonable levels (and I believe that's also intentional; I recall reading that limits in many places are set at the 85th percentile of actual measured traffic speed, or something like that). An automated speed trap can't judge whether a particular speed is safe for the current weather and traffic conditions.

On the other hand, it's never safe to run a red light; that's just a binary "you did it"/"you didn't do it". Yes, I know, technically it is safe to run a red light when visibility is plentiful and there's no cross-traffic within sight, but I think we as a society have accepted the idea that you just shouldn't run red lights, and getting ticketed for doing so is fine under nearly any circumstances.


Really the bar for humans to legally operate a vehicle is whatever licensing process is in place in a given state. We don't make them pinky swear to not do anything dangerous or illegal.

If an autonomous system can get a CDL, it's probably gonna be more effective at continuously meeting that standard than a lot of the humans that do the same thing (but are distracted on a given day or have started using substances, or didn't sleep well or whatever).


> Really the bar for humans to legally operate a vehicle is whatever licensing process is in place in a given state. We don't make them pinky swear to not do anything dangerous or illegal.

That is not sufficient for self-driving vehicles. The license is the thing that shows "okay, this person seems safe" but then some of the things that keep them safe are the threat of accidentally killing themselves, or being arrested and put in jail for a crime they commit while driving, or the financial penalty of being sued for an injury / damage they cause, or the risk of having their license taken away for errors. If a human being applying for their license was invincible, incapable of being jailed, sued, or having their license taken away, you might expect that our CDL or other licensing processes would be more stringent.


Wait, why do you think that permission to operate the autonomous vehicles would not be withdrawn? It's happened!

It's totally bizarre to give mediocre humans credit for their self preservation and then demand that an autonomous system be beyond superb because it doesn't have it. Statistically, the mediocrity is going to be the bigger problem.


Maybe, but our driving tests are designed around assessing humans. A system that fundamentally works quite differently likely has very different performance characteristics and failure modes and may need to be assessed differently to demonstrate that it will probably perform adequately in non-test conditions.


> We'd demand a way higher standard because these companies don't really have any skin in the game, except for financial penalties which we now understand is not a deterrent for anything.

I'll add that one of the penalties for accidents caused by human error is a prison term.

Until a Tesla/Uber/Waymo exec responsible for autonomous driving can serve a prison term for accidents caused by their service, the penalty for accidents caused by autonomous driving is orders of magnitude lighter than those applied to human drivers.


We should also demand that these companies are run by engineers, not MBA types. Otherwise, we'd end up with neglect of safety measures as seen at Boeing. And I'm not sure if any kind of external incentive could change that.


> one of the penalties for accidents caused by human error is a prison term.

Really? How does anyone get that? I've only seen murder with a car worth 90 days in jail and loss of the car.


> Really? How does anyone get that.

In the US, vehicular homicide comes with a sentence that's typically between 9 and 30 years.


> like, we wouldn't allow someone to be an Uber or Lyft or truck driver if they were candid and said "I'm going to text on my phone, and be drunk driving and sleepy and distracted as often as the average motorist".

But we do allow it. Sure, it's illegal, but we don't put in the effort to actually prevent it. Just because we don't like it and would want it to be better doesn't mean we aren't allowing it.


> We'd demand a way higher standard because these companies don't really have any skin in the game, except for financial penalties which we now understand is not a deterrent for anything.

As we have seen with the Titan sub, not even death is a full deterrent. A lot of companies will still risk it for the biscuit and chance jail time for a shot at leading the market (in fact, a lot of people risk actual jail time for financial gain). Strict regulation that keeps subpar cars off the street really is the only way.


I agree, though I either don't see (or I see and misperceive) many examples of people mistaking the utilitarian result for the politically acceptable one.


Autonomous trucking needs to reach parity with, or do better than, the average truck driver, not the average driver.


The problem is that if you have safer than human systems but you don't deploy them it's basically killing people. If you had autonomous driving that was 90% as deadly as the average human driver, not deploying it would cause 4000 deaths every year in the US.
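A back-of-envelope check of that figure (the ~40,000 baseline is my assumption, roughly in line with recent NHTSA annual totals; the commenter only gives the 90% and 4000 numbers):

```python
# Assumption (mine): roughly 40,000 US road deaths per year.
annual_us_road_deaths = 40_000
av_relative_risk = 0.90  # AV is "90% as deadly" as the average human driver

# Deaths avoided per year if AVs fully replaced human driving at that risk level.
lives_saved_per_year = annual_us_road_deaths * (1 - av_relative_risk)
print(round(lives_saved_per_year))  # roughly 4,000 per year
```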


We have a lot safer transportation systems than private cars but don't deploy them. Is this basically killing people too? Driving is murder?


Yes. There are a lot of people driving for things that a good public transit system could handle. Sure, you can have your SUV for towing the boat to the lake, but you only do that on weekends (and not every weekend). Why are you driving to work, to church, to get groceries? For most people those are tasks that a good transit system could handle.

Note that by "good transit system" I mean one where you never check the schedule because the next bus/train is never more than 5 minutes away (that is a maximum). By good transit I also mean a system of transfers to express service so you can get to more distant places quickly. Nobody in the US has a good system (NYC's subways don't even come that often!), but there is no reason we can't have that, and some places in the world do have it (at least in a few areas).


> the next bus/train is never more than 5 minutes [..] there is no reason we can't have that

There's no logistical reason, but very few municipalities have the density to support that kind of frequency (via fare revenue), and taxpayers (especially in the US, but in other places as well) don't want to pay for all that unoccupied space on trains and buses.

I'm not saying that's a universal truth. There are a few places that do have the density to support that. And there are a few places that don't, but where local residents are happy to pay more in taxes to get that kind of frequency. Most places, though, are neither of those.


Yeah I’d totally take transit so long as it will pick me up in my driveway, is ready within a minute of me deciding I need to leave, and is a ride for only me and my family/friends.


This attitude is why we can't have nice things. You have probably never experienced a system like the one I outlined, and you won't even consider it without adding requirements that drive up the cost until we can't get it.

If instead you would accept that you can wait up to five minutes, and "other people" are nice enough and so you can share a space with them - we could have a nice transit system.

The five-minute figure I used above is not arbitrary. While there is some debate on the exact number, five minutes allows for a lot of operational things that are really nice.


I have ridden a whole lot of public transport all over the world and have seen what you are describing.


Just because we both want different things doesn't mean they both can't exist.


It's not murder. Relying excessively on a dangerous mode of transportation is killing people.


The same way as getting out of the house during a pandemic was similar to "killing people" for a big chunk of the educated Western electorate.

You can see how that viewpoint might be received in a lot of places nowadays, especially blue-collar places that rely on that type of work still being available and not farmed out to some AI bots that will also manage to kill some of us in the process (in smaller numbers than our fellow humans would have registered, which is a relief).


Negligence is not murder but people still die.


Many people are safer than the average driver, since most deaths are associated with high-risk driving (DUI alone accounts for more than a third). So forcing everyone to use autonomous vehicles essentially kills safe drivers.


Since this is literally the trolley problem, the issue here is whether you want to take authorship of the deaths or not. I personally never pull the lever in those puzzles.


That's only if you're able to deploy it in a way that the human drivers you replace have the same risk distribution as the human driver population as a whole. If your deployment instead skews away from replacing the high-risk human drivers, the 90% AV could make the overall situation worse.
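A toy model makes this concrete (the rates and population split below are mine, purely illustrative):

```python
# Hypothetical split: 90% of drivers are "safe" at a baseline fatality rate,
# 10% are "high-risk" at 10x that rate (units: fatalities per 100M miles).
safe_rate, risky_rate = 1.0, 10.0
pop_safe, pop_risky = 0.9, 0.1

avg_rate = pop_safe * safe_rate + pop_risky * risky_rate  # population average
av_rate = 0.9 * avg_rate  # AV is "90% as deadly as the average human"

# If the AV replaces everyone, the fleet rate drops (av_rate < avg_rate).
# But if deployment only replaces the safe drivers (e.g. early adopters),
# their rate *rises* from safe_rate up to av_rate, and the fleet gets worse:
skewed_rate = pop_safe * av_rate + pop_risky * risky_rate
print(avg_rate, av_rate, skewed_rate)  # skewed_rate exceeds avg_rate
```

The numbers are made up, but the mechanism is real: a "better than average" system can still be worse than the specific drivers it actually displaces.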


Trucks and cars are not strictly required to operate in all conditions in the current system: roads and entire freeways close when conditions demand it, or vehicles are required to stop and put on chains. There is precedent.


I think that if you were reading my comment charitably you could infer that I meant "under all conditions where a car is legally allowed to operate." I don't think you actually believed that I was saying that a truck should have the capability to drive on the road when a roadway is not legal to drive on.


You are seeing a fight that isn't there.

There is precedent for both roads and vehicles choosing not to handle some conditions. In a driving rainstorm you will find rest areas stuffed full of trucks and cars whose drivers picked that time to nap. In San Francisco, Waymo first served some hours and not others, some areas and not others; some of those choices were apparently related to fog and wet pavement. In Oregon, you will find roads open but whited out. Etc., etc. Nobody forces human-driven vehicles to keep moving, and most drivers with some sense make choices, choices that are also available to automated vehicles. Sure, the vehicle should be expected to find a safer place to stop than the middle of the lane, as I point out in another response here.


> Trucks and cars are not strictly required to operate in all conditions in the current system: roads and entire freeways close when conditions demand it, or vehicles are required to stop and put on chains. There is precedent.

I think you're confusing things. Weather can change midway through anyone's drive, and all drivers are required to drive safely and reliably even during sudden extreme meteorological events.

For a road to be closed, it takes an administrative action reflecting a decision that's largely arbitrary. Until a third party makes that decision, any driver is required to drive safely and reliably, regardless of the weather.



