
Waymo robotaxi hits a child near an elementary school in Santa Monica

From the Waymo blog...

> the pedestrian suddenly entered the roadway from behind a tall SUV, moving directly into our vehicle's path. Our technology immediately detected the individual as soon as they began to emerge from behind the stopped vehicle. The Waymo Driver braked hard, reducing speed from approximately 17 mph to under 6 mph before contact was made.

> Following contact, the pedestrian stood up immediately, walked to the sidewalk, and we called 911. The vehicle remained stopped, moved to the side of the road, and stayed there until law enforcement cleared the vehicle to leave the scene.

> Following the event, we voluntarily contacted the National Highway Traffic Safety Administration (NHTSA) that same day.

I honestly cannot imagine a better outcome or handling of the situation.

3 hours agoBugsJustFindMe

Yup. And to add

> Waymo said in its blog post that its “peer-reviewed model” shows a “fully attentive human driver in this same situation would have made contact with the pedestrian at approximately 14 mph.”

It's likely that a fully-attentive human driver would have done worse. With a distracted driver (a huge portion of human drivers) it could've been catastrophic.

2 hours agojobs_throwaway

I think my problem is that it only reacted after seeing the child step out from behind the SUV.

An excellent driver would have already seen that possible scenario and would have already slowed to 10 MPH or less to begin with.

(It's how I taught my daughters to drive "defensively"—look for "red flags" and be prepared for the worst-case scenario. SUV near a school and I cannot see behind it? Red flag—slow the fuck down.)

14 minutes agoJKCalhoun

Yes and no. There are tons of situations where this simply isn't possible: traffic routinely moves at the full allowed speed next to rows of parked cars. If somebody distracted unexpectedly pops out, a tragedy is guaranteed regardless of the driver's skill and experience.

In low traffic it can of course be different. But it's unrealistic to expect anybody to drive as if behind every single car they pass there may be a child about to jump right in front of them. That could easily be thousands of cars, every day, for a whole lifetime. Impossible.

We don't read about the 99.9% of cases where even a semi-decent driver handles it safely; the rare cases make the news.

7 minutes agokakacik

I kind of drive that way. I slow down and move as far away from the parked cars as possible within my lane. It's certainly what I would expect from a machine that claims to be as good as the best human driver.

5 minutes agoJKCalhoun

Possibly, but Waymos have recently been much more aggressive about blowing through situations where human drivers can (and generally do) slow down. As a motorcyclist, I've had some close calls with Waymos driving on the wrong side of the road, and I recently had a Waymo cut in front of my car at a one-way stop (T intersection) after it got tangled up with a Rivian trying to turn into the narrow street it was coming out of. I had to ABS-brake to avoid an accident.

Most human drivers (not all) know to nose out carefully rather than to gun it in that situation.

So, while I'm very supportive of where Waymo is trying to go for transport, we should be constructively critical and not just assume that humans would have been in the same situation if driving defensively.

2 hours agochaboud

Certainly, I'm not against constructive criticism of Waymo. I just think it's important to consider the counterfactual. You're right too that an especially prudent human driver may have avoided the scenario altogether, and Waymo should strive to be that defensive.

2 hours agojobs_throwaway

Absolutely. I can tell you right now that many human drivers are probably safer than the Waymo, because they would have slowed down even more and/or stayed further from the parked cars outside a school; they might even have seen the kid earlier, e.g. in a reflection, in a way the Waymo couldn't.

2 hours agoveltas

It seems it was driving pretty slowly (17 mph), and they do tend to leave a pretty big gap on the right side when they can.

There are kinds of human sensing that are better when humans are maximally attentive (seeing through windows, catching reflections). But there's also seeing in all directions, radar, superhuman reaction time, etc., on the side of the Waymo.

2 hours agomlyle

[dead]

2 hours agoonetokeoverthe

I usually take extra care when going through a school zone, especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?), and overtaking is something I would probably never do (and should be banned in school zones by road signs).

This is a context that humans automatically have and consider. I'm sure Waymo engineers can mark spots on the map where the car needs to drive very conservatively.

2 hours agotorginus

> especially when I see some obstruction ('behind a tall SUV', was the waymo overtaking?)

Yep. Driving safely isn't just about paying attention to what you can see, but also about paying attention to what you can't see. Always being vigilant and aware of things like "I can't see behind that truck."

Honestly, I don't think sensor-first approaches are cut out to tackle this; it probably requires something more akin to AGI, able to infer possible risks from incomplete or absent data.

36 minutes agomikkupikku

I appreciate your sensible driving, but here in the UK, roads outside schools are complete mayhem at drop-off/pick-up times. Speeding, overtaking, wild manoeuvres to turn round, etc.

When reading the article, my first thought was that going only 17 mph was down to it being a robotaxi, whereas UK drivers tend to be strongly opposed to 20 mph speed limits outside schools.

an hour agondsipa_pomu
[deleted]
2 hours ago

It's possible, but "likely" is a strong assertion. It's also possible a human driver would have been more aware of children being present on the sidewalk and would have approached more cautiously given the obstructed views.

Please, please remember that any data from Waymo will inherently support their position and cannot be taken at face value. They have a significant investment in making this look more favorable for them. They have billions of dollars riding on the appearance of being safe.

2 hours agomicromacrofoot

I wonder if that is a "fully attentive human driver who drove exactly the same as the Waymo up until the point the child appeared"?

Personally, I slow down and get extra cautious when I know I am near a place where lots of kids are and sight lines are poor. Even if the area is signed for 20 I might only be doing 14 to begin with, and also driving more towards the center of the road if possible with traffic.

35 minutes agoIncreasePosts

It depends on the situation, and we need more data/video. But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.

2 hours agoscarmig

> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast, and the Waymo should have been driving more conservatively.

The UK driving theory test has a part called Hazard Perception: not reacting to children milling around would be considered a fail.

[0] https://www.safedrivingforlife.info/free-practice-tests/haza...

10 minutes agokilotaras

Exactly. That's why I've always said that driving is a truly AGI-requiring activity. It's not just about sensors and speed limits and feedback loops. It's about having a true understanding of everything that's happening around you:

Having an understanding of the density and makeup of an obstacle that blew in front of you, because it was just a cardboard box. Seeing how it tumbles lightly in the wind, and forming a complete model of its mass and structure in your mind instantaneously. Recognizing that that flimsy fragment, though large, will do no damage and doesn't justify a swerve.

Getting into the mind of the car in front of you, by seeing subtle hints that the driver is looking down, and recognizing that they're not fully paying attention. Seeing them sort of inch over because you can tell they want to change lanes, but they're not quite there yet.

Or in this case, perhaps hearing the sounds of children playing, recognizing that it's 3:20 PM and school is out, seeing the other cars double-parked as you mentioned, all of it instantly screaming to a human driver to be extremely cautious because kids could be jumping out from anywhere.

2 hours agomatt-attack

> But if there are a bunch of children milling about an elementary school in a chaotic situation with lots of double parking, 17 mph is too fast

Hey, I'd agree with this-- and it's worth noting that 17^2 - 5^2 > 16^2, so even 1MPH slower would likely have resulted in no contact in this scenario.

But, I'd say the majority of the time it's OK to pass an elementary school at 20-25MPH. Anything carries a certain level of risk, of course. So we really need to know more about the situation to judge the Waymo's speed. I will say that generally Waymo seems to be on the conservative end in the scenarios I've seen.

(My back-of-napkin math says an attentive human driver going 12 mph would have hit the pedestrian at roughly the same speed the Waymo did, if what we've been told is accurate).
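For anyone who wants to check the arithmetic, here's a minimal sketch. It assumes constant deceleration over a fixed braking distance, so v_impact² = v_initial² − 2ad and the quantity 2ad is the same regardless of initial speed; the 5 mph and 14 mph figures come from the blog post and Waymo's model respectively.

```python
# Assumes constant deceleration over a fixed braking distance:
# v_impact^2 = v_initial^2 - 2*a*d, so 2*a*d (the "speed squared scrubbed
# off") is the same regardless of initial speed. Speeds in mph.

waymo_initial, waymo_impact = 17, 5    # Waymo: 17 mph down to ~5 mph at contact
human_initial, human_impact = 17, 14   # Waymo's claim for an attentive human

waymo_scrubbed = waymo_initial**2 - waymo_impact**2   # 264 (longer braking distance)
human_scrubbed = human_initial**2 - human_impact**2   # 93 (reacts later, less distance)

def impact_speed(initial_mph, scrubbed):
    v2 = initial_mph**2 - scrubbed
    return max(v2, 0) ** 0.5   # 0 means the car stops before contact

print(impact_speed(16, waymo_scrubbed))   # 0.0 -> at 16 mph the Waymo stops short (16^2 < 264)
print(impact_speed(12, human_scrubbed))   # ~7.1 -> a 12 mph attentive human still makes contact
```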

2 hours agomlyle

Swedish schools still have students who walk there. I live near one, and very few cars exceed 20 km/h during rush hours. Anything faster is reckless, even though the limit here is 30 km/h (19 mph).

an hour agopastage

The schools I'm thinking of have sidewalks with some degree of protection/offset from the street, and the crossings are protected by human crossing guards at the times when students are going to school. The posted limits are "25 (MPH) When Children Are Present" and traffic generally moves at 20 mph during most of those times.

There are definitely times and situations where the right speed is 7 mph and even that feels "fast", though.

an hour agomlyle

Whoa! You're allowed to double park outside a school over there?!

2 hours agodrcongo

People loitering in their cars waiting for a space to pick up their kid. So not actually parked.

2 minutes agodboreham

Wait, is double parking allowed anywhere?

36 minutes agorecursive

Pretty common at airports; of course, the "parking" only lasts a few minutes at most.

5 minutes agosomething765478

The autonomous vehicle should know what it can't know, like children coming out from behind obstructions. Humans have this intuitive sense. Apparently autonomous systems do not, and do not drive carefully, or slower, or give more space, in those situations. Does it know that it's in a school zone? (Hopefully.) Does it know that school is starting or getting out? (Probably not.) Should it? (Absolutely yes.)

This is the fault of the software and company implementing it.

5 minutes agomholt

AVs with enough sensing are generally quite good at stopping quickly. It is usually the behavior prior to the critical encounter that has room for improvement.

The question will be whether 17 mph was a reasonably cautious speed for this specific scenario. Many school zones have 15 mph limits, and when there are kids about people may go even slower. At the same time, the general rule in CA for school zones is 25 mph. Clearly the car was exercising some level of caution, which is good.

2 hours agocalchris42

They are being very transparent about it.

2 hours agorandom_duck

As every company should, when they have a success. Are they also as transparent about their failures?

2 hours agodirewolf20

How is hitting a child not a failure? And actually, how can you call this a success? Do you think this was a GTA side mission?

2 hours agodylan604

Why didn't Sully just not hit the birds?

36 minutes agotrillic

Immediately hitting the brakes when a child suddenly appears in front of you, instead of waiting 500 ms like a human, and thereby hitting the child at 6 mph instead of 14, is a success.

What else do you expect them to do? Only run in grade-separated areas that children can't access? Blare sirens so children get scared away from roads? Shouldn't human-driven cars do the same thing then?

an hour agodirewolf20

I don't know the implementation details, but success would be not hitting pedestrians. You have some interesting ideas on how to achieve that but there might be other ways, I don't know.

35 minutes agorecursive

17 mph is way too fast near a school if it's around the time children are getting out (or in).

25 minutes agoorwin

The limit is 20 MPH in Washington state; in California the default is 25 MPH, but it's going to 20 MPH soon and can be further lowered to 15 MPH with special considerations.

The real killer here is the crazy American on-street parking, which limits visibility of both pedestrians and oncoming vehicles. Every school should be a no-street-parking zone. But parents are going to whine that they can't load and unload their kids close to the school.

22 minutes agoseanmcdirmid

"and thereby hitting the child ... is a success."

> What else do you expect them to do? Only run in grade-separated areas that children can't access?

no, i expect them to slow down when children may be present

24 minutes agoparl_match

how slow?

21 minutes agodirewolf20

Well, as a comparison, we know that Tesla has failed to report to NHTSA any collisions that didn't deploy the airbag.

2 hours agoBugsJustFindMe

Is this a success? There was still an incident. I'd argue this was them being transparent about a failure

2 hours agovoidUpdate

Being transparent about such incidents is also what stops them from potentially becoming business/industry-killing failures. They're doing the right thing here, but they also surely realize how much worse it would be if they tried to deny or downplay it.

2 hours agoTeMPOraL

They handled an unpredictable emergency situation better than any human driver.

2 hours agodirewolf20

as far as we know

2 hours agomicromacrofoot

It does sound like a good outcome for automation. Though I suppose an investigation into the matter would arguably have to look at whether a competent human driver would be driving at 17mph (27km/h) under those circumstances to begin with, rather than just comparing the relative reaction speeds, taking the hazardous situation for granted.

What I would like to see is a full-scale vehicle simulator where humans are tested against virtual scenarios that faithfully recreate autonomous driving accidents, to see how "most people" would have acted in the minutes leading up to the event as well as during the accident itself.

2 hours agodcanelhas

17 mph is pretty slow unless it’s a school zone

2 hours agoaaomidi

Indeed, 15 or 25 mph (24 or 40 km/h) are the speed limits in school zones (when in effect) in CA, for reference. But depending on the general movement and density and category of pedestrians around the road it could be practically reckless to drive that fast (or slow).

an hour agodcanelhas

If my experience driving through a school zone on my way to work is anything to go off of, I rarely see people actually respecting it. 17 mph would be a major improvement over what I'm used to seeing.

30 minutes agoTeknoman117

It’s great handling of the situation. They should release a video as well.

2 hours agodyauspitr

Indeed. Rather than having the company tell me that they did great, I'd rather watch the video and make up my own mind.

2 hours agodust42

We should take their reporting with a grain of salt and wait for official results.

30 minutes agocroes

I honestly think that Waymo's reaction was spot on. I drop off and pick up my kid from school every day. The parking lots can be a bit of a messy wild west. My biggest concern is the size of cars, especially those huge SUVs or pickup trucks with big covers on the back. You can't see anything coming unless you stick your head out.

2 hours agordudek

EDIT: replies say I'm misremembering, disregard.

2 hours agoveltas

That was Cruise, and that was fixed by Cruise ceasing operations.

2 hours agochaboud

I don't think that was Waymo, right? Cruise has already wound down as far as I know.

2 hours agoseanmcdirmid

> I honestly cannot imagine a better outcome or handling of the situation.

If it can yell at the kid and send a grumpy email to the parents and school, the automation is complete.

30 minutes agolostlogin

Most humans in that situation won't have the reaction speed to do shit about it, and it could result in severe injury or death.

2 hours agoanovikov

Yeah. I'm a stickler for accountability falling on drivers, but this really can be an impossible scenario to avoid. I've hit someone on my bike in the exact same circumstance - I was in the bike lane between the parked cars and moving traffic, and someone stepped out between parked vehicles without looking. I had nowhere to swerve, so squeezed my brakes, but could not come to a complete stop. Fortunately, I was going slow enough that no one was injured or even knocked over, but I'm convinced that was the best I could have done in that scenario.

The road design there was the real problem, combined with the size and shape of modern vehicles that impede visibility.

2 hours agogensym

Building on my own experience, I think you have to own that if you crash into someone, you made a mistake. I do agree that car and road design for bicycles(?) makes it almost impossible to get around without taking risks like that.

an hour agopastage

Humans are not going to win on reaction time but prevention is arguably much more important.

2 hours agojayd16

How would standard automatic braking (standard in some brands) have performed here?

2 hours agolokar

Meanwhile the news does not report the other ~7,000 children per year injured as pedestrians in traffic crashes in the US.

I think the overall picture is a pretty fantastic outcome -- even a single event is a newsworthy moment _because it's so rare_.

> The NHTSA’s Office of Defects Investigation is investigating “whether the Waymo AV exercised appropriate caution given, among other things, its proximity to the elementary school during drop off hours, and the presence of young pedestrians and other potential vulnerable road users.”

Meanwhile, in my part of the world, parents are busy, stressed, and on their phones, pressing the accelerator hard because they're time-pressured and feel like that will make up for being 5 minutes late on a 15-minute drive... The truth is this technology is, as far as I can tell, superior to humans in a large number of situations, if only for its lack of emotionality (and inability to text and drive / drink and drive)... but for some reason the world wants to keep nitpicking it.

A story: my grandpa drove for longer than he should have. Yes, him losing his license would have been the optimal case. But pragmatically that didn't happen... him being in and using a Waymo (or Cruise, RIP) car would have been a marginal improvement on the situation.

an hour agomaerF0x0

Wow, this is why I feel comfortable in a Waymo. Accidents are inevitable at some point, and this handling was well-rehearsed and highly ethical. Amazing company.

6 minutes agoripped_britches

That sucks, and I love to hate on "self driving" cars. But it wasn't speeding to start with (assuming the speed limit in the school zone was 20 or 25), it braked as much as possible, and the company took over all the things a human driver would have been expected to do in the same situation. It could have been a lot worse, and probably wouldn't have been any better with a human driver (I'm just going to ignore, as no-signal, Waymo's models that say an attentive human driver would have been worse). It's "fine". In this situation, cars period are the problem, not "self driving" cars.

16 minutes agoNoGravitas

Who is legally responsible when a Waymo hits a pedestrian? If I hit somebody, it's me in front of a judge. In Waymo's case?

23 minutes agopmontra

Are you thinking of civil liability or criminal liability?

Waymo is liable in a civil sense and pays whatever monetary amount is negotiated or awarded.

For a criminal case, some kind of willful negligence would have to be shown. That can pierce corporate veils. But as a result Waymo is being extremely careful to follow the law and establish processes which shield their employees from negligence claims.

6 minutes agohiddencost

I'm curious what kind of control stack Waymo uses for their vehicles. Obviously their perception stack has to be based on trained models, but I wonder whether their controllers have any formal guarantees under certain conditions, and whether the child walking out was within that formal set of parameters (e.g. velocity, distance to obstacle) or violated it, making their control stack switch to some other "panic" controller.
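To make the idea concrete, here's a toy sketch of the pattern being described: a nominal planner trusted inside a validated operating envelope, with a maximum-braking fallback outside it. Every name, threshold, and structural choice below is made up for illustration and says nothing about Waymo's actual stack.

```python
# Illustrative "validated envelope + fallback controller" pattern only.
# All thresholds and names are hypothetical, not from any real AV stack.
from dataclasses import dataclass

@dataclass
class Obstacle:
    distance_m: float          # along-path distance to the obstacle
    closing_speed_mps: float   # how fast the gap is shrinking

MAX_CLOSING_SPEED = 8.0   # m/s: envelope in which the nominal planner is trusted
MIN_STOP_MARGIN = 1.5     # m: extra margin beyond minimum stopping distance
MAX_DECEL = 8.0           # m/s^2: near the traction limit

def min_stopping_distance(speed_mps: float) -> float:
    # v^2 / (2a): distance needed to stop at maximum braking
    return speed_mps ** 2 / (2 * MAX_DECEL)

def choose_controller(ego_speed_mps: float, obs: Obstacle) -> str:
    """Stay on the nominal planner inside the envelope, otherwise
    hand control to a maximum-braking fallback."""
    inside_envelope = (
        obs.closing_speed_mps <= MAX_CLOSING_SPEED
        and obs.distance_m >= min_stopping_distance(ego_speed_mps) + MIN_STOP_MARGIN
    )
    return "nominal_planner" if inside_envelope else "emergency_brake"

# 7.6 m/s is roughly 17 mph
print(choose_controller(7.6, Obstacle(distance_m=12.0, closing_speed_mps=2.0)))  # nominal_planner
print(choose_controller(7.6, Obstacle(distance_m=4.0, closing_speed_mps=6.0)))   # emergency_brake
```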

This will continue to be the debate—whether human performance would have exceeded that of the autonomous system.

2 hours agosimojo

From a purely stats POV, in situations where the confusion matrix is very asymmetric in terms of what we care about (false negatives are extra bad), you generally want multiple uncorrelated mechanisms, and you simply require that any one of them flips before deciding to stop. All of them would have to fail simultaneously for the car not to brake, which becomes vanishingly unlikely (p^n) with multiple mechanisms, assuming uncorrelated errors. This is why I love the concept of lidar and optical together.
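A toy illustration of that p^n point; the per-sensor miss and false-alarm rates below are made up, and the independence assumption is doing all the work:

```python
# Toy numbers for the "all detectors must fail simultaneously" argument.
# Per-sensor rates are invented; independence is the key assumption.

p_miss_lidar  = 1e-3   # hypothetical probability each modality misses a pedestrian
p_miss_camera = 1e-2
p_miss_radar  = 1e-2

# Brake if ANY modality fires, so a missed pedestrian requires all to fail:
p_system_miss = p_miss_lidar * p_miss_camera * p_miss_radar
print(p_system_miss)            # ~1e-07 under the independence assumption

# The cost of the trade: false alarms combine the other way (more phantom braking).
p_fa_lidar, p_fa_camera, p_fa_radar = 0.02, 0.05, 0.05
p_system_fa = 1 - (1 - p_fa_lidar) * (1 - p_fa_camera) * (1 - p_fa_radar)
print(round(p_system_fa, 2))    # ~0.12
```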

2 hours agoenergy123

Personally, in LA I had a Waymo try to take a right as I was driving straight down the street. It almost T-boned me and then honked at me. I don't know if there has been a change to the algorithm lately to make them more aggressive, but it was pretty jarring to see it mess up that badly.

2 hours agoBukhmanizer

In recent weeks I've found myself driving in downtown SF congestion more than usual, and observed Waymos doing totally absurd things on multiple occasions.

The main saving grace is they all occurred at low enough speeds that the consequences were little more than frustrating/delaying for everyone present - pedestrians and drivers alike, as nobody knew what to expect next.

They are very far from perfect drivers. And what's especially problematic is that the nature of their mistakes seems totally bizarre compared to the kinds of mistakes human drivers make.

3 minutes agopengaru

It honked at you? But local laws dictate that it angrily flashes its high beams at you.

2 hours agojayd16

And before the argument "self-driving is acceptable so long as the accident risk is lower than with human drivers" comes up, can I please get this out of the way: no, it's not. Self-driving needs to be orders of magnitude safer for us to accept it. If they're merely as safe or slightly safer than humans, we will never accept it, because humans have "skin in the game". If you drive drunk, at least you're likely to be in the accident, or to have personal liability. We accept the risks with humans because those humans accept risk. Self-driving abstracts away the legal risk and removes the physical risk.

I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

2 hours agoalkonaut

I think those figures are already starting to accumulate. Incidents like this are rare enough that they are newsworthy. Almost every minor incident involving Waymo, Tesla's FSD, and similar solutions gets a lot of press. This was a major incident with a happy ending. Those are quite rare. The lethal ones are even rarer.

As for more data, there is a chicken-and-egg problem. A phased rollout of Waymo over several years has revealed many potential issues but is also remarkable for the low number of incidents with fatalities. The benefit of a gradual approach is that it builds confidence over time.

Tesla has some way to go here. Though arguably, with many hundreds of thousands of paying users, if it were really unsafe, there would be some numbers on that. Normal statistics in the US are around ~17 deaths per 100K drivers per year, 40K+ fatalities overall. FSD, for all its faults and failings, isn't killing dozens of people per year. Nor is Waymo. It's a bit of an apples-and-oranges comparison, of course. But the bar for safety is pretty low as soon as you include human drivers.

Liability weighs more heavily for companies than safety. It's fine with them if people die, as long as they aren't liable. That's why the status quo is tolerated. Normalized for the number of miles driven with and without autonomy, there's very little doubt that autonomous driving is already much safer. We can get more data, at the price of more deaths, by simply dragging out the testing phase.

Perfect is the enemy of good here. We can wait another few years (times ~40K deaths) or maybe allow technology to start lowering the number of traffic deaths. Every year we wait means more deaths. Waiting here literally costs lives.

2 hours agojillesvangurp

> ~17 deaths per 100K drivers per year. 40K+ fatalities overall.

I also think one needs to remember those are _abysmal_ numbers, so while the current discourse is US-centric (because that's where the companies and their testing are), I don't think it can be representative of the risks of driving in general. Naturally, robotaxis will benefit from better infrastructure outside the US (e.g. better separation of pedestrians), but they'll also have to clear a higher safety bar, e.g. fewer drunk drivers.

2 hours agoalkonaut

It will also never get worse. This is the worst the algorithms will ever be from this point forward.

21 minutes agotrillic

> I'm willing to accept robotaxis, and accidents in robotaxis, but there needs to be some solid figures showing they are way _way_ safer than human drivers.

Do you mean like this?

https://waymo.com/safety/impact/

2 hours agojonas21

Yes but ideally from some objective source.

2 hours agoalkonaut

If Waymo is to be believed, they hit the kid at 6 mph and estimated that a human driver at full attention would have hit the kid at 14 mph. The Waymo was traveling at 17 mph. The situation of "kid running out between cars" will likely never be fully solved either, because even with sub-nanosecond reaction time, the car's mass and the tires' traction physically cap how fast a change in velocity can happen.
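Rough numbers for that physics point, with assumed values (peak braking of ~0.9 g on dry pavement, zero reaction latency) that are illustrative rather than anything Waymo has published:

```python
# Minimum stopping distance from 17 mph with instant reaction, assuming
# traction-limited braking at ~0.9 g. All numbers are assumptions.

MPH_TO_FPS = 5280 / 3600      # 1 mph = ~1.47 ft/s
G = 32.17                     # ft/s^2

speed_fps = 17 * MPH_TO_FPS   # ~24.9 ft/s
decel = 0.9 * G               # ~29 ft/s^2

stopping_distance_ft = speed_fps**2 / (2 * decel)
print(round(stopping_distance_ft, 1))  # ~10.7 ft: if the kid emerges closer than
                                       # this, some contact is physically unavoidable
```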

I don't think we will ever see the video, as any contact is overall viewed negatively by the general public, but for non-hyperbolic types it would probably be pretty impressive.

2 hours agoWarmWash

That doesn't mean it can't be solved. Don't drive faster than you can see. If you're driving 6 feet from a parked car, you can go slow enough to stop assuming a worst case of a sprinter waiting to leap out at every moment.
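Roughly formalizing "slow enough to stop": with assumed numbers (a child sprinting at ~12 ft/s, 0.25 s to react and build brake pressure, ~0.9 g braking), the speed that lets you be fully stopped before a hidden sprinter crosses a 6-foot gap into your path works out to about 5 mph, which is close to the figure in the reply below.

```python
# Maximum speed at which you can be fully stopped before a hidden sprinter
# reaches your path. Sprint speed, reaction time, and braking are assumptions.

gap_ft = 6.0                  # lateral gap between your path and the parked cars
sprint_fps = 12.0             # a fast child
t_cross = gap_ft / sprint_fps # 0.5 s from "visible" to "in your path"

decel = 0.9 * 32.17           # ft/s^2, traction-limited braking
t_react = 0.25                # s, reaction + brake actuation

v_max_fps = max(t_cross - t_react, 0) * decel
print(round(v_max_fps * 3600 / 5280, 1))   # ~4.9 mph
```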

32 minutes agorecursive

If we adopted that level of risk, we'd have 5mph speed limits on every street with parking. As a society, we've decided that's overly cautious.

16 minutes agocrazygringo

> The situation of "kid running out between cars" will likely never be fully solved

Nuanced disagree (I agree with your physics), in that an element of the issue is design. Kids running out between cars is a product of streets that stack building --> yard --> sidewalk --> parked cars --> driving cars.

One simple change could be adding a chain-link fence / boundary between the parked cars and the driving cars, increasing visibility and reaction time.

an hour agomaerF0x0

How do you add a chain link fence between the parked and driving cars for on-street parking?

20 minutes agotoast0

There's still an inlet and outlet (kinda like hotel pickup/drop-off loops). It's not absolutely perfect, but it constrains the space where kids can dart out, from every parked car down to two places.

Also, the point isn't the specifics; the point is that the current design is not optimal, it's just the incumbent.

6 minutes agomaerF0x0

Oh I have no problem believing that this particular situation would have been handled better by a human. I just want hard figures saying that (say) this happens 100x more rarely with robotaxis than human drivers.

2 hours agoalkonaut

Orders of magnitude? Something like 100 people die on the road in the US each day. If self-driving tech could save 10 lives per day, that wouldn't be good enough?

2 hours agocriddell

"It depends". If 50 people die and 50 people go to jail, vs. 40 people die and their families are left wondering if someone will take responsibility? Then that's not immediately standing out as an improvement just because fewer died. We can do better I think. The problem is simply one of responsibility.

2 hours agoalkonaut

People don't usually go to jail. Unless the driver is drunk or there's some other level of provable criminal negligence (or someone actively trying to kill people by e.g. driving into a crowd of protesters they disagree with), it's just chalked up as an accident.

12 minutes agocrazygringo

If the current situation were that every day 40 people die but blame is rarely assigned, would you recommend a change where an additional 10 people die but someone is held responsible for those deaths?

44 minutes agocriddell

Do they go to jail?

That is not my experience here in the Bay Area. In fact here is a pretty typical recent example https://www.nbcbayarea.com/news/local/community-members-mour...

The driver cut in front of a person on an e-bike so fast that they couldn't react and hit the car. Then, after being hit, the driver stepped on the accelerator and went over the sidewalk on the other side of the road, killing a 4-year-old. No charges were filed.

This driver will be back on the street right away.

22 minutes agorenewiltord

Have you been in a self driving car? There are some quite annoying hiccups, but they are already very safe. I would say safer than the average driver. Defensive driving is the norm. I can think of many times where the car has avoided other dangerous drivers or oblivious pedestrians before I realized why it was taking action.

2 hours agojtrueb
[deleted]
2 hours ago

I generally agree the bar is high.

But, human drivers often face very little accountability. Even drunk and reckless drivers are often let off with a slap on the wrist. Even killing someone results in minimal consequences.

There is a very strong bias here. Everyone has to drive (in most of America), and people tend to see themselves in the driver. Revoking a license often means someone can’t get to work.

2 hours agolokar

That’s an incentive to reduce risk, but if you empirically show that the AV is even 10x safer, why wouldn’t you chalk that up as a win?

2 hours agocameldrv

Oddly I cannot decide if this is cause for damnation or celebration

Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

2 hours agoWarmWash

> Waymo hits a kid? Ban the tech immediately, obviously it needs more work.

> Waymo hits a kid? Well if it was a human driver the kid might well have been dead rather than bruised.

These can be true at the same time. Waymo is held to a significantly higher standard than human drivers.

2 hours agoFilligree

> Waymo is held to a significantly higher standard than human drivers.

They have to be, as a machine cannot be held accountable for a decision.

2 hours agomicromacrofoot

The promise of self-driving cars being safer than human drivers is also kind of the whole selling point of the technology.

2 hours agoTeMPOraL

Sure, but the companies building them are just shoving billions of dollars into their ears so they don't have to answer "who's responsible when it kills someone?"

3 minutes agomicromacrofoot

What? No? The main selling point is eliminating costs for a human driver (by enabling people to safely do other things from their car, like answering emails or doomscrolling, or via robotaxis).

21 minutes agomyrmidon

Alternate headline: Waymo saves child's life

4 minutes agoxnx

A human driver would most likely have killed this child. That's what should be on the ledger.

2 hours agobpodgursky

That's pretty hyperbolic. At less than 20 mph, car vs. pedestrian is unlikely to result in death. The IIHS says [1] in an article about other things:

> As far as fatalities were concerned, pedestrians struck at 20 mph had only a 1% chance of dying from their injuries

Certainly, being struck at 6 mph rather than 17 mph is likely to result in a much better outcome for the pedestrian, and that should not be minimized; although it is valuable to consider the situation (when we have sufficient information) and check Waymo's suggestion that the average human driver would also have struck the pedestrian, and at greater speed. That may or may not be accurate given the context of a busy school drop-off... many human drivers are extra cautious in that context and may not have reached that speed; depending on the end-to-end route, some human drivers would have avoided the street with the school altogether based on the time, etc. It certainly seems like a good result for the premise of a child unexpectedly appearing from between large parked vehicles, but maybe that appearance should have been expected.

[1] https://www.iihs.org/news/detail/vehicle-height-compounds-da...

15 minutes agotoast0

For me, the policy question I want answered is this: if this were a human driver, we would have a clear person to sue for liability and damages. With a computer, who is ultimately responsible when someone sues for compensation? Is it the company? An officer of the company? This creates a situation where a company can afford to bury litigants in costs just to bring a suit, whereas a private driver would lean on their insurance.

2 hours agogortok

Waymo hits you -> you seek relief from Waymo's insurance company. Waymo's insurance premiums go up. Waymo can weather a LOT of that. Business is still good. Thus, a poor financial feedback loop. No real skin in the game.

John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

NOW ... add criminal fault due to driving decision or state of vehicle ... John goes to jail. Waymo? Still making money in the large. I'd like to see more skin in their game.

13 minutes agoemptybits

> John Smith hits you -> you seek relief from John's insurance company. John's insurance premium goes up. He can't afford that. Thus, effective financial feedback loop. Real skin in the game.

John probably (at least where I live) does not have insurance. Maybe I could sue him, but he has no assets to speak of (especially if he is living out of his car), so I'm just going to pay a bunch of legal fees for nothing. He doesn't care, because he has no skin in the game. The state doesn't care either: they aren't going to throw him in jail or even take away his license (if he has one), and they aren't even going to impound his car.

Honestly, I'd much rather be hit by a Waymo than John.

8 minutes agoseanmcdirmid

So you're worried that instead of facing off against an insurance agency, the plaintiff would be facing off against a private company? Doesn't seem like a huge difference to me.

2 hours agojobs_throwaway

Is there actually any difference? I'd have thought that the self-driving car would need to be insured to be allowed on the road, so in both cases you're going up against the insurance company rather than the actual owner.

2 hours agoentuno

Personally I'm a lot more interested in kids not dying than in making income for injury lawyers. But that's just me.

2 hours agobpodgursky

Your comment implies that they are less interested in kids not dying. Nowhere do they say that.

2 hours agorationalist

I'm not interested in the policy question.

2 hours agobpodgursky

Then don't reply??

That still doesn't excuse trying to make them look bad.

2 hours agorationalist

It was a reply to my comment.

2 hours agobpodgursky

No, "the ledger" should record actual facts, and not whatever fictional alternatives we imagine.

24 minutes agoboothby

Fact: This child's life was saved by the car being driven by a computer program instead of a human.

22 minutes agodirewolf20

Instead of a human who was driving exactly the same as the Waymo up until the instant the child ran out. Important distinction.

6 minutes agoNoGravitas

Disagree; most human drivers would notice they are near an elementary school with kids coming and going and a crossing guard present, and would have been driving very carefully near blocked sight lines.

Better reporting would have asked real people the name of the elementary school, so we could see some pictures of the area. The link to NHTSA didn't point to the investigation, but it's under https://www.nhtsa.gov/search-safety-issues

"NHTSA is aware that the incident occurred within two blocks of a Santa Monica, CA elementary school during normal school drop off hours; that there were other children, a crossing guard, and several double-parked vehicles in the vicinity; and that the child ran across the street from behind a double parked SUV towards the school and was struck by the Waymo AV. Waymo reported that the child sustained minor injuries."

2 hours agoaxus

We're getting into hypotheticals, but I will say that in general I much, much prefer being around Waymos/Zooxes/etc. than humans when riding a bicycle.

We're impatient, emotional creatures. Sometimes when I'm on a bike, the bike lane merges onto the road for a stretch and I have no choice but to take up a lane. I've had people accelerate behind me and screech their tyres, stopping just short of my back wheel in a threatening manner, and then do it repeatedly as I ride the short distance in the lane before the bike lane re-opens.

To say "human drivers would notice they are near an elementary school" completely disregards the fuckwits that are out there on the road today. It disregards human nature. We've all seen people do shit like I describe above. It also disregards that every time I see an automated taxi, it seems to drive on the cautious side already.

Give me the unemotional, infinitely patient, drives-very-much-on-the-cautious-side automated taxi over humans any day.

2 hours agoAnotherGoodName

Would have. Could have. Should have.

Most humans would be halfway into the other lane after seeing kids near the street.

Apologists see something different than I do.

Perception.

2 hours agofrankharv
[deleted]
2 hours ago

Q: Why did the self-driving car cross the road?

A: It thought it saw a child on the other side.

2 hours agohenning

That's Tesla. Waymo seems mostly ok.

2 hours agodirewolf20

I’m actually pretty surprised Waymo as a general rule doesn’t completely avoid driving in school zones unless absolutely unavoidable.

Any accident is bad. But accidents involving children are especially bad.

2 hours agowhynotminot

That would be one hell of a convoluted route to avoid school zones. I wonder if it would even be possible for a large majority of routes, especially in residential areas.

2 hours agodylan604

It might not be possible for a lot of places — I don’t really know.

But I know when I drive, if it’s a route I’m familiar with, I’ll personally avoid school zones for this very reason: higher risk of catastrophe. But also it’s annoying to have to slow down so much.

Maybe this personal decision doesn’t really scale to all situations, but I’m surprised Waymo doesn’t attempt this. (Maybe they do and in this specific scenario it just wasn’t feasible)

2 hours agowhynotminot

Most people prefer the shortest ride. Circling around school zones would be the opposite of that. Rides are charged based on distance, so maybe this would interest Waymo, but one of the big complaints about taxi drivers was how drivers would "take them for a ride" to increase the fare.

an hour agodylan604

Seems like a solvable problem: make it clear on the app/interior car screens that a school zone is being avoided — I think most riders will understand this.

You also have to drive much more slowly in a school zone than you do on other routes, so depending on the detour, it may not even be that much longer of a drive.

At worst, maybe Waymo eats the cost difference involved in choosing a more expensive route. That hits the bottom line, but there's certainly also a business and reputational cost from "child hit by Waymo in school zone" in the headlines.

Again, this all seems very solvable.

an hour agowhynotminot

> The vehicle remained stopped, moved to the side of the road

How do you remain stopped but also move to the side of the road? That's a contradiction. Just like Cruise.

2 hours agojoshribakoff

My reading of that is that they mean it stopped the progression of the journey, rather than that it made no movement whatsoever.

2 hours agocallumgare

I agree, it’s poorly worded but I think that’s what they mean.

I also assume a human took over (called the police, moved the car, etc) once it hit the kid.

2 hours agolokar

They mean the vehicle didn't drive away. It moved to the side of the road and then stopped and waited.