from Hacker News

Police chief said Uber victim “came from the shadows” – don’t believe it

by drabiega on 3/23/18, 1:12 PM with 204 comments

  • by pwthornton on 3/23/18, 1:47 PM

    If Uber's self-driving cars can do no better than human visual acuity, they shouldn't be on the road. There is no point to this at all if we can't make robots better than humans.

    Self-driving cars should be able to use sensors that pick up stuff better than human eyes.

    I can't believe that radar and LIDAR were completely unable to see this woman until the visual cameras picked her up. This seems like either a serious flaw with the sensors or with software. The driver, despite looking down at a phone, still seemed to react faster than the computer!

    I'm not that comfortable with Uber doing self-driving cars. I don't really consider them a true tech company. They are not built on world-class engineering and design. They are largely a company built by getting around legal regulations and getting rid of staff workers. They've done incredible legal work with regulatory environments.

    A company like Google, I think they have the talent and the culture to build something really good here. A company like GM would understand the stakes at hand here and would be cautious. Uber just doesn't seem to have the talent, mission or ethics to be in the self-driving car business. It's no surprise they are the first company to kill a pedestrian.

  • by daveguy on 3/23/18, 1:50 PM

    I'm sure the Uber footage is either extremely low dynamic range (and not similar to human vision) or was modified to make it look darker.

    However, the comparison driving footage was filmed after the story broke. The moon on the 21st was at ~17% illumination. The night of the accident was the day after the new moon, at ~1% illumination. Someone needs to film the same section sometime between April 15 and April 17 to get a more accurate estimate of the light on that night. I expect the video will still show much brighter conditions than the suspiciously dark Uber video, but without the same moon lighting conditions we are comparing apples and oranges.

    (accident) http://www.moongiant.com/phase/3/18/2018

    (filmed) http://www.moongiant.com/phase/3/21/2018

    EDIT: masklinn pointed out the accident occurred before the stoplight, in the much more brightly lit area, not the later stretch where the news crew was filming and which is commented on in the second video. It was near the stoplight with the parking deck in the background, rather than the stretch with only widely spaced overhead lights. Incredibly misleading video from the dashcam. Maybe good to know for a defense of the true conditions, but it probably won't make that much difference. Look at the slideshow.

  • by antonkm on 3/23/18, 1:45 PM

    I think a big problem with this whole debacle is that we're discussing cameras and lighting, before discussing the actual matter: a person died and someone is responsible.

    If I were to run someone over in the dark, it wouldn't matter. I would still be accountable.

    The discussion about the footage shifts focus away from the actual issue, which should be a legal matter, not a technical debugging session.

    Edit: Sorry, I am not a native English speaker. What I meant when I wrote "accountable" was that I would have to own up to what happened, not that I would get a sentence if it was an accident.

  • by TheCapeGreek on 3/23/18, 1:28 PM

    Everyone who has seen the video says the camera is at fault for making the scene look so dark (even the street lights seem to have almost no effect anywhere), and regardless of that, AVs work with lidar and other sensors. Where is the telemetry and analysis? If the car didn't see the pedestrian, why not?
  • by Steltek on 3/23/18, 1:38 PM

    For reference, I found this video over on reddit (from /u/ghdana) of the area. It looks _very_ different with even a cell phone camera.

    https://www.youtube.com/watch?v=1XOVxSCG8u0

    Edit: I guess this video is also referenced in the article, after Brian's.

  • by marchenko on 3/23/18, 1:28 PM

    I'm not trying to be inflammatory, but should we expect the Tempe AZ City administration to be a neutral source of information in this case? Might they be biased by their decision to host this experiment in their jurisdiction (in the sense of anticipating criticism)?
  • by Humdeee on 3/23/18, 1:50 PM

    I'm surprised to hear a top police authority say this. I come from a family of police and it's always said that when telling your side of the story in a traffic incident, saying something along the lines of:

    "But Officer, they came out of nowhere!"

    never really holds up too well for you. It's the same as saying "I flat out didn't see them, so not my fault." It's quite the opposite in fact... and any cool-headed person will pick that up. I'm sure the investigation is well underway in any event.

  • by JustSomeNobody on 3/23/18, 2:33 PM

    https://www.nhtsa.gov/risky-driving/speeding

    > Speed also affects your safety even when you are driving at the speed limit but too fast for road conditions, such as during bad weather, when a road is under repair, or in an area at night that isn’t well lit.

    This vehicle was speeding. Uber is at fault. How is this disputable?

    Edit: One more link.

    https://www.123driving.com/dmv/drivers-handbook-speed-limits

    > You will need to drive with extra care at night. You cannot see as far ahead or to the side, and glare from oncoming cars can reduce your vision even more. Follow these guidelines for driving at night:

        Use your headlights (low beam or high beam) between the hours of sunset and sunrise.
        Low beam headlamps are only effective for speeds up to 20-25 MPH. You must use special care when driving faster than these speeds, since you are unable to detect pedestrians, bicyclists and others.
        High beam headlights can reveal objects up to a distance of at least 450 feet and are most effective for speeds faster than 25 MPH.
        Don't use high-beam headlights within 500 feet of oncoming vehicles.
        If you are behind other vehicles, use low beams when you are within 300 feet of the vehicle ahead.
        When leaving a brightly lit place, drive slowly until your eyes adjust to the darkness.
        If a vehicle comes toward you with high beams, flash your lights to high beam and back to low beam once.
        Don't look directly at oncoming headlights. Instead, watch the right edge of your lane. Look quickly to be sure of the other vehicle's position every few seconds.
        Drive as far to the right as you can if a vehicle with one light comes toward you.
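
    To put rough numbers on the low-beam point in the quote (these are my own ballpark assumptions: ~1.5 s perception-reaction time, ~15 ft/s^2 braking deceleration, and ~160 ft of useful low-beam reach, since the handbook above doesn't give a figure for low beams):

        # Back-of-the-envelope stopping distance vs. headlight reach (all assumed figures).
        REACTION_S   = 1.5     # assumed perception-reaction time, seconds
        DECEL_FT_S2  = 15.0    # assumed braking deceleration, ft/s^2
        LOW_BEAM_FT  = 160.0   # assumed useful low-beam reach; not from the handbook
        HIGH_BEAM_FT = 450.0   # from the quoted handbook text

        def stopping_distance_ft(mph):
            v = mph * 5280 / 3600                      # mph -> ft/s
            return v * REACTION_S + v ** 2 / (2 * DECEL_FT_S2)

        for mph in (25, 40):
            d = stopping_distance_ft(mph)
            print(f"{mph} mph: ~{d:.0f} ft to stop "
                  f"(low beam ~{LOW_BEAM_FT:.0f} ft, high beam ~{HIGH_BEAM_FT:.0f} ft)")

    With those assumptions, 25 mph stops in roughly 100 ft, well inside the low beams, while 40 mph needs roughly 200 ft and overruns them, which is the handbook's point about speed at night.
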
  • by jacksmith21006 on 3/23/18, 1:43 PM

    Would love to see how Google would have handled this situation. Have to believe their algorithms would be smarter and would have handled it.

    Remember Google had to deal with bike corner cases a couple of years ago.

    "A Cyclist's Track Stand Befuddled One of Google's Self-Driving Cars"

    https://gizmodo.com/a-cyclists-track-stand-totally-befuddled...

  • by avs733 on 3/23/18, 3:05 PM

    There is another aspect of this as well that is important to consider.

    I will 100% bet that the police had to interact with/rely on Uber to provide them with the video and data. With Uber's history, and the importance of this to the company, can we trust they didn't manipulate the video (i.e., make it darker)? In the end, as much distaste as I have for Uber, this isn't even about them... it's about process and chain of evidence.

    In a non-autonomous accident, if there were a fight between two opposing drivers, the vehicle manufacturer is a neutral party and it seems reasonable to rely on them to extract and hand over data. If, however, the fight were between the driver and the manufacturer (e.g., the driver asserted a failure of the vehicle), there is no way the manufacturer would be allowed to extract the data from the vehicle's data logger. It would be done by an independent third party.

    As these accidents happen (and no matter your perspective on this one, on cars, or on autonomous vs. human drivers, they will happen), there needs to be a methodology for extracting and assessing data from the vehicles that does not rely on the manufacturer or any other party with a conflict of interest.

    Further, this whole debacle shows how important it is that experts be involved in these decisions and discussions. The police chief has no knowledge of or experience with the technical details of autonomous vehicles. His statements have, from the beginning, been irresponsible and inappropriately deferential to Uber at the expense of a citizen of his town.

    The rush for LIDAR/radar technology makes clear just how much of autonomous tech relies on the nonvisual range. However, the chief, based on his statements, is obviously judging performance entirely on the visual range. He is not an expert. As an ex-Tempe resident, I doubt Tempe has independent autonomous vehicle experts. I would almost guarantee they are relying on the companies to be truthful and open with them.

    That is bad.

  • by dekhn on 3/23/18, 1:50 PM

    I'm gonna wait for the NTSB report because they are not armchair journalists citing misunderstood optics laws.
  • by devy on 3/23/18, 3:08 PM

    The takeaways from this tragic accident for me are:

    1. We need front cameras with much better low-light performance for self-driving cars and dashcams. Be it a dual sensor/lens setup or whatnot, if Apple can build one into the iPhone X for $35 [1], the technology is available and shippable now. Compared to the tens of thousands of dollars a car costs, and the hundreds or thousands that manufacturers charge for add-on packages and options, that's virtually nothing.

    2. We need better, more sophisticated headlamps and new laws better suited to today's technology [2] - something like a third, wide-beam mode on top of the high/low beam modes. From ubiquitous adaptive headlights to matrix laser beams, these improvements can increase driver visibility and perhaps save lives in situations like this.

    [1] http://www.businessinsider.com/iphone-x-teardown-parts-cost-...

    [2] https://blog.dupontregistry.com/mercedes-benz/why-are-adapti...

  • by logotype on 3/23/18, 2:01 PM

    Why do we only have the dashcam videos? I want to see the LIDAR data. Also, Uber definitely has ways to render all the sensor data into a video showing which objects were detected (or the lack of detection).
  • by 2aa07e2 on 3/23/18, 1:52 PM

    The driver wasn't looking at the road half the time! Who the hell drives like that?

    Jaywalking or not, the victim was clearly a pedestrian (not riding her bike); Uber's insurance (or Uber itself?) should compensate the victim's relatives. If it doesn't, I expect a huge backlash.

  • by anotheryou on 3/23/18, 1:46 PM

    Maybe they took the mid-range from an HDR* video and published it? That way they technically did not add contrast, but just conveniently dropped all the info in the shadows to "make it accessible".

    * not tone-mapped; I mean what HDR truly stands for
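
    A toy example of what I mean, with made-up radiance numbers and nothing to do with Uber's actual pipeline: pick one exposure window from linear HDR data, clip, and encode to 8-bit, and anything in the shadows simply disappears without any contrast ever being "added".

        import numpy as np

        # Hypothetical linear "HDR" scene, arbitrary radiance units (illustration only).
        streetlight = 5.0      # bright pool under a lamp
        pedestrian  = 0.002    # figure in the shadows, ~11 stops darker
        scene = np.array([streetlight, pedestrian])

        def publish(scene, exposure, gamma=2.2):
            """Export one exposure window to 8-bit: clip, gamma-encode, quantize.
            Nothing is darkened after the fact; shadow detail is simply discarded."""
            ldr = np.clip(scene * exposure, 0.0, 1.0)
            return np.round(255 * ldr ** (1.0 / gamma)).astype(int)

        print(publish(scene, exposure=0.2))   # exposed for the lamp: the figure lands a few codes above black
        print(publish(scene, exposure=20.0))  # longer exposure: the lamp blows out, the figure is clearly visible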

  • by thrillgore on 3/23/18, 2:00 PM

    Is Uber paying off the police chief to blame the victim?

    I'm with the majority here -- the LIDAR should have picked the pedestrian up and decelerated. Uber's self-driving tech is less safe than a human driver, and should be shelved.

  • by rplnt on 3/23/18, 2:01 PM

    There needs to be a process in place for autonomous car accidents. Required telemetry and data should be provided to an independent organization (financed by the manufacturers). If it deems that the car (hardware or software) was at fault, the whole fleet needs to be immediately disabled until the issue is resolved.

    A dead pedestrian here and there won't hurt the companies financially. There's not much reason for them to push for perfect products. They can just shoot for 99% and account for the rest in damages. Disabling their service will hurt them.

    This needs to be treated the way planes, trains, or medical devices are, not like a consumer device.

  • by typeformer on 3/23/18, 2:09 PM

    I think it was extremely irresponsible of the officers on the scene to imply that there was "nothing that could be done" or a lack of fault before a real investigation had even started!
  • by scrumper on 3/23/18, 3:12 PM

    The concept of a safety driver in SDC testing needs refining.

    The guy here was clearly distracted. He only looks up at the road a fraction of a second before the crash.

    My bet is that he'll be prosecuted as if he were a distracted driver: he's responsible for the safe operation of the vehicle (otherwise, why's he there in the first place?). He'll go to prison, probably, and rules will change very quickly as a result.

    Driving a car is an active process; it's fatiguing but not inherently boring (excepting really long, straight, empty roads through unchanging scenery). Sitting in an SDC while it drives itself, however, is boring: there's a well-researched and well-understood attention deficit problem which nobody seems to be discussing here.

    It's the same thing that makes TSA security screening such a tough thing to get right, or sentry duty: you can't expect humans to sit for hours passively monitoring for unpredictable and rare events that they then have to react decisively to. Brains don't work like that. These safety drivers need short shifts with frequent breaks, they need a partner, and they need an active background task that keeps their eyes up and forward and their brain engaged (giving a commentary, for example).

  • by pwaivers on 3/23/18, 1:53 PM

    They should release the LIDAR sensor data, so we can learn what the car "saw". This dashcam footage is basically irrelevant compared to the sensors on the car.
  • by justspamjustin on 3/23/18, 2:05 PM

    When I learned to drive, I was taught to slow down when you see that you’re coming up on a lower-visibility stretch of road. This could be a turn in the road, a change in slope that limits your view, or a darker part of the road. I still practice this precaution. And while the technology is in its infancy, drivers in an autonomous car should be just as attentive as a driver in any other car.
  • by mping on 3/23/18, 2:55 PM

    What seems horrible from a SW engineering point of view is that they probably didn't QA this scenario. I mean, it's not a corner case; it's basically the scenario where someone appears in front of the car "all of a sudden".

    I just hope engineers working on software that can seriously injure or kill someone do the appropriate due diligence to ensure this doesn't happen again.

  • by ironjunkie on 3/23/18, 1:55 PM

    I don't want to sound like a crazy person here, but could it be that the video released was edited to look super dark? The difference between the Uber video and the amateur cellphone videos is striking!

    Could it be that the police / Uber employees / municipality have a political goal to make this look like it was unavoidable? It looks that way to me.

  • by bitL on 3/23/18, 2:58 PM

    It's pretty common to ramp up contrast in the cameras used as self-driving inputs so that basic computer vision can make lane markers pop. So such a camera won't have much chance of detecting a person casually crossing the road without any concern for traffic. The crash points more to a failure in sensor fusion: the LiDAR should have detected her, but the clustering algorithm might have removed her, or she may have appeared in too few scanning frames to be reliably detected as an object, due to heat-map thresholding meant to avoid false positives.
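
    A minimal sketch of the kind of threshold I mean, using toy points and sklearn's DBSCAN as a stand-in (this is not Uber's stack, just an illustration of how a min-points setting tuned against false positives can drop a sparse but real object):

        import numpy as np
        from sklearn.cluster import DBSCAN

        # Made-up bird's-eye 2D slice of lidar returns (illustration only).
        rng = np.random.default_rng(1)
        car    = rng.normal(loc=[20.0, 3.0], scale=0.30, size=(40, 2))  # dense cluster: parked car
        person = rng.normal(loc=[12.0, -1.0], scale=0.15, size=(3, 2))  # only a few returns: distant pedestrian
        points = np.vstack([car, person])

        # min_samples plays the role of a false-positive threshold; set high enough,
        # the sparse-but-real returns get labeled -1 (noise) and never become an "object".
        labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(points)
        print("labels for the 3 pedestrian returns:", labels[-3:])      # expect [-1, -1, -1] here
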
  • by boyaka on 3/23/18, 2:13 PM

    Most of the illumination when I drive comes from my headlights. All of it that I need, no matter how dark or "shadowy" it is (give me a break).
  • by TearsInTheRain on 3/23/18, 5:06 PM

    A lot of people are focusing on why the car didn't stop or even slow down, but what about the human driver? I feel like this practice of having a human driver behind the wheel doesn't work as well as we might think it does. It is likely giving people and legislators a false sense of security if they think a human has the ability to intervene properly to stop an accident.
  • by cesarb on 3/23/18, 4:02 PM

    Is it possible that the released video was so dark because the dashcam was behind a tinted windshield? The other dashcam videos from the same place might be from cars with non-tinted windshields. Of course, any video cameras used as self-driving sensors would be outside the windshield, so their view would be clearer.
  • by jaragones on 3/23/18, 2:04 PM

    I wonder why they don't want to show the LIDAR "video"; I thought all autonomous vehicles use it. I agree that one cause of the accident could be that the LIDAR failed, but I wonder why this kind of technology doesn't have redundant systems. :/
  • by sylvinus on 3/23/18, 3:08 PM

    I understand people are focusing on the video because that's all we have at the moment, but it entirely misses the point: this is primarily a LIDAR failure. This should have been prevented even with the cameras off.
  • by sschueller on 3/23/18, 3:03 PM

    Maybe there should be some limits on when these vehicles may travel until they have proven they can deal with more difficult situations.

    Driving at night or in a snowstorm is a very different environment from driving on a nice day.

  • by lafar6502 on 3/23/18, 2:19 PM

    We’ll miss the good old days when road accidents were a matter settled between two humans on equal footing, not an individual vs. multi-billion-dollar corporations and their army of lawyers.
  • by dreta on 3/23/18, 3:18 PM

    If the car is at fault here, then Uber should bear the full consequences.

    At the same time, though, the person with the bike had no reflective markings, wasn’t blind or anything, and was crossing the road at an unmarked spot without any regard for oncoming traffic. They didn’t even flinch when the car was about to hit them. The guy in the car looked away for like 5 seconds. He clearly wasn’t absent-minded. If the visibility wasn’t that bad, any suspicious activity on the left would have grabbed his attention.

  • by perfmode on 3/23/18, 1:37 PM

    Where was the LIDAR?
  • by gamblor956 on 3/23/18, 1:43 PM

    The Uber victim did come from the shadows. Arizona gets dark at night, especially during a new moon. The question isn't why the optical cameras didn't see her; it's why the LIDAR failed to detect her.
  • by bhouston on 3/23/18, 1:36 PM

    The videos that show everything super bright look like they have their high beams on, whereas the Uber video looks like the standard headlight mode. Although I can't be completely sure. If that is the case, it's apples to oranges.