from Hacker News

Night Sight for Pixel phones

by mattbessey on 10/25/18, 3:41 PM with 131 comments

  • by dhruvp on 10/25/18, 7:55 PM

    If you're interested in learning more about this, the head of the Pixel team (Marc Levoy, professor emeritus at Stanford) has an entire lecture series from a class he ran at Google. The lectures are here, along with lecture notes: https://sites.google.com/site/marclevoylectures/home

    What's really cool is that you can see him talk about a lot of these ideas well before they made it into the Pixel phone.

  • by londons_explore on 10/25/18, 4:18 PM

    Prior work at Google Research before it made it into the product:

    https://ai.googleblog.com/2017/04/experimental-nighttime-pho...

    And by the original researcher in 2016:

    https://www.youtube.com/watch?v=S7lbnMd56Ys

  • by nkoren on 10/25/18, 9:40 PM

    What the Pixel cameras are doing is staggeringly good. My father is the founder of https://www.imatest.com/, and has a substantial collection of top-end cameras. He's probably in the top 0.0001% of image quality nerds. But most of the time, he's now entirely happy shooting on his Pixel.

  • by rdtsc on 10/25/18, 7:46 PM

    > Google says that its machine learning detects what objects are in the frame, and the camera is smart enough to know what color they are supposed to have.

    That is absolutely impressive.

    The color and text on the fire extinguishers along with the texture detail seen in the headphones in the last picture are just stunning. Congratulations to anyone who worked on this project!

  • by londons_explore on 10/25/18, 4:25 PM

    I would like super sensitive cameras like this to be used inside fridges to see the very faint glow of food going off.

    Chemical reactions from bacteria breaking down food produce light, but only enough for humans to see in the darkest of places (if you live in a city, you'll never encounter conditions dark enough).

    A camera simulating a 1 hour exposure time in a closed refrigerator ought to be able to see it pretty easily.
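
    As a toy illustration, "simulating a 1 hour exposure" can be as simple as summing many short frames from a static camera. A minimal numpy sketch (hypothetical code, not any real camera's pipeline):

      import numpy as np

      def simulate_long_exposure(frames):
          # Average short exposures in float to avoid clipping; with a
          # static scene (a closed fridge), 3600 one-second frames behave
          # like a single one-hour exposure, lifting faint glows above
          # the per-frame noise.
          acc = np.zeros_like(frames[0], dtype=np.float64)
          for frame in frames:
              acc += frame
          return acc / len(frames)  # rescale/tone-map before display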

  • by fuddle on 10/25/18, 11:06 PM

    This reminds me of a similar project: "Learning to See in the Dark". They used a fully convolutional network trained on short-exposure night-time images and corresponding long-exposure reference images. Their results look quite similar to the Pixel photos.

    http://cchen156.web.engr.illinois.edu/SID.html
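
    For the curious, the training setup in that paper boils down to supervised regression from short to long exposures. A heavily simplified PyTorch sketch with synthetic stand-in data (a toy conv net in place of the paper's U-Net, which operates on packed raw Bayer input):

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      # Toy stand-in for the paper's U-Net: 4-channel raw-ish input, RGB out.
      model = nn.Sequential(
          nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
          nn.Conv2d(32, 3, 3, padding=1),
      )
      opt = torch.optim.Adam(model.parameters(), lr=1e-4)
      ratio = 100.0  # long/short exposure ratio used to scale the input

      # Synthetic (short, long) pair; real training uses captured image pairs.
      pairs = [(torch.rand(1, 4, 64, 64) * 0.01, torch.rand(1, 3, 64, 64))]
      for short_raw, long_rgb in pairs:
          pred = model(short_raw * ratio)
          loss = F.l1_loss(pred, long_rgb)  # L1 against the long exposure
          opt.zero_grad()
          loss.backward()
          opt.step()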

  • by londons_explore on 10/25/18, 4:23 PM

    It's notable that this 'accumulation' method effectively lets you have a near-infinite exposure time, as long as objects in the video frame are trackable (i.e. there is sufficient light in each frame to see at least something).

    I'd be interested to see how night mode performs when objects in the frame are moving (it should work fine, since it will track the object), or changing (for example, turning pages of a book - I wouldn't expect it to work in that case).
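
    A crude sketch of that align-then-accumulate idea, assuming OpenCV and a single global motion model (the real merge is tile-based and much more robust to local motion):

      import cv2
      import numpy as np

      def align_and_average(frames):
          # Align every frame to the first, then average to cut noise.
          ref = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
          acc = frames[0].astype(np.float64)
          criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT,
                      50, 1e-6)
          for frame in frames[1:]:
              gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
              warp = np.eye(2, 3, dtype=np.float32)
              # Estimate a global translation+rotation to the reference.
              _, warp = cv2.findTransformECC(ref, gray, warp,
                                             cv2.MOTION_EUCLIDEAN,
                                             criteria, None, 5)
              acc += cv2.warpAffine(
                  frame, warp, (frame.shape[1], frame.shape[0]),
                  flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
          return (acc / len(frames)).astype(np.uint8)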

  • by dannyw on 10/26/18, 4:59 AM

    Damn. I've had an iPhone since the 3G, but this is really tempting me to get a Pixel.

  • by whoisjuan on 10/26/18, 12:39 AM

    Damn! That's honestly impressive! I started reading thinking it was going to be a simple brightness-up kind of thing, but it's incredible how they're able to recreate the whole photograph from an initially dark raw input.

    I have to imagine the sensor is doing an extra but imperceptible long exposure that is then used to correct the lighting of the dark version.

  • by tjr225 on 10/26/18, 4:57 AM

    This might be a weird criticism, but... making photos taken in the dark look like they are not actually dark seems like a strange thing to do? I've struggled with my Micro Four Thirds camera to capture accurate night photographs, but the last thing I wanted was for them to be brighter than I perceived them to be.

    That said, the effect of some of these photographs is striking, and I'm sure the tech is interesting.

  • by pavel_lishin on 10/25/18, 9:12 PM

    Prepare for a lot of cute sleeping baby photos on your feeds, folks. That's what I'll be using this for.

  • by woolvalley on 10/26/18, 1:03 AM

    Now if only we could get this on APS-C and 1" compacts like the Sony RX100 or Fujifilm XF10, with first-class smartphone integration and networking.

  • by lostmsu on 10/25/18, 7:46 PM

    Any hardware reason for it to only work on Pixel phones?

  • by Erwin on 10/25/18, 8:31 PM

    The Huawei P20 shipped in April with this feature -- I look forward to DxOMark's analysis of the Pixel 3 compared to the P20, which currently remains on top: https://www.dxomark.com/category/mobile-reviews/

    Coming from a three-year-old Samsung S6, where I could almost watch the battery percentage drop percent by percent, the P20 Pro's 4000 mAh battery has been great (too bad wireless charging didn't appear until the new Mate 20 Pro).

  • by gingerbread-man on 10/26/18, 2:06 AM

    Kind of a tangent, but it was really cool to see a picture of the author's Schiit Jotunheim headphone amp in the article. One of the founders wrote an amazing book on building a hardware startup: http://lucasbosch.de/schiit/jason-stoddard-shiit-happened-ta....

  • by bwang29 on 10/25/18, 10:00 PM

    The biggest technical challenge here is getting the gyroscope data to work together with the stacking algorithm. It's hard to tune the gyro to work well on any given phone, and a pure software solution that analyzes the perspective transformation would be too slow.
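
    For a purely rotating camera the geometry is simple: two frames are related by the homography H = K R K^-1, where K is the intrinsic matrix and R is the rotation integrated from gyro samples. A hedged numpy sketch with made-up intrinsics (real pipelines also deal with rolling shutter and gyro/frame timing offsets):

      import numpy as np
      import cv2

      def gyro_homography(K, R):
          # Homography induced by a pure rotation R (3x3) with intrinsics K.
          # Valid when the camera translates negligibly between frames.
          return K @ R @ np.linalg.inv(K)

      # Hypothetical intrinsics: 1000 px focal length, 640x480 sensor.
      K = np.array([[1000.0,    0.0, 320.0],
                    [   0.0, 1000.0, 240.0],
                    [   0.0,    0.0,   1.0]])
      # Small rotation about the y axis, integrated from the gyro.
      R, _ = cv2.Rodrigues(np.array([0.0, 0.01, 0.0]))
      H = gyro_homography(K, R)
      # aligned = cv2.warpPerspective(frame, H, (640, 480))
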
  • by jakobegger on 10/25/18, 10:10 PM

    All those shots look amazing, but they're of stationary objects.

    I really want to know how that works for people! 99% of photos I take are of people, and the lighting is always bad.

    Are there any photos of people?

  • by polskibus on 10/25/18, 8:29 PM

    How does the Pixel's implementation of low-light photography differ from Samsung's? Are they comparable in photo quality?

  • by swaggyBoatswain on 10/25/18, 8:48 PM

    Wouldn't video still be extremely blurry? This is mostly for things that aren't moving / still pictures.

    I wonder if this technology will eventually supersede military night vision goggles. The ability to add color perception at long distances could be useful for identifying things at night.

  • by golfer on 10/25/18, 9:03 PM

    Why not use the actual article title?

    "Google’s Night Sight for Pixel phones will amaze you"

  • by hammock on 10/25/18, 10:53 PM

    How are you going to do a review of Night Sight and not even go outside? Every photo was taken in a room with the lights turned off. Come on, man. Tell your editor he needs to wait until nightfall.

  • by yanonymous2 on 10/26/18, 2:07 PM

    That's great, but we should find a different name for "photos" that alter the image content in the process.

  • by endorphone on 10/25/18, 11:28 PM

    Interesting, but a tad rich with puffery.

    Pre-OIS, Google did this with image stacking, which was a crude version of a long exposure (stacking many short-exposure photos and correcting the offsets via the gyro was necessary to compensate for inevitable camera shake). There is nothing new or novel about image stacking or long exposures.

    What are they doing here? Most likely it's simply enabling OIS with longer exposures than normal (note the smooth motion blur of moving objects, which is nothing more than a long exposure), and then doing noise removal. No camera maker is flipping their desk over this. It is usually a hidden "pro" feature because in the real world subjects move during a long exposure and shooters are unhappy with the result.
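
    The underlying math is mundane, too: averaging N independently noisy frames cuts the noise by a factor of sqrt(N). A toy numpy check with synthetic data:

      import numpy as np

      rng = np.random.default_rng(0)
      signal, sigma, n = 10.0, 5.0, 16
      frames = signal + rng.normal(0.0, sigma, size=(n, 100_000))

      print(frames[0].std())            # ~5.0  (one short exposure)
      print(frames.mean(axis=0).std())  # ~1.25 (5.0 / sqrt(16), stacked)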

    The contrived hype around the Pixel's "computational photography" (which seems more incredible in theory than in the real world) has reached an absurd level, and the astroturfing only makes it worse.