from Hacker News

Earth rotation limits in-body image stabilization to 6.3 stops (2020)

by pwnna on 5/16/24, 3:17 AM with 105 comments

  • by eggy on 5/16/24, 2:23 PM

    Well, if we're nitpicking here, it is not 86,400 s/day (24 hours * 3600 s/hour) and 7.27x10^-5 radians/s, but 86,164.091 s and 7.29x10^-5 radians/s.

    24 hours is the time it takes the Sun to return to the same spot in the sky: the Earth has to rotate for another 3m56s to make up the angle gained by revolving around the Sun in the same direction as its rotation. The same applies to the other planets that rotate and revolve in the same direction - Mercury, Mars, Jupiter, Saturn, and Neptune. A sidereal day - the time for distant stars to return to the same spot in the sky - is 23h 56m 4.091s.

    Damn, I knew that is why I botched my 6-stop exposure at my daughter's graduation! She can't blame me now! Thank you HN!
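
    A quick numerical check of those figures, as a minimal Python sketch:

      import math

      solar_day = 24 * 3600                # 86,400 s
      sidereal_day = 86164.091             # s, as quoted above

      print(2 * math.pi / solar_day)       # ~7.272e-5 rad/s
      print(2 * math.pi / sidereal_day)    # ~7.292e-5 rad/s
      print(solar_day - sidereal_day)      # ~235.9 s, i.e. about 3m56s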

  • by t0mas88 on 5/16/24, 9:34 AM

    You don't need GPS to figure out the correction for this. Inertial navigation systems in aircraft (which use very stabilised platforms with a lot of math involved) worked before GPS was available.

    It helps to have a rough indication of the current latitude on startup, but you can also figure it out from the gyro outputs. Just takes longer.

    With modern sensors (solid-state laser gyroscopes) it has all become a lot smaller, so if you really want to, you can do this in a camera. It's just probably going to be too expensive for what it brings, because 6+ stops of stabilisation is a lot already.
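
    For the curious, a minimal sketch of that startup latitude estimate (not any real system's code; the example values are invented): a stationary gyro sees only Earth's rotation vector, and an accelerometer gives local "up", so their dot product is Omega * sin(latitude).

      import numpy as np

      OMEGA_E = 7.292115e-5  # Earth rotation rate, rad/s

      def latitude_from_imu(gyro, accel):
          """gyro: mean gyro vector while stationary (rad/s);
             accel: mean accelerometer vector while stationary (m/s^2)."""
          up = accel / np.linalg.norm(accel)    # accel measures specific force, i.e. "up"
          vertical_rate = np.dot(gyro, up)      # Omega * sin(latitude)
          return np.degrees(np.arcsin(np.clip(vertical_rate / OMEGA_E, -1, 1)))

      # Example: device level at ~45 deg N, axes aligned east/north/up:
      lat = np.radians(45)
      gyro = OMEGA_E * np.array([0.0, np.cos(lat), np.sin(lat)])
      print(latitude_from_imu(gyro, np.array([0.0, 0.0, 9.81])))  # ~45.0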

  • by isoprophlex on 5/16/24, 8:39 AM

    > The second solution is much more plausible, but still very difficult. The user would have to be pointing the camera at the subject for long enough such that the drift in their aim at the subject is smaller than the drift from the rotation of the earth. This is also implausible. What is concerning though, is that this second method is one that could work very well to cancel out Earth’s rotation on the CIPA specified stabilization test apparatus.

    So, basically dieselgate but for image stabilization

  • by moi2388 on 5/16/24, 12:40 PM

    Can somebody ELI5 this to me?

    The image with the two Earths... that only works if the camera is not also on the ground, but it is? How is the rotation of the object and the camera not identical? Why would it rotate ‘upwards’?

    Also, if the issue is relative motion or rotation between camera and object, wouldn't two sensors, one on the camera and one on the subject, be able to solve this, since we can see whether their rotations/movements match up or not?

  • by DoctorOetker on 5/16/24, 8:55 AM

    This can be fixed in software:

    You can back-calculate orientations from high-pass-filtered gyro data, use them to rotate the unfiltered gyro data into the current reference frame, then low-pass the unfiltered but rotation-corrected gyro data to get the Earth's rotation axis in that frame. From there you can estimate the expected rotation that should be ignored.
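
    A minimal sketch of that pipeline as I read it (crossover frequency is invented; assumes scipy):

      import numpy as np
      from scipy.spatial.transform import Rotation as R

      def estimate_earth_rate(gyro, dt, fc=0.05):
          """gyro: (N, 3) raw samples in rad/s; dt: sample period (s);
             fc: crossover (Hz) separating hand shake from slow drift."""
          alpha = dt / (dt + 1 / (2 * np.pi * fc))   # one-pole low-pass coefficient
          slow = np.zeros(3)                         # low-passed rate in body frame
          att = R.identity()                         # body-to-reference attitude
          earth = np.zeros(3)                        # low-passed rate in reference frame
          for w in gyro:
              slow += alpha * (w - slow)
              att = att * R.from_rotvec((w - slow) * dt)  # integrate the fast (shake) part only
              earth += alpha * (att.apply(w) - earth)     # rotation-corrected slow component
          return earth                               # ~ Earth rotation vector, rad/s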

  • by chris_va on 5/16/24, 6:14 PM

    Solution (2) as written seems to imply that the camera can only use the gyroscope signal while the camera is pointed at the subject, but I cannot see why that is a strong limitation.

    In theory, you can take the last N seconds of data from the gyroscope (I assume it is running while the camera is active) to get the overall drift, even if the camera is tumbling around for a while before being pointed at the subject... assuming the tumbling includes enough periods that are correlated with the Earth's rotation (e.g. someone carrying it, not pointing it at an aircraft or something moving east-west for the window duration in a way that is anticorrelated with the rotation).
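
    For illustration, the windowed version of that estimate might look like this (the window length is an arbitrary guess):

      import numpy as np

      def rolling_drift(gyro, dt, window_s=30.0):
          """gyro: (N, 3) samples in rad/s. Returns a per-sample slow-drift
             estimate (gyro bias plus Earth rotation), meaningful only while
             the camera's orientation is roughly steady over the window."""
          n = max(1, int(window_s / dt))
          kernel = np.ones(n) / n
          return np.stack([np.convolve(gyro[:, i], kernel, mode="same")
                           for i in range(3)], axis=1)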

  • by Asraelite on 5/16/24, 10:14 AM

    > The first isn’t a good solution for many reasons. Don’t have GPS signal? Shooting next to a magnet? Your system won’t work.

    These seem trivial to work around. Just store the last known position and use that. It's rare that you'll be without a GPS signal or beside a magnet, and you certainly won't be traveling long distances in those conditions. And since when do magnets block GPS signals?

  • by _ph_ on 5/16/24, 3:38 PM

    Version 2 sounds to me like the probable reason cameras like the OM-1 Mark II can go over 8 stops. Yes, it is probably not a simple task to measure the Earth's drift with the gyroscopes, but there is one thing that might help: the frequency of that drift is exactly known - it is the rotation rate of the Earth. So it should be possible to tune a very narrow filter to that frequency and analyze the gyroscope signal only at that component. With that, one could at least partially compensate for the drift.
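
    One way to read that: it isn't so much a frequency as a known constant magnitude, so a low-passed gyro estimate could be pinned to it. A hypothetical sketch:

      import numpy as np

      OMEGA_E = 7.292115e-5  # rad/s, known exactly

      def pin_to_earth_rate(slow_component):
          """slow_component: (3,) low-passed gyro vector (bias + Earth rate).
             Returns the nearest vector with exactly the Earth-rate magnitude,
             assuming the residual gyro bias is small by comparison."""
          return OMEGA_E * slow_component / np.linalg.norm(slow_component)
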
  • by sib on 5/16/24, 4:35 PM

    Nikon claims 8.0 stops of "VR image stabilization" for their Zf camera (released late in 2023).

    https://www.nikonusa.com/p/z-f/1761/overview

    ("Based on CIPA standards; when using the telephoto end of the NIKKOR Z 24-120mm f/4 S" - for clarity, that lens does not have optical VR in the lens itself, so this is all based on in-body stabilization.)

  • by GuB-42 on 5/16/24, 11:24 AM

    On the other hand, that should be awesome for astrophotography.

  • by mrandish on 5/16/24, 6:59 PM

    Perhaps in some camera firmware bug database there's a closed bug marked: "Won't fix. Tested working in orbit."

  • by cesaref on 5/16/24, 4:06 PM

    This is analogous to the astrophotography problem of keeping stars as points rather than blurred lines in long exposures. If a long exposure at night has a static landscape but moving stars, the IBIS equivalent would have static stars and a moving landscape :)

  • by Delmololo on 5/16/24, 1:58 PM

    You should be able to calibrate it out by telling the user to press a button and then not rotate the camera away.

    Right?

    Might just not be practical at all.

    On the other hand, shouldn't the Earth rotate fast enough to figure this out in a short timeframe while the photographer starts looking through the finder?
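
    Back of the envelope for how much signal accumulates while framing (pure arithmetic):

      import math

      OMEGA_E = 7.292115e-5                        # rad/s
      for t in (1, 2, 5):                          # seconds of holding still
          arcsec = math.degrees(OMEGA_E * t) * 3600
          print(f"{t} s -> {arcsec:.1f} arcsec")   # ~15 arcsec per second

    About 15 arcseconds per second - small, but it accumulates linearly while hand shake averages toward zero, which is what makes the idea plausible at all.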

  • by somat on 5/17/24, 12:29 AM

    Why not stabilize optically?

    I am probably missing something huge. But if the goal is a stable image, why use gyros? Use the image itself to apply the correction factor to the final integration, sort of the same way videos are stabilized.
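
    A toy version of that image-based approach, using phase correlation to align frames before stacking (illustrative only; sign conventions and windowing are glossed over):

      import cv2
      import numpy as np

      def align_and_stack(frames):
          """frames: list of equal-size float32 grayscale images."""
          ref = frames[0]
          acc = ref.astype(np.float64)
          for f in frames[1:]:
              (dx, dy), _ = cv2.phaseCorrelate(ref, f)     # sub-pixel shift vs. reference
              m = np.float32([[1, 0, -dx], [0, 1, -dy]])   # translate the frame back
              acc += cv2.warpAffine(f, m, (f.shape[1], f.shape[0]))
          return (acc / len(frames)).astype(np.float32)

    The usual answer to "why gyros" is that they work during a single long exposure and in the dark, where there is no sharp inter-frame image to correlate against.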

  • by gwill on 5/16/24, 2:17 PM

    I'm curious how the OM-1 MK2 gets around this to achieve 8.5 stops.

    https://explore.omsystem.com/us/en/om-1-mark-ii

  • by tetris11 on 5/16/24, 7:23 AM

    I still don't quite follow the explanation. The duck and I are on the surface of the same body and are rotating together, maintaining a constant distance... why does Earth rotation need to be corrected for?

  • by aidenn0 on 5/16/24, 5:09 PM

    You should be able to exceed 6.3 stops if you are pointing north/south rather than east/west, right? Maybe they are just measuring it pointing north/south.

  • by quonn on 5/16/24, 7:43 AM

    Would it be possible to correct for the rotation by counter rotating if the orientation of the camera is known (or determined by GPS + compass)?
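
    A sketch of what that correction could look like (ENU frame; the attitude is assumed to come from GPS + compass + level):

      import numpy as np
      from scipy.spatial.transform import Rotation as R

      OMEGA_E = 7.292115e-5  # rad/s

      def earth_rate_in_camera(lat_deg, enu_to_camera: R):
          """Earth rotation vector in the camera frame, to subtract
             from the raw gyro signal."""
          lat = np.radians(lat_deg)
          omega_enu = OMEGA_E * np.array([0.0, np.cos(lat), np.sin(lat)])  # East, North, Up
          return enu_to_camera.apply(omega_enu)

      # corrected = raw_gyro - earth_rate_in_camera(lat, attitude)
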
  • by mikewarot on 5/16/24, 7:14 PM

    Bullshit. It's ITAR: they don't want parts floating around in the world that can make a dead-nuts-accurate INS (inertial navigation system), as that enables weapons we don't want in the wild.

    You can stabilize out everything and account for the rotation by simply watching the vector of gravity over time.

  • by kybernetyk on 5/17/24, 11:48 AM

    Nikon has 8 stops, so they somehow beat physics.

  • by kqr on 5/16/24, 7:08 AM

    6.3 stops is a lot, though. That's basically the fully usable aperture range of a kit zoom lens.

  • by imglorp on 5/16/24, 1:33 PM

    Is a plain phone gyroscope enough to detect Earth rotation? Is there an app for that?
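
    For scale (datasheet units are usually deg/hour):

      import math
      print(math.degrees(7.292115e-5) * 3600)   # Earth rate ~ 15.04 deg/hour

    Consumer MEMS gyros typically quote bias instability anywhere from a few to tens of deg/hour depending on the part, so Earth rate sits right around a phone gyro's noise floor - detectable with long, careful averaging, not trivially.
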
  • by pixelpoet on 5/16/24, 4:03 PM

    Yet another example of b0rked / unescaped TeX, specifically log vs \log in this case. Blows my mind that nobody sees it...