by pwnna on 5/16/24, 3:17 AM with 105 comments
by eggy on 5/16/24, 2:23 PM
24 hours is the time it takes the Sun to return to the same spot in the sky: the Earth has to rotate for roughly another 3m56s to make up for the angle it gains by revolving around the Sun in the same direction as its rotation. The same applies to the other planets whose rotation and revolution share a direction - Mercury, Mars, Jupiter, Saturn, and Neptune. A sidereal day, 23h 56m 4.091s, is the time it takes for distant stars to return to the same spot in the sky.
Damn, I knew that's why I botched my 6-stop exposure at my daughter's graduation! She can't blame me now! Thank you HN!
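Quick back-of-envelope check of those numbers (my own sketch, not from the article): the solar day exceeds the sidereal day by roughly the angle the Earth advances along its orbit each day, about 360/365.25 degrees.

    sidereal_day_s = 23 * 3600 + 56 * 60 + 4.091   # 86164.091 s
    solar_day_s = 24 * 3600                        # 86400 s
    extra_s = solar_day_s - sidereal_day_s
    print(extra_s)                                 # ~235.9 s, i.e. about 3m56s
    print(extra_s / sidereal_day_s * 360)          # ~0.986 deg, roughly 360/365.25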
by t0mas88 on 5/16/24, 9:34 AM
It helps to have a rough indication of the current latitude on startup, but you can also figure it out from the gyro outputs. Just takes longer.
With modern sensors (solid-state laser gyroscopes) it has all become a lot smaller, so if you really want to you can do this in a camera. It's just probably going to be too expensive for what it brings, because 6+ stops of stabilisation is a lot already.
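For the curious, the gyrocompassing idea described above looks roughly like this (a minimal sketch with invented names, not any camera's actual firmware): average the gyro and accelerometer while the unit is held still, take "up" from the accelerometer, and the vertical component of the measured rotation is Omega * sin(latitude).

    import numpy as np

    EARTH_RATE = 7.292115e-5  # rad/s, one revolution per sidereal day

    def latitude_from_imu(omega_mean, accel_mean):
        # omega_mean: averaged gyro reading (rad/s), accel_mean: averaged
        # accelerometer reading (m/s^2), both 3-vectors in the sensor frame,
        # captured while the unit is held still.
        up = accel_mean / np.linalg.norm(accel_mean)  # at rest the accelerometer reads +g, pointing up
        vertical_rate = np.dot(omega_mean, up)        # Omega * sin(latitude)
        return np.degrees(np.arcsin(np.clip(vertical_rate / EARTH_RATE, -1.0, 1.0)))

This only works when the gyro bias sits well below Earth's rate (~0.004 deg/s), which is why it's ring-laser/fibre-optic territory rather than phone-grade MEMS.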
by isoprophlex on 5/16/24, 8:39 AM
So, basically dieselgate but for image stabilization
by moi2388 on 5/16/24, 12:40 PM
The image with the two Earths... that only works if the camera is not also on the ground, but it is? How is the rotation of the object and the camera not identical? Why would it rotate ‘upwards’?
Also, if the issue is relative motion or rotation between camera and object, wouldn't two sensors, one on the camera and one on the subject, be able to solve this, since we can see whether their rotations/movements match up or not?
by DoctorOetker on 5/16/24, 8:55 AM
You can back-calculate orientations from high-pass filtered gyro data, use those to rotate the unfiltered gyro data into the current reference frame, then low-pass the unfiltered but rotation-corrected gyro data to get the Earth's rotation axis in the current reference frame. From that one can estimate the expected rotation that should be ignored.
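A rough sketch of that pipeline, assuming an offline (N, 3) array of gyro samples `omega` in rad/s sampled at `fs` Hz (the function and variable names are made up for illustration; a real stabiliser would do this incrementally):

    import numpy as np
    from scipy.signal import butter, filtfilt
    from scipy.spatial.transform import Rotation as R

    def earth_rate_in_sensor_frame(omega, fs, cutoff_hz=0.01):
        dt = 1.0 / fs

        # 1. High-pass so only handshake/pan motion remains.
        b, a = butter(2, cutoff_hz, btype="high", fs=fs)
        omega_hp = filtfilt(b, a, omega, axis=0)

        # 2. Back-calculate orientation by integrating the high-passed rates.
        rots = [R.identity()]
        for w in omega_hp[:-1]:
            rots.append(rots[-1] * R.from_rotvec(w * dt))

        # 3. Rotate the *unfiltered* samples into the common starting frame.
        omega_fixed = np.stack([r.apply(w) for r, w in zip(rots, omega)])

        # 4. Low-pass (here just the mean) the corrected data: the DC part is
        #    Earth's rotation plus gyro bias, expressed in the starting frame.
        earth_axis_start = omega_fixed.mean(axis=0)

        # 5. Bring it back into the current sensor frame so it can be
        #    subtracted from incoming gyro readings.
        return rots[-1].inv().apply(earth_axis_start)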
by chris_va on 5/16/24, 6:14 PM
In theory, you can take the last N seconds of data from the gyroscope (I assume it is running while the camera is active) to get the overall drift, even if it is tumbling around for a while before being pointed at the subject... Assuming the tumbling has enough periods of time that are correlated with the earth's rotation (e.g. someone carrying it, not pointing it at an aircraft or something moving EW for the window duration that is anticorrelated with the rotation).
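A back-of-envelope version of that windowing idea (hypothetical names, same assumed `omega`/`fs` layout as the sketch above): average the last N seconds and sanity-check the magnitude against Earth's rate.

    import numpy as np

    EARTH_RATE = 7.292e-5  # rad/s, about 15 degrees per hour

    def drift_estimate(omega, fs, window_s=30.0):
        window = omega[-int(window_s * fs):]  # last N seconds of samples
        mean = window.mean(axis=0)            # handshake largely averages out
        # If the mean is far above Earth's rate, the window was dominated by
        # real camera motion (the anticorrelated panning case above) and the
        # estimate should be discarded rather than compensated for.
        usable = np.linalg.norm(mean) < 10 * EARTH_RATE
        return mean, usable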
by Asraelite on 5/16/24, 10:14 AM
These seem trivial to work around. Just store the last known position and use that. It's rare that you'll be without a GPS signal or beside a magnet, and you certainly won't be traveling long distances in those conditions. And since when do magnets block GPS signals?
by _ph_ on 5/16/24, 3:38 PM
by sib on 5/16/24, 4:35 PM
https://www.nikonusa.com/p/z-f/1761/overview
("Based on CIPA standards; when using the telephoto end of the NIKKOR Z 24-120mm f/4 S" - for clarity, that lens does not have optical VR in the lens itself, so this is all based on in-body stabilization.)
by GuB-42 on 5/16/24, 11:24 AM
by mrandish on 5/16/24, 6:59 PM
by cesaref on 5/16/24, 4:06 PM
by Delmololo on 5/16/24, 1:58 PM
Right?
Might just not be practical at all.
On the other hand, shouldn't the Earth's rotation be fast enough to measure in the short timeframe while the photographer starts looking through the finder?
by somat on 5/17/24, 12:29 AM
I am probably missing something huge, but if the goal is a stable image, why use gyros? Use the image itself to apply the correction factor to the final integration, sort of the same way videos are stabilized.
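Image-based ("electronic") stabilisation does exist, and a loose sketch of it looks like this: estimate the frame-to-frame shift from the pixels with phase correlation and undo it before stacking (names are illustrative; real pipelines also handle rotation and rolling shutter).

    import cv2
    import numpy as np

    def stack_stabilised(frames):
        # frames: list of same-size grayscale float32 images
        ref = frames[0]
        acc = ref.copy()
        for frame in frames[1:]:
            # Phase correlation estimates the (dx, dy) translation of this
            # frame relative to the reference.
            (dx, dy), _response = cv2.phaseCorrelate(ref, frame)
            # Shift the frame back by that amount before accumulating.
            m = np.float32([[1, 0, -dx], [0, 1, -dy]])
            aligned = cv2.warpAffine(frame, m, frame.shape[::-1])
            acc += aligned
        return acc / len(frames)

The usual reason cameras lean on gyros instead is that in the low-light, long-exposure cases where stabilisation matters most, there may be no sharp intermediate frames to correlate.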
by gwill on 5/16/24, 2:17 PM
by tetris11 on 5/16/24, 7:23 AM
by aidenn0 on 5/16/24, 5:09 PM
by quonn on 5/16/24, 7:43 AM
by mikewarot on 5/16/24, 7:14 PM
You can stabilize out everything and account for the rotation by simply watching the vector of gravity over time.
by kybernetyk on 5/17/24, 11:48 AM
by kqr on 5/16/24, 7:08 AM
by imglorp on 5/16/24, 1:33 PM
by pixelpoet on 5/16/24, 4:03 PM