this post was submitted on 10 Nov 2023
Photography
A place to politely discuss the tools, technique and culture of photography.
This is not a good place to simply share cool photos/videos or promote your own work and projects, but rather a place to discuss photography as an art and post things that would be of interest to other photographers.
A depth of field calculator is a good start.
But I've found more success with Merklinger's method, which takes into account the blur of objects in real life instead of on the sensor. So it is an "object space" method instead of an "image space" method for determining depth of field.
Basically, for wide, deep landscapes where infinity needs to be sharp, you just focus at infinity. Easy, right?
Then you identify the smallest object in the foreground of your scene that you want just barely resolved. Suppose this is a blade of grass 5 mm wide. As it happens, the diameter of the entrance pupil puts a lower limit on the size of objects that can be resolved anywhere in the scene when the lens is focused at infinity, and the entrance pupil diameter is, by definition, the focal length divided by the f-number. So you divide the focal length by the size of the object you want barely resolved, and that gives you the f-number you need. If you are using a 50 mm lens and want to barely resolve a blade 5 mm wide, you calculate 50 mm / 5 mm = 10, i.e. f/10. This is easy enough to do in your head. And it holds no matter how near or far the object is from the camera; you don't have to calculate distances at all, and it doesn't matter what camera or sensor size you have.
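The rule above is a one-line division, so a sketch is almost overkill, but here it is (function name is mine, not Merklinger's):

```python
def required_f_number(focal_length_mm: float, smallest_object_mm: float) -> float:
    """f-number whose entrance pupil diameter equals the smallest
    object you want barely resolved, with the lens focused at infinity."""
    return focal_length_mm / smallest_object_mm

# 50 mm lens, 5 mm blade of grass -> f/10
print(required_f_number(50, 5))  # 10.0
```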
This also works if you focus closer; the 5 mm blade of grass will simply be sharper the nearer it is to the focus distance. You can calculate that too, because the blur disk shrinks in proportion to the distance from the plane of focus: with the same 5 mm pupil, a 2.5 mm object is resolved halfway to the subject, and a 1 mm object at 4/5 of the way. The same thing happens in reverse behind the point of focus, so 1/5 of the focus distance beyond it also resolves 1 mm, twice the focus distance again resolves 5 mm, and the blur grows proportionally from there out to infinity.
Unfortunately, diffraction effects aren't easily incorporated into depth-of-field equations. But diffraction blur is the same width on the sensor for all cameras and lenses at the same f-number, which means the f-number you can use with equal diffraction (relative to the frame) is proportional to the sensor width. Simply determine which f-number on one lens and one camera is too blurry for you, and from that one observation you can extrapolate to all of your gear: if f/16 looks a bit too ugly on your full-frame camera, then f/8 will be too ugly on your Micro Four Thirds camera, since that sensor is half as wide. However, diffraction blur responds well to sharpening, so you can usually push past the so-called diffraction limit if you are willing to pump up the sharpening.
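That extrapolation is just a ratio of sensor widths. A sketch, using 36 mm for full frame and 18 mm for the "half the width" Micro Four Thirds approximation in the example (the real MFT width is about 17.3 mm):

```python
def equivalent_f_number(f_number: float, ref_width_mm: float, target_width_mm: float) -> float:
    """Scale a diffraction-limited f-number between sensor sizes.

    Diffraction blur on the sensor depends only on the f-number, so for
    equal blur relative to the frame, the usable f-number scales with
    sensor width.
    """
    return f_number * target_width_mm / ref_width_mm

# f/16 on full frame (36 mm wide) ~ f/8 on a sensor half as wide
print(equivalent_f_number(16, 36, 18))  # 8.0
```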
http://www.trenholm.org/hmmerk/TIAOOFe.pdf