What Matters Most?

What quality in a lens should you look for first?

Most people look at a lens in terms of its focal range and features first. That's because these things define the "function" of the lens. If you need a wide angle lens for your DX body, you don't shop for 300mm lenses ;~). In general, focal range and features tend to be the things that narrow your choices down to a few lenses. Once you're down to a handful of lenses, the optical qualities become the attributes that finalize your choice. And that's what we're looking at in this article: which optical qualities matter most?

First, let's get our players identified. In terms of optical quality we have perhaps six major things to compare potential lenses on:

  • chromatic aberration
  • linear distortion
  • vignetting
  • sharpness
  • bokeh
  • flare

Let's make this a little easier: vignetting, lateral chromatic aberration, and linear distortion can usually be corrected in post processing. More and more cameras and raw converters now recognize these things and correct for them. Within a few more years it will likely be the odd product that doesn't correct for them.

Lateral chromatic aberration (CA) can usually be corrected on recognition. In other words, software can simply look at edges, see whether it's present, and figure out what parameters to use to correct it. It shouldn't make a lot of difference how much chromatic aberration is present, either. Large amounts are almost as easy to correct with those algorithms as small. 

Indeed, most Nikon DSLR bodies starting with the D3/D300 automatically do lateral chromatic aberration corrections in JPEGs, and Nikon Capture NX-D can remove them, as well. DxO, Lightroom, and Photoshop can remove chromatic aberration, too. So unless you're using an oddball workflow that has no ability to do CA removal at some stage, you shouldn't be too worried about CA in a lens. Yes, if your workflow's CA removal is manual rather than automatic, it adds a step to your work, but it's not a problematic one.
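To make the "correct on recognition" idea concrete, here's a minimal sketch of what lateral CA removal amounts to: the red and blue channels behave as if they were rendered at very slightly different magnifications than green, so the fix is a tiny radial rescale of those channels. This is an illustration of the general technique, not any particular converter's code; the nearest-neighbor sampling and the scale factors are my simplifications (real software interpolates, and estimates the factors by analyzing edges).

```python
# Minimal sketch of lateral CA correction: re-register the red and blue
# channels against green by scaling them radially about the image center.
# Nearest-neighbor sampling keeps the sketch short; real converters
# interpolate, and estimate the scale factors from edge analysis.

def rescale_channel(channel, scale):
    """Magnify (scale > 1) or shrink (scale < 1) a 2D channel about its center."""
    h, w = len(channel), len(channel[0])
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Where in the source does this output pixel come from?
            sy = cy + (y - cy) / scale
            sx = cx + (x - cx) / scale
            iy, ix = int(round(sy)), int(round(sx))
            if 0 <= iy < h and 0 <= ix < w:
                out[y][x] = channel[iy][ix]
    return out

def correct_lateral_ca(red, green, blue, red_scale, blue_scale):
    """Green is the reference; red and blue get nudged back into registration."""
    return rescale_channel(red, red_scale), green, rescale_channel(blue, blue_scale)
```

The interesting point is that the whole correction reduces to two scale factors, so software can simply search for the values that minimize color fringing at detected edges. That's also why large amounts of lateral CA are nearly as easy to remove as small ones.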

Vignetting is much like CA: many modern Nikon DSLRs can "remove it" and we've got plenty of software that tackles it, as well. I put "remove it" in quotes here because right now most vignette removal is either "pick a broad setting from a list" or "brute force." The pattern of vignetting can differ subtly from lens to lens. Some lenses have a highly centered bright spot with a "hard" edge, others have very dispersed bright spots with a very gradual falloff into the corners. Some zoom lenses even change characteristics as you zoom.

Nikon's built-into-the-camera solution is crude: you select one of three levels of "correction." Adobe's software now gives you control over more factors, allowing you to better match the correction against the exact spread of the vignetting. But almost all vignetting solutions are still manual, and almost none are precise. Again, this introduces another workflow step, but again, it's not a terribly problematic one. Not-quite-corrected is better than not-corrected-at-all, and not-quite-corrected is often more than enough for most users.
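"Brute force" vignette correction is, at heart, a radial gain map: each pixel gets multiplied by a factor that grows with its distance from the image center. Here's a minimal sketch; the quadratic falloff model and the single `strength` knob are my assumptions for illustration (real profiles are measured per lens, focal length, and aperture), which is exactly why a one-size-fits-all correction rarely matches a particular lens's pattern.

```python
import math

# "Brute force" devignetting sketch: gain rises with distance from center.
# The quadratic falloff model and the single strength knob are assumptions
# for illustration; real correction profiles are measured per lens.

def devignette(image, strength):
    """image: 2D list of luminance values; strength: extra gain at the corners."""
    h, w = len(image), len(image[0])
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = math.hypot(cy, cx)
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            r = math.hypot(y - cy, x - cx) / r_max  # 0 at center, 1 at corners
            gain = 1.0 + strength * r * r           # simple quadratic model
            row.append(image[y][x] * gain)
        out.append(row)
    return out
```

Note that the gain multiplies everything in the corners, noise included, which is why heavy corrections can surface shadow noise there.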

Linear distortion is another rung down the ladder. We do have a few tools that allow us to correct it, but the best ones are third party tools that add steps to our post processing workflow. And you have to watch out for "corrections" that are regular when you've got a lens that's irregular (e.g. mustache or wave distortion). Still, most distortion is correctable, though not in most cameras (Panasonic does a lot of linear distortion correction in their cameras, Nikon not so much). The reason linear distortion falls below CA and vignetting on the list is that correcting distortion often changes your composition a bit. If you're fussy about where you framed your edges (I am), you don't like having that move when you apply distortion correction. So the more distortion in a lens, the more annoyed I get with it, even though I can usually correct it. Basically, you have to frame a bit loose if you're correcting barrel distortion.
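For the regular case, distortion correction usually means a radial polynomial remap of pixel coordinates. The sketch below shows the common one-term model; the coefficient values you'd feed it are purely illustrative. A single k1 term can only describe simple barrel or pincushion, which is why it fails on mustache/wave lenses (those need the higher-order terms).

```python
# Common radial distortion model, sketched. A single k1 term describes
# simple barrel or pincushion distortion; mustache/wave distortion needs
# the higher-order k2 (and beyond) terms, which is why a one-knob
# "correction" fails on irregular lenses. Coordinates are normalized so
# the image center is (0, 0); coefficient values are illustrative only.

def undistort_point(x, y, k1, k2=0.0):
    """Map a normalized image coordinate to its distortion-corrected position."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor
```

Since correcting barrel distortion pushes edge points outward, the corrected frame is no longer a clean rectangle and has to be cropped back, which is the mechanical reason for framing a bit loose.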

Flare isn't really correctable at present. A lot of variables come into play with flare, including the angle and intensity of the light, how many lens elements are involved, the lens design itself, the position of the aperture blades, and coatings on everything from the front lens element to the filter on the camera's sensor. A lot of wide angle lenses produce unusual flare patterns. For instance, the Sigma wide angles tend to produce a small chromatic streak. Several Tokina lenses I've tried lately produce ghosts of the aperture blades. Some Nikkor primes are notorious for creating a faint blue central circle at small apertures. Put another way, flare isn't easily corrected because it comes in so many different forms. The problem is that most of those forms tend to be distracting to the underlying image, so we WANT to remove them. 

Of course, if you don't mind doing some cloning and pixel changing, you can sometimes correct small flares fairly well and quickly, so it's sort of like linear distortion: annoying, but possibly dealt with.

Bokeh requires clearly out-of-focus highlights before it can really be evaluated. Thus, we tend not to talk about the bokeh of most consumer lenses because their mid-range apertures just don't isolate depth of field enough for bokeh to become a critical element in the image. Indeed, almost by definition such lenses have poor bokeh simply because they never produce the level of softness bokeh enthusiasts need in the first place. I'd say that bokeh tends to be one of those "all else equal, pick the lens with better bokeh" factors.

Which brings us to sharpness. Lack of sharpness can't be easily corrected. Yes, if you know the exact pattern to the acuity of a lens, you could possibly correct for the weaker parts, as the repairs on the Hubble showed (though that was a mirror they were correcting for, not a lens element). But the math here gets complex and we simply don't have any good tools for completely fixing lack of sharpness after the fact at present. So sharpness is mostly uncorrectable. Yes, we can use sharpening techniques to try to compensate for lack of sharpness, but that's not the same thing. You end up distorting contrast at edges, and "sharpened" images look a bit different than images taken by sharp lenses.

So you're probably guessing already how I would select a lens:

  1. Sharpness first. We can't really fix this parameter perfectly in post processing, so it needs to be at the top of our list. Indeed, looking just at three lenses I was reviewing while writing this article (35mm f/1.8G DX, 35mm f/2D, and Sigma 30mm f/1.4) the 35mm f/1.8G DX wins on this attribute alone. It may not produce the highest acuity numbers that any of these lenses managed to achieve at some specific aperture, but it had high MTF values in both center and corner pretty much from the widest aperture down until diffraction started to take over. The Sigma had a few center values that showed higher in the test at mid-range apertures, but at f/1.4 it is soft, as it is in the corners. Sharpness trumps softness in lens selection.
  2. Vignetting. You might be surprised to find this parameter second on my list after my descriptions, above, but there's a reason. Consider that Sigma f/1.4 I just mentioned. It's decidedly soft at f/1.4. It also has very high levels of vignetting. Indeed, enough so that a histogram for an f/1.4 exposure is hard to distinguish from one for f/2. You're not getting a lot of bark with your dog: the center is soft, the edges softer, and the edges are decidedly dark at f/1.4. Yet the f/1.8 Nikkor is sharp in the center, sharp in the corner, and the edges don't darken as much. You're buying the f/1.4 for the extra two-thirds of a stop, but you're only getting a brighter, softer center and a darker, softer corner for your money. On top of all this, as I noted vignetting is sometimes tricky to remove completely. You can easily get rid of much of it in Lightroom, but getting edge-to-edge all at the same value can't usually be done by the in-camera correction, and takes time and energy to get close to right with post processing solutions.
  3. Flare. If you use filters, consider this a double warning: flare is problematic because it will require either avoidance of certain situations or full and time-consuming manual correction after the fact. Some people shoot mostly in conditions where flare won't be an issue, so this parameter falls lower for them. Some people love to shoot into the sun, and this parameter might go higher in the list for them. For me it's one of those "I run into this problem sometimes" things and thus falls sort of mid-range in my selection criteria.
  4. Linear Distortion. A candidate for higher on the list, you'd think, but it falls to #4 because it's easier to correct perfectly than vignetting or flare are—it just takes some post processing capability and time. And it can be fully corrected, where flare is tough to remove completely and vignetting has subtle patterns that don't always get corrected fully by current techniques. I generally don't get too worried about distortion in a lens unless I'm comparing two lenses with identical features and near identical performance on #1, #2, and #3. In that case, I'll take the lens with the lesser distortion, because with some scenes that might allow me to not bother correcting for it.
  5. Lateral chromatic aberration. A lot of people get upset by lateral chromatic aberration, but I'm not sure why. We've got plenty of tools to correct for it these days, and it's easily removed. Nikon JPEG shooters get this correction automatically in the camera. Even when present, we tend to measure chromatic aberration in most modern lenses in terms of "pixel spread," and in my measurements of lenses that produce the aberration, that number is typically right around one. Put another way, an edge may be polluted with lateral color smear in only one pixel each side of the edge (at 12mp; higher resolution cameras would increase this number). There certainly are lenses that produce a greater smear, and I try to identify them, but the Nikkors I've tested recently have all fallen in a pretty narrow range that's relatively minimal. Easily corrected; and even if not corrected, buried in the pixels for most common print sizes. The things listed above will be seen by your viewer before chromatic aberration is.
  6. Bokeh. Frankly, if you're able to assess differences in bokeh, I suspect that you don't need this list of things to check at all. But I will point out that some of the above factors do play a part in bokeh, especially chromatic aberration, so if you've been selecting on those criteria, by the time we get to this attribute we're starting to split hairs. However, the one thing that can really ruin bokeh is non-symmetrical aperture blades. Few blades (e.g. 5 versus 9) I still find tolerable, as long as the aperture stays symmetrical. 7-blade rounded is better than 9-blade asymmetrical. In short, take a long hard look at the aperture opening stopped down one and two stops if bokeh is important to you.

I haven't written about longitudinal chromatic aberration here. We usually find it present at a level high enough to worry about only in the very fast primes (f/1.2, f/1.4, maybe f/1.8). Unfortunately, not only is there not a lot you can do to correct it in your images, you're also not going to find two f/1.4 lenses with similar specifications where one has it and the other doesn't. It's one of those "you'll have to live with it" issues most of the time.

Another aspect I haven't called out in the Big 6 but you should be aware of is "coloration." Different glasses used by the various makers tend to make for slight color changes. You'd think that automatic white balancing might help with removing that, but in practice, once you've got a slightly different spectral mix heading towards the sensor, you're going to get a slightly different spectral mix recorded in your shots. It's basically similar to using a hue shift. In my experience, each manufacturer is relatively consistent in their coloration. Going from warmest (yellow) to coldest (magenta) in balance: Sigma, Tamron, Nikon, Tokina. 

Personally, I like consistent color, so I tend to not mix Nikkors with third-party lenses in my kit if the Nikkor is close to the third party lens in other attributes I'm looking for. And Tamron and Nikon tend to be close enough that the difference isn't enough for me to get worried about. Still, this could be a consideration for someone who is shooting images that require absolute color matching (e.g. studio shooters taking product catalog shots). 


Further items based upon feedback from site readers:

  • "You have to evaluate whether or not a parameter even matters to you." Correct. Architectural shooters, for instance, need straight lines. Landscape shooters can often tolerate a fair amount of linear distortion before they need to correct. Both need sharp corners and prefer low vignetting. So some of you may actually find that instead of looking at all six things in my list you really only look at three. But with rare exceptions, my guess is that you'll still use the same order of importance.
  • "What happens when you use protective filters?" Basically all bets are off. Even a high quality filter is not going to perfectly match what's happening downstream in the optics. I've seen everything from small color issues, to increased flare, to lower contrast, to...well, everything. If you're buying a lens using the criteria listed above and then put a filter on it, you simply may be invalidating your decision. That's not to say that there aren't protective filters out there that you could use without seeing a degradation in optical qualities, but you shouldn't assume that you can slap any random filter on a lens and have the lens still perform the same. I wish that were true, but it isn't. As I state in my filter article: you shouldn't use a filter unless its function is necessary to what you want to achieve. For protective filters things change: you also need to test them to see that they don't degrade the optics of the lens you're trying to protect.
  • "At what point does sharpness stop mattering most?" I'd say never with only a small grin on my face. Thing is, if we had said it stopped mattering when we bought a D3, we might have had a different opinion when we upgraded to a D3x. Lenses outlast cameras, so the sharpness of something you buy today had better live up to what you need tomorrow. The higher the resolution of the camera, the more it will reveal the true sharpness pattern of the lens. Center to corner performance differences will become more evident.
  • "Do you need sharpness for portraits?" Yes, but portraits are tricky. You need sharpness for hair and eyes, but you don't want it for skin. I don't know of a lens that accomplishes this. But if you don't have sharp hair and eyes, I do know you can't really post process that back in, while you can take out the sharpness in skin.
  • "How does Unsharp Mask in post processing impact acuity?" We're running the output of a lens through a lot of things that take away acuity: antialiasing filter, Bayer demosaic, and even just analog-to-digital processing. Unsharp masks and other techniques don't put any of that back in. Instead, they play contrast games with the eye, fooling it into thinking it sees a stronger edge. Let's put it into a numbers game to see the difference. Let's just use an arbitrary scale of 1-10 for our tonal values. We're photographing a dark-to-light edge on a chart. So in a perfect world we'd get 22228888 as pixel values as we move across from side to side. Unsharp masks make that 22219888. Whereas before we had a clean one-pixel transition, we now have a transition that spans four pixels. It just isn't as clean, and it isn't acuity. It does look like a strong edge, though ;~)
  • "If correction of vignetting just moves the tonal values, couldn't this lead to noise and other artifacts?" Yes, it can, which is one of the reasons why vignetting is up higher on my list. If we have to correct a stop-and-a-half to move corner values up to where they should be, shadows in the corners can be problematic after the change.
  • "Doesn't DxO correct sharpness?" Correct is probably too strong a word here. Certainly if you know the blur pattern of a lens you can apply algorithms to compensate for that pattern. As I noted, that's what we did with the Hubble telescope: it can be done. But the more I think about it, the more I think post processing sharpness correction might not be exactly the same as having a "perfect" lens in the first place. It's not just "blur" that we're dealing with when we refer to sharpness, but contrast, as well. I'm willing to stand to be corrected on this, but I don't think DxO quite achieves the same thing as we would have had if the lens didn't have a sharpness defect in the first place.
  • "What about astrophotography?" If you're doing this type of photography you need a couple of other things to be well controlled and they should be very high on your consideration list: lack of coma is one. The old 58mm f/1.2 NOCT is probably the best lens made in that respect. You also need low field curvature, and of course, good sharpness. You're essentially trying to resolve a very tricky test chart, after all: you've got point sources of light next to black. On the flip side, there are things in Nikon DSLRs that get in the way of good astrophotography. Nikon places the black midpoint at zero in raw files (Canon gives you the whole bell curve for black), plus older Nikons do a destructive hot pixel filtering that interferes with point resolution.
  • "Would you say the same thing for a film user?" Good point. Probably not. Chromatic aberration would probably go higher on the list because we don't have any way to remove it from film (unless we scan it and move to the digital side).
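For the curious, the Unsharp Mask numbers game above is easy to reproduce. An unsharp mask computes original + amount × (original − blurred); with a simple 3-tap box blur and an amount of 0.5 (both choices mine, for illustration), the clean 22228888 edge picks up exactly the 1/9 halo described.

```python
# Reproducing the numbers game: an unsharp mask computes
# original + amount * (original - blurred). A 3-tap box blur and
# amount = 0.5 are my choices for illustration; with them, the clean
# 2 2 2 2 8 8 8 8 edge gains the undershoot/overshoot halo described.

def unsharp_mask_1d(pixels, amount):
    blurred = []
    for i in range(len(pixels)):
        left = pixels[max(i - 1, 0)]
        right = pixels[min(i + 1, len(pixels) - 1)]
        blurred.append((left + pixels[i] + right) / 3.0)
    return [round(p + amount * (p - b)) for p, b in zip(pixels, blurred)]

print(unsharp_mask_1d([2, 2, 2, 2, 8, 8, 8, 8], 0.5))  # [2, 2, 2, 1, 9, 8, 8, 8]
```

The 1 and the 9 are the contrast "game": the eye reads the darker/brighter halo as a crisper edge, but no real detail has been recovered.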

dslrbodies: all text and original images © 2024 Thom Hogan
portions Copyright 1999-2023 Thom Hogan
All Rights Reserved — the contents of this site, including but not limited to its text, illustrations, and concepts, 
may not be utilized, directly or indirectly, to inform, train, or improve any artificial intelligence program or system.