All About Nikon VR

Nikon's VR system explained

Even after being online for some time (almost 10 years), this article continues to generate responses and controversy. Because there are conflicting posts on other sites, people tend to disbelieve what I've written. All I say is: do so at your own risk. This article is based upon years of experience with Nikon's VR system, very close analysis of testing results, talks with many other professionals, and even information from Nikon insiders, including engineering staff. At this point, the only thing about Rule #2 that isn't fully understood is the "why." For that we'd need more access to the design engineers.

  1. Turn VR off unless it's actually needed.

Yes, this rule flies in the face of what almost everyone in the world seems to do and what Nikon implies with their advertising and marketing. The simple fact is that VR is a solution to a problem, and if you don't have that problem using VR can become a problem of its own.

To understand that, you have to understand how VR works. In the Nikon system, VR is essentially an element group in the lens that is moved to compensate for any detected camera motion. Because this element group is usually deep in the middle of the lens, typically near the aperture opening but not exactly at it, you have to think about what is happening to the optical path when VR is active. Are there times when that shift imparts a change to image quality other than pure stabilization? I believe there are, though the impact is visually subtle. Some of the mid-range distance bokeh of certain VR lenses appears to be impacted by VR being on. Put another way, the background in the scene is moving slightly differently than the focus point in the optical path. This results in what I call "busy bokeh," or bokeh that doesn't have that simple shape and regularity we expect out of the highest quality glass.

Most people using VR don't question the mechanics of the system. They simply believe it's some special form of magic. It's not. Physics is involved, not magic. And one of the physics issues is the sampling and movement frequencies. The sampling frequency of the motion detection mechanism determines what kind and how much movement can be removed. Care to guess what the sampling frequency might be? 1000Hz, according to Nikon. That sounds pretty good, doesn't it? Nope. 1000Hz means one motion sample every 1/1000 of a second. And while that sampling is of camera motion, it isn't completely uncorrelated with shutter speed. For example, at speeds above 1/250 the shutter curtains travel across the sensor exposing only a portion of the image at a time. Another aspect of the Nikon VR system is that it "re-centers" the moving element(s) just prior to the shutter opening. Simply put, a lot has to go right at very short shutter speeds for there not to be a small visual impact, especially with long lenses.
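
To put rough numbers on this, here's a quick back-of-envelope sketch. The 1000Hz figure is Nikon's published sampling rate; the arithmetic showing how few motion samples actually fit inside a fast exposure is my own illustration, not anything from Nikon:

```python
# Back-of-envelope: how many 1000Hz VR motion samples fit within an exposure.
# The 1000Hz figure is Nikon's stated sampling rate; this calculation is
# illustrative arithmetic only, not Nikon's actual correction algorithm.

SAMPLING_HZ = 1000  # motion samples per second (per Nikon)

def samples_per_exposure(shutter_denominator):
    """Number of motion samples taken while the shutter is open at 1/denominator sec."""
    exposure_time = 1.0 / shutter_denominator
    return SAMPLING_HZ * exposure_time

for denom in (60, 250, 500, 1000, 2000):
    print(f"1/{denom}s exposure: {samples_per_exposure(denom):.1f} samples")
```

At 1/60 the system has over 16 samples to work with during the exposure; at 1/2000 it has half a sample. That's one intuition for why very short shutter speeds and VR corrections can get out of sync.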

(After discussing the above with many engineers, both lens designers and otherwise, I’m led to believe that the resolution of motion detected by the VR system's gyros is probably more of an issue than the speed at which motion is sampled. The friction and inertia in the VR element gimbal itself are going to be relevant, too. In other words, the system might be able to detect a motion component at 500Hz, but it may not be able to correct for it perfectly.)

But that's not all: when you have VR turned on, your composition isn't going to be exactly what you framed. Yes, the viewfinder shows the VR impact, but Nikon's VR system re-centers the VR elements just prior to the shutter opening if they’ve moved much. This means that you can get slightly different framing than you saw. Note: this seems to have changed in Sport VR mode.

  2. VR should normally be off if your shutter speed is over 1/500.

Indeed, if you go down to the sidelines of a football game and check how all those photographers have their lenses set, you can tell the ones that are really pros: VR is usually off (unless they're on a portion of the stadium that is vibrating from fan action). Those pros have all encountered the same thing you will some day: if you have a fast enough shutter speed, sometimes the system is running a correction that's not fully in sync with the shutter speed. The results look a bit like a lens being run with the wrong AF Fine Tune: slightly off. Acuity is hurt a bit.

The interesting thing is that pros demanded VR (IS in the case of Canon) in the long lenses, then it turns out that they very rarely use it! I'd say that less than 10% of the shooting I do with my 400mm f/2.8 has VR turned on (and by the way, I hate the rotating VR switch on some of these lenses—it's so easy to not notice what position it is in). A word of advice: some of those previous generation non-VR exotics are relative bargains. Consider it the VR bubble. Some day people will stop paying such silly premiums for VR over non-VR. At least they should. I know of several photographers who blew US$3000+ making the switch from non-VR to VR versions of lenses. That's too much of a premium for the benefit, I think. Especially considering many of those photographers were using monopods or tripods with gimbal heads!

Anecdotal evidence continues to pile up about VR and high shutter speeds. In the hundreds of cases I've examined now the results are the same: the lens seems to have more acuity with VR off above 1/500. That's my own experience, as well. A small handful of people have presented me with evidence of the opposite (VR improves their results above 1/500). In most of those cases I've been able to determine that it's not VR itself that's helping remove camera motion, but that their handholding or tripod technique is such that they're not getting consistent autofocus without VR, but they are with it. My contention is that they'd see even more improvement by dealing with the handling and focus consistency issue and turning VR back off above 1/500.

However, as with virtually everything in photography, there's a caveat to the above. For instance, if you're sitting in a helicopter shooting at 1/1000, should you use VR? One of the things that Nikon just doesn't explain well enough is the concept of "moving camera" versus "camera on moving platform." If the source of motion is your holding of the camera, then what I wrote above about turning VR off above 1/500 is probably true. It’s absolutely true for semi-steady situations, such as shooting off a monopod or on a tripod with a gimbal head. However, if there's a platform underneath you causing vibrations (car, boat, train, plane, helicopter, etc.), things are a bit different. This is what Active VR versus Normal VR is all about, by the way: if you're vibrating due to an outside source, use Active VR; if you're the only source of camera movement and you're on solid ground, use Normal.

  3. If something is moving you, use Active. If it's just you moving the camera, use Normal.

The difference between Active and Normal has to do with the types of movements that are expected and which VR will attempt to correct. Platform vibrations tend to be frequent, persistent, and random in size and/or direction. Handholding motion tends to be slower and move in predictable paths (e.g., when you press the shutter release hard the right side of the camera moves downward—it's Newton's Law, not mine). Knowing which type of motion the VR needs to deal with lets the system optimize its response.

So, getting back to our 1/1000 example while shooting on a helicopter, we have a conflict. The motion that you impart by your handholding may not get corrected right by the VR system because your shutter speed is faster than the frequency with which corrections are well managed. But the platform you're sitting on is imparting small, frequent, and random motions that might actually be corrected (but probably not fully) by having VR on. The question here is whether the improvements due to removing some of the platform movement are better than the possible degradation due to the shutter closing faster than the VR is working. There's no clear answer to that, as every situation is going to be a little different, but my tendency is to experiment with Active VR being On versus VR being totally off when shooting from moving platforms at high shutter speeds. I closely examine my initial results, and make my final decision based upon that. Of course, that in and of itself can be a problem for some, as examining a small screen in a moving vehicle isn't exactly easy and precise. Still, I sometimes see an improvement with VR as opposed to without it when I'm shooting at high shutter speeds from a vehicle. At the same time, that's not as much improvement as you'd see using a dedicated gyroscope instead of VR. If you regularly shoot out of helicopters, a gyro is a better investment than a more expensive VR lens.
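
The decision logic of the last few paragraphs can be condensed into a short sketch. To be clear, this is just my reading of the advice above encoded as a function; the function name, parameters, and structure are my own illustration, not anything Nikon publishes:

```python
# A sketch of the VR mode decisions discussed above. The 1/500 threshold and
# the Normal/Active distinction come from the article's advice; the function
# itself is an illustrative summary, not a Nikon specification.

def suggested_vr_mode(shutter_denominator, on_moving_platform, handheld):
    """Return a suggested VR setting ('off', 'normal', or 'active').

    shutter_denominator: e.g. 500 means a shutter speed of 1/500s.
    on_moving_platform: True if a vehicle/boat/helicopter is vibrating you.
    handheld: True if you (not a stable tripod) are supporting the camera.
    """
    if on_moving_platform:
        # Platform vibration may be worth correcting even at high shutter
        # speeds -- but test your results, as discussed above.
        return "active"
    if shutter_denominator > 500:
        # Above 1/500, corrections can fall out of sync with the exposure.
        return "off"
    if handheld:
        return "normal"
    # Stable support, no platform vibration: VR off.
    return "off"

print(suggested_vr_mode(1000, on_moving_platform=False, handheld=True))  # off
print(suggested_vr_mode(250, on_moving_platform=False, handheld=True))   # normal
print(suggested_vr_mode(1000, on_moving_platform=True, handheld=True))   # active
```

The helicopter case is exactly where this simple sketch breaks down, of course: the "active" branch is a starting point for experimentation, not a guarantee.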

Aside: There are a number of photographers that say that using VR above 1/250 (or the flash sync speed, if slower) should be avoided. Some explain that shutter speeds above that are done by moving an opening across the image rather than having the full image exposed simultaneously (this is a simplification, but it's good enough for this discussion). Thus, VR corrections done on shutter speeds above 1/250 may be correcting only a portion of the image at a time. In practice, I believe I can sometimes see very small changes at 1/500 shutter speeds versus 1/250 when VR is On. Not enough change, however, for me to alter my 1/500 limit. Above 1/500 I can much more clearly see visual changes, and changes that I don't like in my images (e.g. loss of acuity).

At the other end of the movement spectrum, we have subject motion. If the subject is moving, using VR with longer shutter speeds can be problematic. I've seen people use 1/15 with VR on for moving subjects. Well, even a slow-walking human has enough movement in 1/15 to cause edge blur.

  4. If your subject is moving, you still need a shutter speed that will stop that movement.

This is a tough thing to learn, and it's usually learned the hard way. Because camera makers essentially tout VR by making assertions like "allows a four-stop improvement over hand holding," users start thinking like this: "if I can handhold my 100mm lens at 1/100, then VR would allow me to hand hold it at 1/6." Well, maybe. But the only motion being removed by the system is camera motion. If your subject moves during that 1/6, it's still going to produce subject blur. Looking back at my Nikon Field Guide (page 51 for those of you following along), we get 1/125 for the minimum shutter speed necessary to freeze a person walking across the frame (1/30 if they're walking towards you). This is, of course, a generalization. There's a more detailed table below the one I just referenced that shows how distance impacts the shutter speed, too. Plus the size of the subject in the overall frame makes a difference. Expecting VR to remove all motion including subject motion is something everyone has to get over:
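
Both halves of that "well, maybe" are easy to check with arithmetic. The four-stop figure is the camera makers' claim; the walking speed (~1.4 m/s) is a common illustrative number I've assumed here, not something from Nikon:

```python
# Two quick calculations behind the paragraph above. The four-stop benefit is
# the camera makers' marketing claim; the ~1.4 m/s walking speed is my own
# illustrative assumption.

def vr_assisted_shutter(base_denominator, stops):
    """Shutter denominator after 'stops' stops of claimed VR benefit.
    Each stop doubles the allowable exposure time (halves the denominator)."""
    return base_denominator / (2 ** stops)

# 1/100 with four stops of VR -> 100 / 16 = 6.25, i.e. roughly 1/6.
print(f"1/{vr_assisted_shutter(100, 4):.2f}")

def subject_travel_mm(speed_m_per_s, shutter_denominator):
    """How far the subject moves during the exposure, in millimeters."""
    return speed_m_per_s * 1000.0 / shutter_denominator

# A walker at ~1.4 m/s covers roughly 93mm during a 1/6s... no, a 1/15s
# exposure -- easily enough physical movement to register as subject blur.
print(f"{subject_travel_mm(1.4, 15):.0f} mm")
```

VR can't do anything about that second number, which is the whole point of the rule.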

  5. VR doesn't remove all motion, it only removes camera motion.

Another type of motion comes with panning the camera, and VR has impacts there, too. I've seen people say that they think you should turn VR off when you pan with a subject. There may be times when that's true, but my experience is that VR should be on while panning, and that’s especially true of lenses that have the new Sport mode (the lens should be in Sport mode during the pan). That's because the Nikon VR system is very good about detecting a constant camera movement. If you're doing a smooth pan in one direction, the VR system will focus on removing only motion on the opposite axis. That's the way it's designed to operate. The trick is to make sure that your pan is relatively smooth, and not jerky. Most people start to jerk when they press the shutter release during pans. You need to practice NOT doing that and to continue the pan while the shutter is open, not stopping. 

Indeed, try practicing this at your local track (or other place with some runners present). Pan with a runner and take a picture. When the mirror returns and the viewfinder view is restored after the shot, is the runner still in the same spot in the frame? No? Then you didn't continue panning through the shot. Tsk tsk. Try again. Practice until you can take a series of shots and the runner stays in the same spot through the entire sequence, both in the shots and while you're panning between shots. You shouldn't be having to catch up to the runner.

Aside: Back in high school my photography mentor at the time broke me of the habit of stopping during pans in a brutally sadistic way: he sent me to track meets with a TLR (twin lens reflex). You look down into the viewfinder of a TLR. But here's the thing: left to right is reversed. So if the subject is moving right to left in front of you, they appear left to right in the viewfinder. You don't have a chance of following motion with a TLR unless you can relax your brain and make it just mimic the motion of your subject in your own body's motion. You can't look and react, look and react.

  6. If you're panning correctly, VR should probably be On.

Yet another aspect of VR that confuses people is activation. Nikon's manuals don't make this very clear, but it really is quite simple with older Nikon DSLRs: only the shutter release activates the VR system (D4/D800 and all later cameras are exceptions: Nikon changed VR so that it is also activated by the AF On button). 

A partial press of the shutter release always engages VR and allows it to begin a sequence of corrections. A quick full press of the shutter release also engages VR but it really doesn't get much in the way of data samples to rely upon and make predictions from (on the pro bodies we're talking 1/33 of a second or so between the press and the shot being taken). Basically, if you engage VR prior to the shot, you tend to get slightly better and more consistent results. That doesn't mean you should always wait for VR to engage before fully pressing the shutter release. If it's time to take the picture, take the picture! VR will give you its best shot at fixing your camera motion when you just punch the shutter release. But there are two factors that tend to make early VR engagement a better choice if you can do it: first, the VR system gets a stream of data it can predict from; and second, it's difficult to move the camera as much by jabbing the release if you've already partially pressed the release!

The usual issue that comes up with the last paragraph is the line in many of Nikon's manuals about "VR doesn't function when the AF-ON button is pressed." This is one of those places where the translation in Nikon's manuals is out and out misleading. If the line had read "VR doesn't engage when the AF-ON button is pressed" it would be more correct, at least for older Nikon DSLRs. In my testing VR is not "turned off" by using the AF-ON button on any Nikon body; it simply isn't engaged by that button press on pre-D4/D800 bodies. Only the shutter release button engages VR on earlier cameras such as the D3 and D300. Thus, if you use AF-ON to focus instead of a partial shutter release, VR is not engaged during the pre-shot focusing on those cameras. But it is during the shot. (Again, D4/D800 and later cameras changed this so that AF-ON does activate VR. See note at very bottom of article.)

This, of course, creates a slight issue. Optimally, we want VR to have a stream of data just prior to pressing the shutter release fully, and we want the viewfinder image stabilized so that we’re framing as well as possible. If we're using AF-ON to focus, our fingers usually aren't pushing the shutter release partially down, too. But you should practice doing just that if you own an older Nikon DSLR. Sigh. That right hand is starting to do a pretty complicated dance: AF-ON up and down for focus, shutter release partially down for VR, right thumb dialing in shutter or aperture or exposure adjustments, maybe right middle finger dialing in aperture adjustments, shutter release fully down with the index finger for the shot. This, by the way, is one of the reasons why I prefer Nikon's ergonomics to Canon's: at least when I'm doing all that hand juggling, my hand and finger positions aren't really moving, especially my shutter release finger. With Canon the tendency is to move the index finger between the top control wheel and shutter release. You can react with the shutter release faster if you're not moving that finger.

There are a few more caveats. If you've got a built-in flash on your camera (basically everything but the D1, D2, D3, and D4 series), while the flash is recharging the VR system is inactive. That's because VR takes power to perform and the assumption is that you want the flash recharged as fast as possible. Thus, the camera turns off the power to the VR system while it's charging up the camera's flash capacitor. If you're shooting flash near full power and doing a lot of consecutive flashes, the flash recharge time can start taking a few seconds. How do you know if power is restored to the VR system? Well, you can't, exactly, but the flash indicator in the viewfinder is a fairly reliable indicator: if it's not present with the flash up and active, VR is probably Off.

  7. If you rely upon VR and use flash, use an external flash instead of the internal one if you can.

I've been holding off on the tripod issue to the end of this article, partly because it's not as clear cut as Nikon seems to think it is. But by now you've probably turned VR off, anyway ;~). Part of the problem is that Nikon hasn't clearly labeled and described their various VR system iterations. Technically, the VR II system on some of the modern lenses should detect when the camera is on a stable platform and not try to jump in and correct anything. But not all modern lenses have what most of us regard as the full VR II. The recently introduced 16-35mm, for example, comes long after the intro of VR II, but it does not appear to have tripod recognition. Thus, we have another rule before we get to the real rule:

  8. You MUST read your lens manual and see what it says about use on tripods.

Two basic possibilities exist:

  • The manual says turn VR off when on a tripod (sometimes adding "unless the head is unsecured")
  • The manual specifically says that the VR system detects when the camera is on a tripod

Okay, I lied. Forget what the manual says.

Rule #8 For Real: If your camera is on a tripod, even if you're using something like a Wimberley head where it is almost always a bit loose, turn VR off. If your tripod is on a moving platform or one that has vibrations in it, strongly consider turning VR on, but test it to be sure you need it and you aren’t adding VR impacts to edge acuity.

So why do I disagree with Nikon? Even with a loose head on a tripod, motion should be fairly easy to control, and you should have removed one possible motion almost completely (ditto with monopods). The problem I have, and which many other pros have noticed, is that the VR tripod detection system sometimes produces "false negatives." The tripod detection mode of the VR II system should be detecting when the system is "quiet enough" to turn off corrections. Most of the time it does just that (Nikon says that the system is smart enough to detect as many as three different types of motion—handholding, platform vibration, and support system movement—because the "vibrations" caused by each of these are recognizably different in wave form). Every now and then, though, VR thinks it needs to correct when it doesn't (or perhaps is still correcting for a previously detected motion that will no longer be present in the next sampling). When that happens, the VR element(s) are moving when they shouldn't be. Usually not a lot, but enough to make for less than optimal results. Result: loss of edge acuity, possibly busier bokeh.

Indeed, this is the very same problem as with using VR over 1/500: sometimes it works fine, sometimes it doesn’t and causes edge acuity loss. The problem is that you won't like it when it doesn’t work right, and you won't know when it does until you examine your images closely in post processing. If I were to tell you that out of 100 shots you take 10 were going to be “poor" due to the VR doing the wrong thing, would you still use VR? Remember, when you're on a tripod, all 100 shots should be good without VR (otherwise you have the wrong tripod and head, see this article, or you're using poor technique). I'm not a gambler: I prefer the known to the unknown, so I don't like having random shots slightly spoiled by VR.

Which brings up a whole different topic: what does a spoiled-by-VR shot look like? Well, "spoiled" is perhaps too harsh a term. Sub-optimal is probably a better one. An optimal shot has very clean and well defined edge acuity. Assuming a "perfect lens," edges should be recorded basically as good as the anti-aliasing filter, sensor, and Bayer demosaic allow. What a lot of us find when VR is not quite correcting as well as it can/should is that edges get a little bit of "growth" to them, and sometimes there's a directionality to that growth. It's sort of like camera movement, only much more subtle. I tend to say that the detail "looks busy" when VR isn't fully doing its job or is on when it shouldn't be. And when you apply sharpening to busy edges, that busy-ness gets busier. Without VR active at all while on a stable tripod, it's like a veil gets lifted and you suddenly see how sharp your lens really is (assuming you correctly obtained focus on your subject and had a stable platform, that is ;~).

Yes, there's some nitpicking going on here. VR not correcting right is a bit like tripod mount slop (fixed with a Really Right Stuff Long Lens Support) or ringing vibrations in the tripod legs (fixed by using the right legs for your equipment): you don't see it until it's gone, and even then usually only if you're pixel peeping. But someone using a 400mm f/2.8G VR lens on a D810 spent a lot of money on equipment to get the best results. They expect to be able to catch every bit of detail and blow it up into a large print. As always on this site, you need to understand that I always write about the search for optimal bits. If you're shooting with a 16-85mm on a D300 and putting 640x480 images on the Web from that, well, whether the VR missed doing its job by a little bit probably isn't so important.

  9. Turn VR off and turn the camera off before removing the lens from the camera.

Nikon’s manuals warn you to turn the camera off before removing a VR lens. If the VR system was “active” in any way when you remove the lens, you’ll find that the lens spasms after it has been removed from the camera. This isn’t good for the VR system. So always turn the camera off before removing a lens.

More recently, there’s been speculation that turning VR off helps protect the VR system from harm during transport. While we have no evidence of that being true, it doesn’t hurt to turn VR off on the lens, then turn the camera off, then remove the lens. That’s best practices, and you should follow them.

If there are more questions on VR I'll address them in the Discussion at the bottom of this page. Until then, here's your motto: VR stays off unless I specifically need it. VSOUISNI and prosper.


  • "Which lenses are VR I and which VR II, and what's the difference?" The difference is vague, as Nikon hasn't really released enough information to say much more than VR I claimed to give a three-stop advantage while VR II claims a four-stop advantage. Yes, in practice, the newer VR seems to do a slightly better job, but it's unclear as to why it does a better job. Lenses that are still VR I include: 18-55mm, 18-105mm, 24-120mm, 55-200mm, 80-400mm, and the 200mm f/2. Lenses that are known to be “VR II” include 16-35mm f/4, 16-80mm DX, 16-85mm DX, 18-200mm DX II, 70-200mm II, 70-200mm f/4, 200-400mm II, 200-500mm f/5.6, 300mm II, 300mm f4E, 400mm II and E, 500mm II and E, 600mm II and E, 800mm E.
  • "Does VR make a lens more likely to fail and need repair?" Possibly. I've had one VR failure that needed repair and I know of others who've had similar failures. Still, it's rare that a lens has a mechanical failure, though adding the complexity of the VR mechanics certainly must increase the likelihood of encountering a problem.
  • "My VR is on occasion very jumpy." Check your camera's battery level when that happens. I'll bet that it is low. When you run batteries way down and activate VR it appears that the VR system can sometimes demand more power than the camera can supply instantaneously. The result is "jumpy VR" as the VR circuitry cuts in and out. I consider it just another "low battery" warning ;~). But see the "jumps after a shot" comment, below.
  • "Don't you get some effect from VR even if your shutter speed is above 1/500? After all, the VR elements are probably moving between samples." Yes, sometimes you get a VR-like effect above 1/500, and it's probably because the elements are in near constant motion and the designers have picked a movement frequency and smoothing curve that takes advantage of the known sampling frequency of the motion detected. But the problem with using VR above 1/500 is that you will get clear image degradation often enough that you'll get burned by it. And I believe you get burned by it more often than you'd get burned by having VR off. Moreover, I don't know of a working sports or wildlife pro using the long lenses that hasn't discovered the same thing by practice: VR tends to degrade edge acuity above 1/500.
  • "Does VR stabilize the autofocus system?" Yes. And this can be important in a few instances. It's one of the reasons why I argued that not putting VR into the original 24-70mm lens was one of Nikon's bigger mistakes in the first digital decade. If you're moving the camera enough that the autofocus sensor(s) you're using isn't staying stable on the point you want focused, there's a chance focus will shift to someplace you don't want it. Your Lock-On and other autofocus settings interact here, so it's not a 100% certainty that VR will improve your autofocus results, but it does just enough that I find it useful to have the option. At wide angles, the AF sensors can easily get distracted by backgrounds. Nikon vaguely warns about this in their manuals (fifth example, D700 manual page 80). So if you're moving the camera enough that the background is getting onto that autofocus sensor with regularity, that can be a problem, and VR might help.
  • "The viewfinder jumps after a shot." This is normal. Note that the Nikon VR system operates differently for pre-release focusing: the viewfinder image is stabilized, which means the VR elements may eventually move off center to provide a stable view (also impacts focusing, see question just above). But during the exposure the system does a few different things. First, it re-centers the VR elements. Second, it uses a different algorithm for doing its correcting. It very well may be the re-centering action that causes some of the above 1/500 issues, by the way.
  • "What about monopods or beanbags?" Nikon tends to recommend having VR on with monopods in most of their manuals. Personally, I think this really gets down to a handling issue, though. One of the primary camera motions that VR is often correcting is the "shutter release stab," which tends to impart a forward or backwards tilt in the camera. Proper use of a monopod tends to (mostly) remove that component, leaving side-to-side as the primary camera movement needing correction. So it starts to depend upon what's causing that side-to-side motion. Following action that moves in one direction? That's panning (see above). Following action that moves back and forth? Be careful of the shutter speed. At low shutter speeds (which would need VR) subject motion is going to be your biggest issue. At high shutter speeds, you're turning VR off anyway. This gets back to my "VR should be off unless needed" rule: there's actually a very narrow window of shutter speeds that make sense to have VR on with moving subjects, perhaps as narrow as 1/125 to 1/500. Subject isn't moving but you can't hold the camera still on the monopod? This would be a case where VR probably should be on.
  • "Do sensor-based stabilization systems have the same rules?" This article is specifically about Nikon VR. It also seems to apply fairly well to Canon IS, which is a very similar system. I can't say that the sensor-based systems do or don't act the same. I have a suspicion that they do, which means that burying the on/off for it in menus is the wrong approach for optimal results. That's because it encourages people to just leave it on all the time. Nevertheless, I don't know enough and haven't tested enough to know for sure. [A lot of us using the Olympus system of stabilization on their m4/3 sensors have noticed anomalies at times. The early versions had clear issues with shutter speeds in the 1/50 to 1/100 range under certain circumstances. This seems to have been improved with more recent cameras, but Olympus eventually created another “mode” to deal with initial shutter shock in relationship to the IS. In using Olympus IS on long lenses with wildlife, I've discovered the same pattern of sometimes getting shots that aren't up to snuff when they should be (short shutter speed and IS turned on). It's unclear what is causing this, and when it happens it's more pronounced than what I describe in the Nikon VR system, above. This seems especially true with the Panasonic 100-300mm on the Olympus bodies, so it could be a slight engineering difference.]
  • “VR and focus interact at macro distances.” Nikon themselves suggests to turn VR off when using the 105mm Micro-Nikkor for macro work. I’d tend to include “turn off AF-S”, too. Unless you have an absolutely rock solid camera mount (in which case you don’t need VR) and a steady subject distance, at macro distances you tend to get systems fighting one another slightly and both making constant adjustments.

Something that came with many of the email questions I’ve gotten from this article is more basic than the specific questions. It appears most people just want to be told "use it for X, don't use it for Y." While I can broadly suggest probable use patterns, VR is just another one of those decisions that photographers have to make when evaluating each situation they encounter. Taking shortcuts with decisions ultimately leads to less-than-optimal results. For casual shooting, shortcuts perhaps work just fine for most people, and I've suggested a bunch in this article. But for serious shooting where quality matters, a good photographer is always evaluating, always testing. In some ways, digital is great for that, as we have an immediate feedback loop and can test a setting assumption almost immediately, plus we have the ultimate loupe in our large computer monitors.

Thus, one other point I'll make is that I can't tell you every possible time you need to use VR and every possible time you shouldn't. What I do know is that when VR has been on when it shouldn't be, my images suffer. And yes, when I shoot without VR on when it should be, my images suffer, too. However, generally I know when I'm imparting substantive motion to the camera during shooting. Thus, VR is off unless I know that I'm imparting motion, and then I only turn it on if I can guess—and verify with a field test—that it will remove that motion.

One thing I've noticed is that those of us who shot with long lenses back in the film days prior to VR aren't quite so fast to turn it on as someone picking up a camera today. Part of that is the marketing message ("up to four stops better!"), but the real reason why the old-timers tend to use VR only on occasion and mostly correctly is that we already had to figure out when we were imparting camera movement prior to VR being available. We either had to correct the underlying problem or not shoot. Thus, we tend to know when we're on the margin where VR might be helpful. I'd argue that leaving VR on and turning it off only when you see a degradation (which may be too late if you're seeing it when you get home and looking at images on your monitor) isn't easily learned. Leaving VR off and turning it on only when you see a degradation is much more easily learned.

Sport Mode
Yeah, what’s that all about? Where did Active Mode go? 

Active Mode has to do with vibrations that are human caused (e.g., poor handholding) versus platform caused (e.g., a vibrating, moving vehicle). Basically, if you're moving, you should probably be in Active Mode. If it's just the camera that's moving due to your handling, you should be in Normal Mode.

However, Active Mode disappeared on a few lenses, and something called Sport Mode appeared. Sport Mode is very different.

First, lenses that have Sport Mode:

  • 200-500mm f/5.6E
  • 300mm f/4E
  • 400mm f/2.8E
  • 500mm f/4E
  • 600mm f/4E

Sport Mode tries to keep the viewfinder stable when shooting moving subjects. Simple as that. Normal Mode on these lenses tries to hold the image absolutely in place, which can make for a jerkier viewfinder with subjects in motion: as you start to pan or tilt, the VR system first tries to correct for the movement before realizing that you're actually panning or tilting.

Nikon’s advice: Normal for stationary subjects, Sport for moving subjects.

VR Image Flare

Some of Nikon’s lenses use positioning mechanisms that rely upon a small, low-power LED to establish position, and that LED isn't well shielded from the optical path. These lenses are known to produce visual artifacts—Nikon calls it fogging—if you’re shooting at really high ISO values or with really long exposures. It doesn’t matter if VR is on or off on these lenses, the positioning mechanism can still mar the image data. The lenses that do this are:

  • 16-35mm f/4G
  • 16-85mm f/3.5-5.6G DX
  • 18-200mm f/3.5-5.6G DX (both versions)
  • 24-120mm (both versions, f/3.5-5.6G and f/4G)
  • 28-300mm f/3.5-5.6G
  • 55-300mm f/4.5-5.6G DX
  • 70-200mm f/2.8G (older version)
  • 70-300mm f/4.5-5.6G (all versions)
  • 85mm f/3.5G Micro-Nikkor DX
  • 200mm f/2G
  • 300mm f/2.8G
  • 400mm f/2.8G
  • 500mm f/4G

Other Notes

Nikon eventually got around to documenting why they believe a lens-based VR system is better than sensor-based stabilization. They made four points:

  1. The finder image is corrected. This is absolutely true for pure optical-only viewfinder systems, though most of the cameras these days that have sensor-based stabilization use either EVF or the color LCD for composing. Over time, this advantage is disappearing.
  2. VR can be tuned to the lens. This is true in one sense, in that the VR system can be tuned to the optical design to get maximum correction. In a practical sense, though, it doesn’t show up as a big deal in use. Still, there is a theoretical advantage that should grow with longer focal lengths. Indeed, Olympus added IS to their longer lenses partly for this reason, even though their cameras already had sensor-based IS.
  3. The AF system is getting stabilized information. This is like the first point and thus the truth is basically the same: absolutely true for pure optical-only viewfinder systems, but not always true for contrast-based EVF type systems, though the advantage will probably erode over time.
  4. Blur patterns are not the same with all lenses. This is related to the second point, again noting that VR is theoretically better corrected by knowing the optical properties. As with the second point, in practical use we’re not seeing much real difference, except with long focal length lenses.

A more truthful and useful discussion of differences would include these points, though:

  • Sensor-based correction increases heat generation at the sensor, which potentially means more noise. It also adds a mechanical part to the most expensive component of the digital camera system (the sensor).
  • In-lens systems add cost and some bulk to a lens. Sensor-based systems add some bulk at the sensor, which is a factor in how small you can make a camera using it.
  • In-lens systems tend to have easy switches to turn them on and off, while sensor-based systems tend to use menu items to turn the function on and off. 
  • Sensor-based systems generally have warnings against the user manually cleaning the sensor, as they might damage the system.

Nikon has now published the results of the standard CIPA test for all current Nikon VR lenses. This standardized test tries to give you a sense of how well the stabilization systems work in identical circumstances. Here are the current numbers:

  • VR 2 stops better than no VR: 80-400mm f/4.5-5.6D (old)
  • VR 2.5 stops better than no VR: 16-35mm f/4G, 70-300mm f/4.5-5.6G, 600mm f/4G
  • VR 3 stops better than no VR: 18-55mm f/3.5-5.6G DX, 55-200mm f/4-5.6G DX, 55-300mm f/4.5-5.6G DX, 85mm f/3.5G Micro-Nikkor DX, 105mm f/2.8 Micro-Nikkor, 200mm f/2G II, 200-400mm f/4G, 300mm f/2.8G II, 400mm f/2.8G, 500mm f/4G
  • VR 3.5 stops better than no VR: 16-85mm f/3.5-5.6G DX, 18-105mm f/3.5-5.6G DX, 18-200mm f/3.5-5.6G DX, 18-300mm f/3.5-5.6G DX, 24-120mm f/4G, 28-300mm f/3.5-5.6G, 70-200mm f/2.8G II
  • VR 4 stops better than no VR: 16-80mm f/2.8-4E DX, 18-140mm f/3.5-5.6G DX, 24-70mm f/2.8E, 24-85mm f/3.5-4.5G, 70-200mm f/4G, 70-300mm f/4.5-6.3G DX, 80-400mm f/4.5-5.6G (new), 300mm f/4E, 400mm f/2.8E, 500mm f/4E, 600mm f/4E, 800mm f/5.6E with 1.25x converter
  • VR 4.5 stops better than no VR: 200-500mm f/5.6E, 800mm f/5.6E
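What these CIPA numbers mean in practice is simple arithmetic: each stop of stabilization doubles the slowest shutter speed you can expect to hand-hold. A short sketch, using the classic 1/focal-length rule of thumb as the no-VR baseline (that baseline is my assumption here, not part of the CIPA test procedure):

```python
# Rough sketch: estimate the slowest handholdable shutter speed with VR.
# Baseline assumption: the old 1/focal-length rule of thumb for a no-VR
# full-frame camera; each stop of VR doubles the usable exposure time.

def slowest_handholdable(focal_length_mm: float, vr_stops: float) -> float:
    """Return the approximate slowest usable shutter speed, in seconds."""
    baseline = 1.0 / focal_length_mm       # no-VR rule of thumb
    return baseline * (2 ** vr_stops)      # each stop doubles the time

# Example: the 200-500mm f/5.6E at 500mm, rated 4.5 stops by CIPA.
t = slowest_handholdable(500, 4.5)
print(f"about 1/{round(1 / t)} sec")       # roughly 1/22 second
```

In other words, a 4.5-stop rating nominally takes you from 1/500 down to somewhere around 1/22 at 500mm. Treat that as a ceiling measured under lab conditions, not a promise for your technique.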

Final note: Some of Nikon's manuals still say that the AF-ON button does not activate VR. This is no longer true on the D7100, D7200, D500, D600, D610, D750, D800/D800E, D810, D4, D4s, and D5, but even on the older cameras most people misconstrue what Nikon meant (see above). What Nikon meant is that VR does not start unless you've partially pressed (or fully pressed) the shutter release. In other words, you could focus using AF-ON and VR wouldn't be active during focusing. But when you went to take the shot, pressing the shutter release would activate VR at the last possible moment. Enough of us complained that VR actually helps during focus that Nikon made the change to the current high-end cameras that activates VR even when you're only pressing the AF-ON button. Prior to that, we had to do the two-button shuffle: half press the shutter release to get VR working, then press the AF-ON button to get a stabilized focus action.

text and images © 2017 Thom Hogan
portions Copyright 1999-2016 Thom Hogan. All Rights Reserved
Follow us on Twitter: @bythom, hashtags #bythom, #dslrbodies