I've long held to a simple design philosophy: customers are terrible product designers.
If you ask customers what to change on or add to a product, you'll get answers. Those answers, though, may not make for a better product.
Right now I'm seeing a lot of this kind of request from users:
- Give me more dynamic range
- Give me 4K/120P
- Give me 20 fps
Simple specification bumps like that are easy outs. If we've got 11 stops of dynamic range, give me 12. If we've got 45mp, give me 60mp. If we've got 4K/30P, give me 4K/60P and then 4K/120P. If we've got 12 fps, give me 20 fps.
Here's the thing: how many real world user problems would such changes actually solve? I'll bet that the answer is near zero.
In the automotive world we've had the same thing: more cylinders, more horsepower, more torque, more acceleration. Yet here in the US we travel longish distances in our vehicles, but rarely at speeds above 55 mph. Frankly, we'd be better served by smaller engines that are well tuned to the way we actually use our cars, uh, I mean SUVs.
The reason why the automated driving aspects that have come into play recently attract lots of attention is that they do solve real user problems. Blind spot warnings and collision avoidance braking, for instance, are easy to understand in that context. But automating the complex process of driving to make it safer and more enjoyable means you can pay more attention to what the kids are doing in the back seat.
What camera designers need to be asking is whether they're still solving real user problems. I'd argue that they mostly aren't. Sure, autofocus systems have gotten more sophisticated and capable, and that does further solve a user problem, particularly when that sophistication shows up in the all-automatic modes. But more pixels? No, not really.
Here's the real user problem facing camera users now: "I've taken a photo, now what?"
While I was checking out the new features of iOS 13 (and iPadOS), I finally took a closer look at Shortcuts, an app I had overlooked before (probably rightly so, as it wasn't quite ready for everything that could be done with it). Here's an idea for you: connect your iPhone to your camera via USB cable and say "Hey Siri, Message Photos." What? Say what?
Yeah, it works (after you've built the Shortcut). I'm still fiddling with it, though. Basically you create a Shortcut called Message Photos that consists of (1) Photos/Import; (2) Photos/Select Last Import; (3) Messages/Send To. And voila, all my photos go to my recipient (assuming that they can read Apple Messages). You could do this via email, as well. Yes, you can post to Facebook or Tweet using Shortcuts. You can even use variables like "Ask Each Time", which would allow you to put a comment/text with the image, though batch sending like I've suggested here starts to become a problem if you get too complex.
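If it helps to see the whole thing at a glance, here's the three-action recipe laid out as an outline (using the action names as I've described them above; Apple's exact labels in the Shortcuts app may differ slightly):

```
Shortcut: "Message Photos"
  1. Photos / Import               - pulls the images off the USB-connected camera
  2. Photos / Select Last Import   - narrows the selection to just that import
  3. Messages / Send To            - sends the batch to your chosen recipient
```

Swap step 3 for a Mail, Facebook, or Twitter action and you've got the other variants I mentioned.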
Here's the problem: SnapBridge doesn't support Shortcuts; there are no SnapBridge actions at all. Shortcuts also doesn't have an action to apply hashtags to a batch, and the ability to build a loop (as in "For Each Photo in Import Do...") isn't there. So there's a limit to what we can do with our cameras and Shortcuts at the moment.
Still, Apple discovered a user problem—stringing together multiple apps to create a one-step process—and is trying to solve it. Are the camera companies doing the same? Doesn't seem like it.
So for now I can only dream about the camera that lets me load some variables into the EXIF data of an image (hashtags, caption, destination, resize choice, priority) while shooting, then pushes that over to my phone via something like SnapBridge. Because that phone app would support Shortcuts, I could say "Hey Siri, Output Images with Priority 1." Plus, when I get home, I could say to my computer "Hey Siri, Archive all Images."
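None of this is hard from a software standpoint. Here's a minimal Python sketch of the dream, assuming a hypothetical per-image metadata record; the fields and function names are my invention, since no camera writes these EXIF variables today:

```python
# Hypothetical metadata a camera could embed per image while shooting.
# These fields don't exist in any current camera's EXIF output; they
# only illustrate the workflow described above.
from dataclasses import dataclass, field

@dataclass
class ShotMetadata:
    filename: str
    caption: str = ""
    hashtags: list = field(default_factory=list)
    destination: str = ""       # e.g. "team-chat", "client-email"
    resize: str = "original"    # e.g. "2048px", "original"
    priority: int = 3           # 1 = send right away, 3 = archive only

def output_images(shots, priority):
    """What 'Hey Siri, Output Images with Priority 1' would boil down to:
    filter the imported shots by the priority tag set at capture time."""
    return [s for s in shots if s.priority == priority]

shots = [
    ShotMetadata("DSC_0001.NEF", caption="Winning goal",
                 hashtags=["#soccer"], destination="team-chat", priority=1),
    ShotMetadata("DSC_0002.NEF", priority=3),
]

for shot in output_images(shots, priority=1):
    print(shot.filename, shot.destination)  # here a real app would route to Messages, etc.
```

The filtering is trivial; the missing piece is entirely on the camera side, getting those variables into the image at capture and across to the phone.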
So, to answer the headline's question: no, customers aren't asking for too much, they're asking for the wrong thing.