The recent photokina show was part photo show, part educational forum, part carnival and part tech fest for photographers. Along with all the new advances in focusing and viewing systems, cameras and in-camera processors came a new lingo—much of it the product of some copywriter’s feverish imagination, but made up nevertheless of terms that help codify the new advances you’ll see coming soon.
I have never been a fan of, or a convert to, electronic viewfinders (EVFs), the kind found in all mirrorless cameras. I never saw the image as sharp enough; it was subject to blooming (flashes of whiteout when you move around the scene and hit a highlight) and, worst of all, it showed confounding lag and smearing when you changed framing or, heaven forbid, had something active taking place in front of you.
The reason for all this, of course, was that the EVF showed a “signal” of the image and not a reflection (à la DSLRs), and that, at heart, the resolution was low and the processor in the camera couldn’t keep up with the stream of information it was being asked to handle. It all added up to a WYSINWYG imaging experience (what you see is not what you get).
Well, at least we now have a euphemism for this condition: “image latency.” The reason we have a term for it now is that many camera makers have finally admitted to it and gone for the cure—faster image processing via next-generation processors and much higher resolution screens. The RGB dot number (resolution) has risen to as much as 2 MP in some cases, and there’s now also a measurable stat for the lag, which in some cameras, like the Samsung NX1, reads out as a 5-millisecond “latency.” Will testers now include this in their interminable stats, and will we see it in camera company tech specs on their websites? Perhaps, but at least those EVFs promise to be more tolerable as ways to compose an image.
Electronic Rangefinder Focusing
In the days of old, rangefinder focusing presented two images superimposed on one another in the viewfinder. To focus, you looked into the finder and turned the focusing ring until the two merged into one. In fact, this was a very precise way to focus. Admittedly, it took a bit of getting used to, but a generation or more of great photographers used it (think Bresson, Capa, Eisenstaedt) and got very sharp shots, all without autofocus or focus peaking!
The new Fujifilm X100T offers a version of this through the camera’s large eyepiece finder, but it works differently from rangefinders of the past by splitting the image electronically and moving a portion of the focused area into the lower right-hand portion of the frame. You then manually focus the lens to get that view sharp. For me, it beats most manual focus setups in almost all mirrorless cameras.
Take a look at the back end of a lens and you’ll see a bunch of contact points that mesh with the camera body. These are electronic connections that carry signals back and forth between the camera’s processor and the lens itself. They’re how you can change aperture without touching the lens, read distance settings, use autofocus and, in Program exposure mode, have the processor warn you when your shutter speed and the focal length of the mounted lens might cause camera shake.
New developments at the show included new handshakes between the lens and the image processor, faster autofocus with wider frame coverage and, probably most interesting, the enhancement of what is called predictive focus—the ability to track a moving subject and keep it in focus as it moves through the frame.
The phraseology around all this varied but added up to the same thing. Sony’s “4D Focus System” is said to provide “constant focus throughout space and time.” Samsung’s 3D AF in the new NX1 uses 205 phase-detection points (153 of them cross-type) and 209 contrast-detection areas, which the company tells us offers predictive phase-detection AF throughout the entire imaging area. In other words, if it’s in the viewfinder, the camera’s AF can grab it.
The Samsung NX1 is a new breed of camera that offers “complete” AF control throughout the entire viewfinder, thanks to numerous AF sensors.
Acquisition time refers to the speed with which the system can activate and complete autofocusing, and while this time has dropped dramatically in recent years, acquisition keeps getting faster still. Fujifilm claims a 0.06-second acquisition time for its X30, and Samsung’s NX1 reports a fleeting 0.05 seconds. To bolster that claim, Samsung has added an “Auto Sport” mode that it says can track a pitched ball and the swing of a bat so the camera can grab the moment of impact (should the batter connect and both the oncoming ball and the batter be within the frame).
Lens “profiling” is a term used to define possible “defects,” such as chromatic aberration (color fringing caused by different wavelengths of light not focusing on the same plane). More expensive lenses may include “achromatic elements” to help reduce this, but it may still show up. Software like Photoshop has built-in profiles that can process this away, to an extent, but now Samsung has added what it calls OIC (optical inverse correction) to its lenses: essentially, a mounted lens’s profile is fed automatically to the image processor in the camera, which corrects any potential aberrations in-camera and seems to eliminate the need to do so in post-processing.
In the early, early days of digital imaging, every shot was essentially a frame grab, but now it’s back to the future with the newly touted “4K Photo,” wherein, as explained at Panasonic’s press event, you can pause at the perfect moment during your 4K video playback and make a still frame grab. It’s suggested that you shoot at 1/8000 second so you don’t miss a millisecond of the action and then take the time to search the playback for your personal decisive moment.
The resultant still image is 8.3MP (24MB for printing fans), but wait…there’s more. When 8K video shows up (around 2020, says the company), you’ll be able to grab a 33MP (100MB) frame.
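The arithmetic behind those figures is easy to check. Here’s a quick back-of-the-envelope sketch that derives the megapixel counts from the standard UHD frame dimensions; note that the 3-bytes-per-pixel uncompressed RGB sizing is my own working assumption, not a published Panasonic spec:

```python
# Back-of-the-envelope check of the frame-grab math:
# megapixels = width x height; an uncompressed 8-bit RGB file runs
# roughly 3 bytes per pixel (sizing assumption, not a camera spec).

def frame_grab_size(width, height, bytes_per_pixel=3):
    """Return (megapixels, approximate uncompressed size in MB)."""
    pixels = width * height
    return pixels / 1e6, pixels * bytes_per_pixel / 1e6

mp_4k, mb_4k = frame_grab_size(3840, 2160)   # UHD "4K" video frame
mp_8k, mb_8k = frame_grab_size(7680, 4320)   # UHD "8K" video frame

print(f"4K grab: {mp_4k:.1f}MP, ~{mb_4k:.0f}MB uncompressed")
print(f"8K grab: {mp_8k:.1f}MP, ~{mb_8k:.0f}MB uncompressed")
```

Run as-is, this works out to roughly 8.3MP (about 25MB uncompressed) for a 4K frame and 33.2MP (about 100MB) for an 8K frame, in line with the figures quoted above.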
So, with more cameras now delivering 12 frames per second, and the 4K image grab deal, it seems all you need do is point the camera in the general direction of what’s in front of you and then sort it all out later to figure out what the essence of the moment was, or is now. To me, this misses most of the point, and lots of the fun, of photography, but it’s there if you want it.
And finally, you might want to check the height of the air rights on your deed. A whole slew of new camera-carrying lightweight choppers and drone-type vehicles is coming on the market, and remote-control flight has sent, and will send, them to places previously unimagined to make images like we’ve never seen before.
- A new class of lightweight remote-controlled “photocopters” opens a new way of making videos and stills. Note the tiny camera on board. © George Schaub
The videos and stills on display at the show were trippy and dream-like in some instances, although the commercial uses seem to be mainly shots of Yuppies cruising down Highway 101 in a convertible, albeit from a fascinating and never before seen point of view. Despite that, these lightweights will open up a new class of imaging and a new aspect of the trade for adventuresome photographers. And all you moms who thought that junior was wasting his life away by spending all his time in the basement playing video games will find that he now has a highly marketable talent.