I’ve always been fascinated by anamorphic lenses, which optically compress, or 'squeeze', an image in the horizontal dimension, making it possible to capture an artificially wide field of view on a standard film frame or sensor.
I first discovered anamorphics in college, not because I shot with them but because I had a part-time job as a projectionist at a small theater. Sometimes films came through in anamorphic format and I had to attach accessory lenses to the projector to desqueeze the image projected onto the screen.
Fast forward a number of years. I’m still fascinated by anamorphic lenses, only now they’re becoming accessible enough to content creators that you don’t need to be a Hollywood filmmaker to afford them. One of these days I’ll get around to shooting an entire video project with anamorphics, but recently I’ve been intrigued by the possibility of using anamorphic lenses for still photography.
Which is why, on a recent trip to Washington, DC, I found myself carrying no camera gear except for my iPhone 11 Pro and two small anamorphic accessory lenses. I'd been in a creative rut for a while and needed a diversion, so I resolved to shoot in anamorphic for the entire trip. It turned out to be a fun creative challenge.
Shooting anamorphic on a smartphone
The two lenses I used for this little experiment were the Moment anamorphic lens ($150) and the Moondog Labs anamorphic lens (also $150), each of which compresses the horizontal dimension by a factor of 1.33x. Both employ a simple twist-lock M-series bayonet mount (not to be confused with Leica M-mount) and attach to compatible cases from a number of manufacturers including Moment, RhinoShield and Sirui.
The Moondog Labs (L) and Moment (R) anamorphic lenses. Both squeeze the image horizontally by a factor of 1.33x, which is what makes the exit pupils appear oval in this image.
These lenses are primarily aimed at video shooters. When used with standard 16:9 format video they deliver a desqueezed aspect ratio of 2.35:1, about the same as CinemaScope, a widescreen cinema format originally developed in the 1950s.
Shooting still photos, however, requires some creative choices. The native aspect ratio for photos on most smartphones is 4:3, so a 1.33x desqueeze works out to an aspect ratio of almost exactly 16:9.
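A quick back-of-the-envelope check shows why those numbers line up so neatly (the `desqueezed_ratio` helper below is just for illustration):

```python
# Desqueezing stretches only the horizontal dimension by the squeeze
# factor, so the aspect ratio scales by the same amount.
def desqueezed_ratio(width, height, squeeze=1.33):
    return (width / height) * squeeze

# 4:3 stills with a 1.33x lens land almost exactly on 16:9 (~1.78):
print(round(desqueezed_ratio(4, 3), 2))   # 1.77

# 16:9 capture with the same lens yields the ~2.35:1 CinemaScope look:
print(round(desqueezed_ratio(16, 9), 2))  # 2.36
```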
Of course, if you can already shoot in 16:9, why bother? Because anamorphic lenses provide a qualitatively different look than simply cropping the frame. You're effectively using a longer focal length but capturing the horizontal field of view of a shorter focal length, giving you more control over depth of field than you would typically have at that shorter focal length. Additionally, anamorphic lenses produce some distinctive optical effects, such as oval bokeh and horizontal lens flare.
Anamorphic lenses provide a qualitatively different look than simply cropping the frame.
Of course, when working with a smartphone you would need to be pretty close to your subject to have any appreciable control over depth of field or to generate much bokeh, but there's certainly the opportunity to create horizontal lens flare.
In the end I settled on a hybrid approach: shooting 16:9 in combination with a 1.33x anamorphic lens, which results in that wide 2.35:1 CinemaScope look.
Shooting in anamorphic
Almost as soon as I began shooting I realized there were more choices to make. Should I shoot Raw or JPEG? Would it be better to use the iPhone's built-in camera app or a third party app designed for anamorphic lenses? Let the experimentation begin!
Use the slider to compare the desqueezed image (L) with the squeezed image (R). The desqueeze process can be performed automatically by an app, or in post-processing with a program like Photoshop.
The built-in camera app was the easiest way to get started, and ensured that I was taking advantage of all the wizardry of the iPhone's computational photography. However, there was one downside: there's no way to desqueeze the image in-camera. The image is always compressed horizontally, so you need to pre-visualize what the desqueezed photo will look like when framing a shot.
It's not difficult, but it's still not as natural as viewing a desqueezed image in real time, so I tried a couple of third party apps designed to do just that: Filmic Firstlight (iOS, Android) and Moment Pro Camera (iOS).
Both are feature-rich photography apps that display a desqueezed image preview when shooting and include useful tools like manual controls, focus peaking, zebras, Raw image capture and the ability to export TIFF files.
The Moment Pro Camera app provides a real time desqueezed image preview, making it easier to compose photos.
The most noticeable difference I found is that the Moment app obscures parts of the image behind various camera controls, whereas the Filmic app does not. As a result, I slightly preferred the Filmic app, but beyond that one issue they offer similar feature sets. They're both good apps, and which one you choose will mostly come down to personal preference.
The Filmic Firstlight app provides similar functionality to the Moment app, but doesn't obscure your image behind the camera controls.
Workflow and image quality
The workflow is far easier with third party apps since you can see what your final image will look like when shooting, and photos are desqueezed before being saved to the camera roll: no additional work required.
In contrast, photos shot using the built-in camera app require an additional processing step to desqueeze them. It was easy enough to create a Photoshop action to do this in bulk, but it meant a little extra work and some delayed gratification.
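If you'd rather script that desqueeze step than build a Photoshop action, the operation is just a horizontal resize by the squeeze factor. Here's a minimal pure-Python sketch using nearest-neighbor sampling; the `desqueeze_row` and `desqueeze_image` functions are hypothetical illustrations, not code from any of the apps mentioned, and a real tool would use an imaging library with bicubic resampling for smoother results:

```python
def desqueeze_row(row, squeeze=1.33):
    """Stretch one row of pixels horizontally by the squeeze factor
    using nearest-neighbor sampling (a crude stand-in for the bicubic
    resampling an editor like Photoshop would apply)."""
    new_width = round(len(row) * squeeze)
    # Map each output pixel back to its source pixel in the squeezed row.
    return [row[int(x / squeeze)] for x in range(new_width)]

def desqueeze_image(rows, squeeze=1.33):
    """Apply the horizontal stretch to every row of a pixel grid,
    leaving the vertical dimension untouched."""
    return [desqueeze_row(row, squeeze) for row in rows]
```

Applied to a 4032-pixel-wide iPhone frame, this yields a roughly 5363-pixel-wide desqueezed image; that stretch is where resampling artifacts creep in.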
The workflow is far easier with third party apps since you can see what your final image will look like when shooting.
After experimenting with various combinations of app, file format and desqueeze methods, I learned some useful things:
The Filmic and Moment apps are more fun to shoot with thanks to real time previews of the anamorphic image. It's more intuitive and you don't need to imagine what the final shot will look like. They also make it easy to share photos in the moment instead of waiting until later.
Anamorphic accessory lenses allow you to capture classic anamorphic characteristics like horizontal lens flares.
iPhone 11 Pro with Moment anamorphic lens.
For the most part, desqueezed Raw images didn't look any better than JPEGs from the iPhone's native app, even after being stretched out. I expected this for photos taken in low light, since the native app can do some exposure stacking, but it turned out to be true in most of the comparisons I tried.
Images captured with the native iPhone app and desqueezed in Photoshop generally looked a tiny bit better than the files from the Filmic and Moment apps. It's possible those apps don't have access to quite the same computational wizardry as the native app, or it might just be that Photoshop does a better desqueeze.
Either way, the differences aren't significant. As a result, I often found myself using the third party apps for a more enjoyable experience.
The greatest limitation on image quality is the lenses themselves. They're really intended for video use, so it feels a bit unfair to judge them critically as still lenses. Keeping that in mind, you're going to see flaws that wouldn't be nearly as noticeable in a moving image.
iPhone 11 Pro with Moondog Labs anamorphic lens.
Overall, the Moondog and Moment optics performed similarly; as with any accessory lens, neither provides the level of optical clarity found on your smartphone's built-in lenses. Once you add a desqueeze step that stretches the image horizontally, you're going to start seeing artifacts. In fact, if you pixel peep the images in this article you'll almost certainly be disappointed.
Final thoughts
None of the anamorphic photos I shot with these lenses will win awards for technical image quality, but that really wasn't the point of the experiment. Using them forced me to think differently about the way I composed and framed shots, and that's always a good creative exercise.
Ignoring the optical limitations of the lenses for still photography, I really like the wide, cinematic aspect ratio. I was also pleased that I was able to produce at least one of the distinctive characteristics anamorphics are known for: horizontal lens flare.
Now, couldn't you just use the widest angle lens on your smartphone and crop to 2.35:1? Of course you could, but it won't look quite the same. You'll often hear cinematographers talk about the character of a particular lens instead of how technically perfect it is, and even on a smartphone these anamorphic lenses result in a different look than you'll get by cropping. Is it technically perfect? Definitely not, but it can be a lot of fun to visualize the world in a slightly different way.
iPhone 11 Pro with Moment anamorphic lens.
What this experience taught me is that I want to shoot more photos using anamorphic lenses. It's not something that a lot of people do, but it challenges your creativity and presents an opportunity to create unique images. For my next experiment, I'm planning to kick it up a notch and pair a larger anamorphic lens with a mirrorless camera. That should allow me to take better advantage of unique anamorphic characteristics related to depth of field.
Want to try this yourself? It's a fun experiment that you can do on your own. All you need is an anamorphic accessory lens and a case with a compatible mount. In addition to the Moondog and Moment lenses I tried, there are similar lenses available from Sandmarc, BeastGrip and Ulanzi, and cases from Moment, RhinoShield and Sirui. If you give it a try, let me know how it works and send me a link to your photos!
View the full anamorphic sample gallery
dpreview.com, 2020-4-3 17:00