How computational photography affects composition

A recurring theme with computational photography is the tension over creative choices. The “AI” features of cameras and photo-editing applications can take care of many technical aspects, such as how to expose a scene or nailing focus. Do they take too much creative control away from the photographer? (See my last column, “You already use digital photography, but that doesn’t mean you’re not a photographer.”)

Composition seems to sit outside this tension. When lining up a shot, the camera doesn’t physically pull your arms to point the lens at a better arrangement. (If it is pulling your arms around, it’s time to consider a lighter camera or a sturdy tripod!) But software can affect composition in many circumstances, primarily during editing, though in some cases while shooting as well.

Thinking about composition

At first glance, composition seems like the simplest part of a photograph: point the camera at a subject and tap or press the shutter button. Experienced photographers know, however, that choosing where the subject appears, how it’s framed in the viewfinder or on the screen, and even which element is the subject involves more work and consideration. In fact, composing is often the hardest part of capturing a scene.

So where does computer photography fit into this framework?

In some ways, the more advanced computational features such as HDR are the easier ones. The camera, almost exclusively in smartphones, captures several shots at different exposures in a matter of milliseconds and then combines them to build a well-exposed photo. It pulls together a wide range of tonal data and merges it into a single image.
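To get a rough sense of what that merging involves, here’s a minimal sketch using OpenCV’s Mertens exposure fusion. It is nothing like the pipeline inside a phone, and the file names and three-frame bracket are just placeholders, but it shows the basic move of blending bracketed exposures into one frame.

```python
import cv2
import numpy as np

# Placeholder file names standing in for an exposure-bracketed burst.
paths = ["under.jpg", "normal.jpg", "over.jpg"]
frames = [cv2.imread(p).astype(np.float32) / 255.0 for p in paths]

# Mertens exposure fusion weights each pixel by contrast, saturation,
# and how well exposed it is, then blends the frames directly.
fused = cv2.createMergeMertens().process(frames)

# The result is floating point, roughly in [0, 1]; scale back to 8-bit.
cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))
```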

To apply artificial intelligence to composition, the camera needs to understand what’s in the viewfinder as well as you do. Some of that already happens: smartphones can identify when a person is in the frame, and some can recognize common elements such as sky, trees, and mountains. But that doesn’t help determine which foreground object should be prominent and where it should sit relative to other objects in the scene.

Besides, when shooting, the photographer is always the one who controls where the lens points. Or is it? I joked that the camera could pull your arms into position, but that’s not far from what many camera gimbals do. In addition to stabilizing the camera for smooth motion, a gimbal can identify a person and keep them centered in the frame.

Apple’s Center Stage feature automatically keeps the subject in the center of the frame, even as they move. Apple

But back to the camera itself. If we’re in control of the body and where it’s pointed, any computational help has to come from what the lens can see. One approach is to select a composition from the pixels the sensor records. We do this all the time when cropping during editing, and I’ll get to that in a moment. But suppose we want an algorithm to help us figure out the best composition in front of us. Ideally, the camera would see more than what’s presented in the viewfinder and choose a portion of those pixels.

Well, that happens too, in a limited way. Last year Apple introduced a feature called Center Stage in the 2021 iPad, iPad mini, and iPad Pro models. The front camera has an ultra-wide 120-degree field of view, and the Center Stage software reveals only part of it, so it looks like any normal video image. But the software also recognizes when a person is in the frame and adjusts that visible area to keep them centered. If another person enters the scene, the camera appears to zoom out to include them as well. (Another example is the Logitech StreamCam, which can track a single person left and right.) The effect can be a little disorienting, but the movement is quite smooth and will no doubt improve.
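None of this is Apple’s actual implementation, but the general idea can be sketched in a few lines: detect people in the wide frame, compute a crop around them at the output aspect ratio, and ease toward that crop so the virtual camera appears to pan and zoom smoothly. The sketch below uses OpenCV’s stock face detector as a stand-in for real person detection, and it skips details such as clamping the crop to the sensor’s bounds.

```python
import cv2

# Rough sketch of a Center Stage-style virtual camera: find faces in a wide
# frame, then show only a smoothly moving crop around them.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

smoothed = None  # (x, y, w, h) of the current virtual camera crop

def next_crop(frame, out_aspect=16 / 9, alpha=0.1, margin=1.8):
    """Return a crop rectangle that keeps detected faces framed."""
    global smoothed
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, 1.1, 5)
    if len(faces) == 0:
        return smoothed  # nobody found: hold the last framing

    # Bounding box around every face, padded so heads aren't clipped.
    x0 = min(x for x, y, w, h in faces)
    y0 = min(y for x, y, w, h in faces)
    x1 = max(x + w for x, y, w, h in faces)
    y1 = max(y + h for x, y, w, h in faces)
    cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
    h = max((y1 - y0) * margin, (x1 - x0) * margin / out_aspect)
    w = h * out_aspect
    target = (cx - w / 2, cy - h / 2, w, h)

    if smoothed is None:
        smoothed = target
    else:
        # Exponential smoothing: the crop glides rather than jumping.
        smoothed = tuple(s + alpha * (t - s) for s, t in zip(smoothed, target))
    return smoothed
```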

Composition in editing

Apple Photos on the iPhone 13 Pro displays an Auto button when it detects people in the image (left). Auto-cropping is applied to the image on the right. Jeff Carlson

You’ll find more auto-framing options in editing software, but the results are more hit and miss.

In the macOS Photos app, clicking the Auto button in the crop interface only applies the straightening, which in this case rotated the image too far to the right. Jeff Carlson

The concept reflects what we do when we sit down to edit a photo: the app analyzes the image and detects objects and scene types, then uses that information to propose alternative compositions by cropping. Most of the apps I chose as examples use faces as the basis for recomposing; Apple’s Photos app presents an Auto button in the crop interface only when a person is present or when an obvious horizon suggests the image needs to be straightened.
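To make that concrete, here is a toy version of the geometry such a feature might use: given a detected face, suggest a crop that places it near a rule-of-thirds intersection. The function and the numbers in the example are purely hypothetical; real tools weigh far more than a single face.

```python
# Toy auto-crop geometry: given the frame size and a face box from whatever
# detector the app uses, suggest a crop that puts the face near a
# rule-of-thirds intersection.
def suggest_crop(frame_w, frame_h, face, crop_scale=0.8):
    """face is (x, y, w, h); returns (left, top, width, height) of the crop."""
    cw, ch = int(frame_w * crop_scale), int(frame_h * crop_scale)
    fx, fy = face[0] + face[2] / 2, face[1] + face[3] / 2  # face center

    # Aim the face at the nearer vertical third line and the upper third line.
    third_x = cw / 3 if fx < frame_w / 2 else 2 * cw / 3
    left = min(max(fx - third_x, 0), frame_w - cw)
    top = min(max(fy - ch / 3, 0), frame_h - ch)
    return int(left), int(top), cw, ch

# Example: a 6000x4000 frame with a face detected at (3600, 900, 400, 450)
# (hypothetical numbers) yields a crop keeping that face on the right third.
print(suggest_crop(6000, 4000, (3600, 900, 400, 450)))
```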

Luminar AI did a good job cropping out the sign at the right. Jeff Carlson

I expected better results from Luminar AI using its AI Composition tool, because Luminar AI’s main selling point is that it analyzes each image as you start editing to determine its content. It did a good job with my landscape test photo, keeping the car and driver in the frame along with the rocks at top left and removing the sign at bottom right. In the portrait (below), however, it did the opposite of what I hoped, keeping the woman’s distracting fingers visible at the right and cutting off her hair at the left.

Luminar AI appears to have emphasized each face along the rule of thirds guides, but kept the distracting fingers to the right. Jeff Carlson

I also ran the two test images through Pixelmator Pro on macOS, since Pixelmator (the company) has been aggressive about incorporating machine learning-based tools into its editor. It framed the two women in the portrait (below) well, although my preference would be not to crop as tightly as it did. On the landscape shot it cropped only slightly, which improved the photo only slightly.

Pixelmator Pro’s crop emphasizes the women’s faces, although I would shift the crop boundary to the left for better balance. Jeff Carlson

Adobe Lightroom and Photoshop don’t include an automatic cropping feature, but Photoshop Elements offers four suggested compositions in the tool options bar when the Crop tool is selected. However, it’s not clear what the app uses to come up with these suggestions, as they can be quite hit-or-miss.

Photoshop Elements also offers automatic cropping, but the results can be unpredictable. Jeff Carlson

Comp-u-sition? (No, that’s a terrible term.)

It’s important to note here that these are all starting points. In each case, you’re presented with a suggestion that you can then adjust manually by dragging the crop handles.

Perhaps that’s the lesson to take away today: computational photography isn’t just about letting the camera or software do all the work for you. It may give you options or save you time, but at the end of the day, the choices you make are yours. Sometimes seeing a bad crop suggestion can help you figure out which elements in the photo are working and which need to go. Or you can apply an automatic adjustment, disagree with it (perhaps vehemently), and craft your own composition.
