Live from Adobe Max: a sneak peek at some potential future Photoshop features

Photo: dpreview.com

Disclosure: DPReview is attending Adobe Max, with Adobe covering travel and lodging expenses. This year at Adobe Max, the company held its traditional "Sneaks" show, where engineers show off tech demos of features they've been working on that may eventually make it into production.

As an example, last year it showed off a feature called "Perfect Blend," meant to automatically match lighting and color between layers in Photoshop. This week, it officially launched as a feature called "Harmonize." We had the chance to catch the show live, getting a preview of some of Photoshop's potential future features.

Project Surface Swap

The car was originally a goldish-yellow, but the presenter was able to select it with a single click and change the color using Photoshop's standard tools.

The first Photoshop "sneak" was for something called "Project Surface Swap." It's essentially a more context-aware selection tool; in the demo, the presenter used it to change the color of a car, even though the reflections and shadows on the paint made it difficult to select with Photoshop's traditional selection tools. He also showed off the ability to select a wooden countertop in a photo, despite it being covered with other wooden objects, like a sushi rolling mat and a cutting board.

As the name implies, the tool was also able to load a reference image and use it as a texture, automatically applied with the correct perspective and tiling. When making selections it could take some factors, like varying opacity, into account and, like other selection tools in Photoshop, could be used for creating masks.

Project Light Touch

Project Light Touch is a tool that allows the user to "relight" a scene, imagining what a photo would look like had a light been turned on, rather than left off, or if the lighting had been diffuse instead of harsh sunlight that cast shadows. It also let the presenter create a virtual light that would interact with the entire scene. He, of course, demoed it by adding a virtual candle to a picture of a carved pumpkin, showing it first lighting the outside of the pumpkin, and then lighting it from the inside, taking depth and occlusion into account.

Project Trace Erase

If I were betting, Project Trace Erase is the demo from this year's presentation most likely to turn into a real feature. It builds on the existing Remove Tool, letting the presenter delete an object from a photo with a single stroke, rather than making multiple passes to take out the object, its shadows, and its reflections. Part of the demo involved removing a person walking down a sidewalk, along with their shadow and reflection in a glass wall. Even more impressively, the presenter used it to remove a stove and the smoke it was emitting, a lamp and the light it was casting, and a lens flare that was washing out essentially the entire frame.

The demo showed an impressive amount of contextual awareness. When the presenter selected a person walking in the snow, it also automatically removed the footprints they had left, and it did something similar with a jet skier's wake. Each demo took just a few seconds and didn't require particularly precise selections.

Project Clean Take

On the video side, Adobe showed off a demo called "Project Clean Take," which focused on improving audio in a variety of ways. In the demo, the presenter was able to fix some incorrect dialogue by editing the transcript and then "regenerating" the speech with a convincing simulation of the presenter's voice (despite the source clip being only a few seconds long). They also showed off the ability to completely replace an audio track with a regenerated one, changing the presenter's voice to sound more enthusiastic, or making it sound like they were speaking in a whisper.

The software was also able to take a single audio track and break it into separate components, such as the presenter's voice, background sounds, and music playing in the background.* That not only allowed the presenter to mute or tamp down unwanted noise, but also to replace copyrighted music with something similar-sounding from Adobe Stock, automatically matching the reverb and other acoustic properties of the original clip.

* Fun fact: Peter Jackson (yes, that Peter Jackson) used similar tech on a demo that John Lennon recorded, which is how we got the new Beatles song "Now And Then" in 2023.

Project Frame Forward

There was also Project Frame Forward, which showed off a feature that I've personally dreamed about forever. The presenter took a video, made an edit to the first frame in Photoshop, then brought the edited frame back into the tool. It was then able to apply the change he made to the rest of the frames in the video.

Unsurprisingly, the tool relies on generative AI rather than tracking and masking, which allowed it to handle particularly tricky subjects, like removing a car ripping around a track with smoke billowing from its tires. It was also able to handle multiple edits at once, including replacing the sky, changing white balance, and removing multiple people from the background.

A reminder

Again, these features aren't necessarily coming to Photoshop, Lightroom, or Premiere anytime soon, or even at all. But they do offer an intriguing look at some of the projects Adobe's engineers are working on that may someday become features in the software many of us use.

You can watch the entire presentation on Adobe's YouTube channel.



2025-10-30 05:42