...
Adobe Express and Photoshop showing AI assistant panels with prompt-to-edit, Firefly layers, and a Premiere Pro object mask.

“Talk to Edit”: Photoshop Gains AI Assistant as Adobe MAX Showcases Firefly 5


Adobe has introduced new AI assistants in Express and Photoshop, embedding them directly into daily workflows rather than bolting them on. The company positioned the release as a step toward conversational editing that shrinks the distance between idea and export. Executives also highlighted early experiments that connect assistants across apps and route projects from chat into design surfaces.

The Express assistant introduces a distinct "assistant mode." Editors can switch into a text-prompt interface to generate images and page layouts, then return to standard controls for precise tweaks. The design targets students and professionals who prefer drafting with language before refining with knobs and sliders. The toggle preserves control while keeping AI within reach of novice users.

READ ALSO: How to Safely Download Adobe Creative Cloud and What’s Included

Adobe AI assistants, Photoshop Express

The Photoshop assistant arrived in a closed beta. Adobe docked it in the standard sidebar so creators keep the canvas in view. The assistant reads layers, understands object boundaries, and suggests actions. Requests can cover routine tasks: select an object, apply a mask, remove the background, swap colors, or clean up an image. Product managers said the goal is to end hunt-and-click workflows, and stressed that the assistant's changes are not merely cosmetic but fully reversible within Photoshop's layer stack.

Adobe reiterated that the assistants are guidance inside the apps, not a detached chat window. In Express, the company chose a mode that surfaces prompts across the full screen. In Photoshop, the assistant stays in a side dock. Leaders said these decisions reflect different user behaviors.

Express users usually start from templates and social formats. Photoshop users want granular control over layers and masks. Both assistants try to read the context of the existing document so their suggestions are grounded in it.

Adobe AI assistants in Photoshop and Express enabling prompt-based edits and quick exports.

READ ALSO: PayPal Jumps Into ChatGPT Checkout—Turning AI Chats Into One-Tap Shopping


Firefly upgrades and Premiere

Adobe's generative platform also expanded. The company said Firefly can now edit images in layers with finer region-level control. It added native sound and speech generation to the studio so a storyboard can carry tone and timing.

For video creators, Adobe added an AI-powered object mask to Premiere Pro. An editor can select a person or an object and apply effects or color edits that track it through a clip. The company framed these features as building blocks for fast-to-publish pipelines.

Adobe described early cross-assistant coordination under a program called Project Moonlight. According to product leadership, Moonlight aims to route tasks across assistants in different apps and match the outputs to a creator's style. The company said it is in private beta. It added that early prototypes understand brand markers and social-channel patterns so assistants can adjust formatting and tone while preserving visual identity.

READ ALSO: YouTube, Twitch, Vimeo — Three Platforms, Three Philosophies, One Creator Dilemma

ChatGPT handoffs and models

Adobe also outlined a path to connect Express with ChatGPT using OpenAI's app integrations API. In that model, users would generate a design inside a chat session and open it in Express for edits or export. The company described this as exploratory work. Teams want to test whether chat-initiated assets lower friction for creators who brainstorm in conversational tools. The plan complements the app-native assistants rather than replacing them. The company also opened Photoshop's generative fill to third-party models.

Product managers cited support for Google’s Gemini 2.5 Flash and Black Forest Labs’ FLUX.1 Kontext Pro. Editors can choose a model for fill tasks such as removing objects or extending a scene. Adobe said choice matters for enterprise and agency teams that need specific visual characteristics or performance trade-offs. The approach keeps Photoshop as the editing surface while models act as interchangeable engines.

Diagram showing ChatGPT handoff to Adobe Express with model selection in Photoshop.

READ ALSO: Gemini Is Google’s First All-in-One AI Assistant — And It’s Changing Everything

Control, memory, and trust

Leadership highlighted performance and control as recurring themes. Express's assistant mode promotes rapid drafting but does not lock users into the output. They can switch back, refine type and spacing, and rework layers using the standard controls.

The Photoshop sidebar assistant advises rather than overrides. It returns selections and masks, and creators decide how to compose, blend, or grade. Adobe called this "assist and hand back," not replacing the editor.

Workflow memory is also part of Adobe's generative roadmap. The company said assistants will learn from the steps users repeat over time. If a team frequently removes product backgrounds and exports a square crop, the assistant can propose that sequence.

When a social team has a set color profile and caption style, assistants can pre-format assets on export. Managers say the aim is to compress routine patterns without hiding the tools professionals work with.

READ ALSO: Major Tablet Brand Breaks Tradition: Android-Powered Model Coming in 2026

Adoption patterns and roadmap

The company positioned these releases as agentic workflows that remain familiar. Assistants read the document, propose actions, and perform short runs of tasks; editors still decide. According to Adobe, early reactions from internal and external testers indicated time savings on repetitive work.

The company plans to expand the betas, gather usage data, and refine prompts and hand-offs against real-world projects. Adobe leaders also noted that users will adopt the products at different rates. Students may draft pages of work in Express's assistant mode. Social teams could use Moonlight to route concept, copy, and layout while staying on brand.

