Adobe has officially kicked off its annual Max design conference, showcasing a sweeping lineup of new AI tools and updates across Photoshop, Premiere Pro, Lightroom, and Express.
The company’s Firefly AI platform took center stage as Adobe announced creative assistants capable of generating soundtracks, voiceovers, and even editing entire projects through simple text prompts.
Photoshop and Premiere Pro get smarter
Adobe’s flagship apps—Photoshop and Premiere Pro—are receiving significant AI upgrades designed to speed up the creative process. Photoshop’s Generative Fill feature now supports third-party models like Google Gemini 2.5 Flash and Black Forest Labs’ Flux.1 Kontext, alongside Adobe’s native Firefly image model.

This means users can instantly generate or replace objects, backgrounds, and even people within images while comparing outputs from multiple AI sources for more creative control. Premiere Pro, meanwhile, benefits from tools that automate tedious editing tasks—making it easier to cut, color-correct, and refine projects with minimal manual work.
AI audio tools bring soundtracks
Another highlight from Adobe Max 2025 is the debut of Generate Soundtrack and Generate Speech, two new Firefly AI tools designed to revolutionize video production.
The Generate Soundtrack tool—now in public beta—analyzes uploaded videos and automatically composes background music that syncs to the footage’s tone and pacing. Users can guide the mood by choosing from presets like lofi, hip-hop, EDM, or classical, or write prompts describing the vibe they want—such as “sentimental” or “aggressive.”

Meanwhile, Generate Speech enables users to create realistic voiceovers in seconds, turning written scripts into narrations that match the video’s style and emotion. Both tools are part of a new, web-based Firefly app that integrates AI features with a simplified editing timeline.
AI assistant for instant design edits
Adobe Express, the company’s lightweight cloud-based design platform, is also getting a major AI boost. A new AI Assistant—now in public beta—lets users edit or create projects simply by describing what they want.
Accessible via a toggle in the app’s top-left corner, the assistant replaces traditional design tools with a chat-style interface. Users can type requests such as “create a retro poster for a school science fair” or “edit this photo to match a fall wedding theme”, and the assistant automatically generates or modifies the design accordingly.
This feature aims to make content creation accessible to users of all skill levels, especially those unfamiliar with professional design software.
Project Moonlight
Adobe also previewed Project Moonlight, an AI-driven “creative director” for social media campaigns. This experimental Firefly-powered agent connects to a creator’s Adobe apps and social media accounts, helping them brainstorm ideas, edit visuals, and generate posts aligned with their unique brand voice.
By processing user descriptions through Adobe’s AI editing tools, Moonlight produces customized visuals, videos, and copy for platforms like Instagram, TikTok, and X. Adobe says this project reflects its broader vision of AI-assisted creativity, where intelligent agents work as collaborative partners rather than replacements for human designers.
As is tradition, Adobe’s “Sneaks” segment will showcase experimental technologies still in development. These previews often hint at future features—like last year’s Project Perfect Blend, which evolved into Photoshop’s Harmonize compositing feature. Attendees can expect more prototype demos that highlight the next frontier in AI-enhanced creativity.