Adobe’s latest AI experiment generates music from text

This week, Adobe revealed an experimental audio AI tool to join the image-based generative tools in Photoshop. Described by the company as “an early-stage generative AI music generation and editing tool,” Adobe’s Project Music GenAI Control can create music (and other audio) from text prompts, which users can then fine-tune in the same interface.

Adobe frames the Firefly-based technology as a creative ally that — unlike generative audio experiments like Google’s MusicLM — goes a step further and skips the hassle of moving the output to external apps like Pro Tools, Logic Pro or GarageBand for editing. “Instead of manually cutting existing music to make intros, outros, and background audio, Project Music GenAI Control could help users to create exactly the pieces they need—solving workflow pain points end-to-end,” Adobe wrote in an announcement blog post.

The company suggests starting with text inputs like “powerful rock,” “happy dance” or “sad jazz” as a foundation. From there, you can enter more prompts to adjust the track’s tempo, structure and repetition, increase its intensity, extend its length, remix entire sections or create loops. The company says it can even transform audio based on a reference melody.

Adobe says the resulting music is safe for commercial use. It’s also integrating its Content Credentials (“nutrition labels” for generated content) to be transparent about the music’s AI-assisted origins.

“One of the exciting things about these new tools is that they aren’t just about generating audio—they’re taking it to the level of Photoshop by giving creatives the same kind of deep control to shape, tweak, and edit their audio. It’s a kind of pixel-level control for music,” Adobe Research scientist Nicholas Bryan wrote.

The project is a collaboration with the University of California, San Diego, and Carnegie Mellon University’s School of Computer Science. Adobe’s announcement emphasized Project Music GenAI Control’s experimental nature. (Adobe didn’t reveal much of the interface in its demo video, suggesting the tool may not have a consumer-facing UI yet.) So you may have to wait a while before the feature (presumably) makes its way into Adobe’s Creative Cloud suite.


