Designing the Future with Adobe Firefly: A Revolution in Design Technology

A photo of a canvas bag that has a hummingbird on it with words underneath that say #AdobeFirefly

Adobe’s Relationship with Artificial Intelligence

Adobe has dabbled in artificial intelligence (AI) for over a decade. Heck, a ton of their products are built on AI capabilities. Adobe Photoshop's magic eraser tool, clone tool, and more are all examples of AI integrations within Adobe.

Collectively known as Adobe Sensei, these integrations have been around for years to help us as creatives, well, create. Sensei has been using AI and machine learning to make creating easier. Now, with the newest revolutions in generative AI technology, we're getting a step up in these integrations.

This past week I got to take a deeper look into Adobe Firefly at Adobe’s exclusive in-person event in New York City.

Principal Director, Creative Cloud Evangelist Terry White, Product Marketing Lead for Adobe Live Clara Galán, Senior Staff Designer Kelly Hurlburt, and Experience Designer II for AI/ML Veronica Peitong Chen introduced Adobe Firefly to the crowd.

What is Adobe Firefly?

Adobe's website currently describes Firefly in one sentence: "Experiment, imagine, and make an infinite range of creations with Firefly, a family of creative generative AI models coming to Adobe products."

Right now, Adobe Firefly is still in Beta testing (available in English only), open only to those admitted from a waitlist. Luckily, I gained early access by attending Adobe's exclusive event, and have had the chance to play around with it.

Currently, Adobe Firefly has three options for creators to choose from:

1. Text to Image. This feature allows creators to input a detailed text description, and Adobe Firefly will generate an image.

In the example below, I gave the prompt, "interior design living room with a lot of plants." Within seconds, Adobe Firefly had generated four images for me. I then had the opportunity to customize the results: changing the aspect ratio, the content type (photo, graphic, or art), the style of the image (synthwave, neon, chaotic, etc.), the lighting, and more.

Once the image was to my liking, I could download it as a PNG, and I was all set. This simple prompt produced beautiful, realistic work within minutes.

A screenshot of Adobe Firefly's text to image features. There are four photos of tan couches on the image with lots of green plants around them.

2. Text Effects. Creators can apply unique styles to text and words by inputting a detailed description.

For my test, I entered the description, "spring flowers, plants, greenery, and trees," explaining what I wanted to see inside the word I had typed. Within seconds, those exact elements appeared within my letters.

I then had the ability to further customize my word, choosing how tightly or loosely I wanted the effect to fit the letters, along with the font, color, and more.

A screenshot of Adobe Firefly's text effects feature. The word "Ephemia" is on screen with depth effects added with flowers and greenery around the letters.

3. Recolor Vectors. Released just this week on April 20, creators can now generate different color variations for previously created vector artwork based on a detailed text description.

First you upload an asset or SVG file, then write a description of the color tones you would like to see in the variations.

A screenshot of Adobe Firefly's recolor vectors page prompt that reads "generate color variations of your vector artwork"
A graphic with lots of shapes and colors
The same graphic with lots of shapes and colors, this time with a coral reef color theme

The Future of Firefly

Despite having these three amazing features, this is only the beginning for Adobe. The team expressed their great excitement in new and upcoming features.

Inpainting is an upcoming feature that will allow users to change parts of images based on detailed text descriptions. Have an astronaut on the moon and want to add a plant next to them? Select the area next to them, type a detailed description, and the next thing you know, there's a plant!

A photo of a tv screen where Adobe is demoing their new Inpainting feature

3D Image Composer is another upcoming feature that will enable 3D rendering, particularly for architectural design.

A photo of a tv screen where Adobe is demoing their new 3D image composer feature

Lastly, Adobe ended the evening by showing us a video of their hopes for Adobe Firefly for Video. Want to automatically add b-roll to a video? Add colorful captions based on automatically transcribed audio? Add sound effects? Adobe hopes to make all of this and more possible with its coming-soon video integrations.

How to Access Adobe Firefly

Right now, Adobe Firefly is only available to Beta users who are approved. You can be added to the waitlist by visiting firefly.adobe.com. Additionally, interested users can join the Adobe Firefly Discord and chat with users from around the world.

The Adobe team sees these features being integrated into Adobe's existing apps in the future; for example, the video capabilities landing in Premiere. According to the team, the first rollout is expected to come to Photoshop and expand from there.

While there is no date, Terry White confirmed that this is all coming “soon.”

Artificial Intelligence: Concerns?

One of the biggest concerns we have heard is whether or not this new technology will replace designers and their field of work. As Adobe puts it, AI opens creative possibilities: “AI and machine learning are here to handle the time-consuming parts of your job that can easily be automated, so you have more time to be creative.”

Adobe Firefly, as the team explained, is here to help with creativity, not replace it. Trained on Adobe Stock images, openly licensed content, and public domain images, Adobe Firefly uses content that Adobe has the rights to, aiming to curb one of creators' biggest fears: that someone else's work is being used without the proper rights.

Terry White stated that Adobe is “doing our best to make sure [content is] for commercial use.”

Although a daunting step forward, these tools are revolutionary – and it is only the beginning.
