Is it art, or is it stealing work? Album cover designers stare down an AI future

Finger Eleven guitarist James Black is pictured with artwork for the band’s singles that he created using AI, in Toronto, on Monday, June 2, 2025. THE CANADIAN PRESS/Laura Proctor

TORONTO — Finger Eleven guitarist James Black has picked up a new instrument, one that pushes the boundaries of his visual imagination — generative artificial intelligence technology.

The Toronto musician and graphic artist admits it's a controversial choice, but over the past year, he's been using the tool to help design his band's new album covers. Each one showcases grand concepts, stunning imagery and ultimately a piece of art that demands attention in an era where all musicians are jostling to stand out.

"We're in the blockbuster age where people like to see big, big things," Black says from his office.

"Whenever I have an idea, it’s usually something beyond what we have the resources to do, and AI means you don’t have to put a lid on those ideas.”

His work usually starts with typing a few descriptive words into AI software and collecting the images it spits back out. Then, he uses photo editing to fine-tune his favourites so they fit his original vision. Sometimes, he submits those altered images back into the AI to generate more ideas.

"There's quite a bit of back-and-forth where you're applying your own skill and then putting it back in," he said.

"It's a little bit like arguing with a robot. You have to nuance it into doing what you want."

One of his first experiments was the cover artwork for Finger Eleven's 2024 single "Adrenaline."

The illustration shows a curvaceous woman in a skin-tight red-and-white racing suit, her head concealed under a motorcycle helmet. She's standing in the middle of a racetrack with her back to the viewer. A cloudy blue sky imparts an otherworldly calm.

Anyone who's seen recent AI artwork will probably recognize the hyperrealistic sheen of its esthetic. Other familiar AI trademarks are there too, including a landscape firmly rooted in a dream world.

Generative image models are trained on billions of photographs to learn patterns, such as recurring shapes and styles. They then use that information to construct images that can often seem familiar. Many fear that the tools also draw from copyrighted pieces without permission from their creators.

It's a legal quagmire that only scratches the surface of the ethical debate around generative AI models. Beyond the copyright risks, critics fear the technology will cost album cover designers and photographers their jobs.

But AI programs such as NightCafe, Copilot and Adobe Firefly offer cutting-edge possibilities that many artists say they can't ignore.

Still, Black said he understands there are ethical concerns.

“I'm definitely torn myself,” he said. “But I'm using it because it extends as far as my imagination can go.”

Other musicians have found that generative AI answers the demands of a streaming industry that pressures them to churn out new music, eye-catching lyric videos and other visual elements regularly. But some fan bases aren't sympathetic to those reasons.

Last year, Tears for Fears was slammed on social media after revealing the cover of the band's live album "Songs for a Nervous Planet," which had several familiar AI image traits.

The illustration shows an astronaut staring straight at the viewer, their face concealed under a space helmet. They're standing in the middle of a field of sunflowers that stretches into the distance. A cloudy blue sky imparts an otherworldly calm.

The cover's creator, Vitalie Burcovschi, described it as "art created by AI using human imagination." But fans were quick to accuse the band of using AI that might have scraped copyrighted work. As blowback intensified, the English duo released a statement calling it "a mixed media digital collage, with AI being just one of the many tools used."

Pop singer Kesha encountered similar flak for the cover of her 2024 single “Delusional," which featured a pile of Hermès Birkin bags with the song’s name spray-painted across them.

Fans instantly recognized common flaws of an AI-created image: misspellings in the song's title, sloppy digital fragments. Some demanded she redo the artwork with paid photographers.

It took months, but the singer replaced the image with a photograph of herself tied to a chair. She assured fans it was created with an "incredible team of humans.”

"AI is a Pandora's box that we as a society have collectively opened, and I think it's important that we keep human ramifications in mind as we learn how to use it as a tool and not as a replacement," she said in an Instagram post in May.

Illustrator and musician Keenan Gregory of the band Forester says he used AI technology to extend the background of an old photograph so it could fit on the cover of the band's upcoming EP.

The original image for "Young Guns" was taken in the 1940s as a vertical photograph and showed bass player Dylan Brulotte's grandfather strolling through the streets of Edmonton. Gregory needed a square shape for the album cover, so he put the shot into Photoshop's generative AI tool, which artificially extended the frame's left and right edges with more detail.

He removed certain background elements, like storefront signs, with a blend of traditional photo editing techniques.

"Typically, an artist would have to do that manually," he said. "But having AI provide you with options, which you then edit, is very powerful."

Gregory said he considers AI one of a photo editor's many tools, adding he didn't use it to make the cover for Royal Tusk's "Altruistic," which earlier this year won him a Juno Award for best album artwork.

Even when musicians are transparent about using AI, some fans are not ready to embrace it, as British Columbia rock band Unleash the Archers learned last year.

Vocalist Brittney Slayes said their concept album "Phantoma" told the story of an AI gaining sentience and escaping into the real world in the body of an android.

To explore the album's theme, Slayes said some of her songwriting drew inspiration from ChatGPT suggestions, while the band used visual AI programs to create inspiration images for songs.

She said the band also filmed a music video for "Green & Glass" and then fed the finished product into an AI model trained on artwork by Bo Bradshaw — the illustrator for the band's merchandise. It spat out an AI-animated version of the video.

"We paid to license all of his artwork ... so he was compensated and he was credited," she said.

But the reaction was swift. Some listeners accused the band of theft, alleging that despite paying for Bradshaw's work, the AI tool likely used other unlicensed art to fill out the visuals.

"We didn't realize that even though our model was trained after one artist, the program was going to fill in the blanks with others," Slayes said.

"People didn't care. The second the word 'AI' was used, we were targeted. You know, the usual Twitter uproar, being like scraped across the internet as these terrible people that use AI in their music."

Unleash the Archers responded on their socials, issuing a statement acknowledging they had unintentionally implied their video featured original artwork by Bradshaw when it was actually produced through an AI program without his direct involvement.

Their statement recognized how fraught the risks are for bands eager to explore new technology, saying that "while we were expecting some controversy, we weren’t expecting as much as we got."

Slayes said the backlash has forever sullied her connection to the album, which she originally intended as an exploration of an inevitable AI future. Instead, to her, it's become a reminder of how fast-developing AI technology is provoking deep-rooted anxieties.

"People are still afraid of it," she said. "And for good reason, because it is taking jobs."

She urges other artists to think carefully about how they introduce AI into their own projects:

"If you're going to use AI for your artwork, you've got to have a really good reason."

This report by The Canadian Press was first published June 10, 2025.

David Friend, The Canadian Press
