Adobe privacy crisis deepens as users threaten “Nightshade” attacks

"They think they can use anything you upload for AI training purposes? I'm filling up my entire 20 GB storage with poisoned images..."

Adobe has spoken out to address fears that it is training Generative AI on users’ IP and content. 

But its privacy update has failed to quell anger amongst the Photoshop maker’s userbase, with a number of customers threatening to launch virtual revenge attacks and others taking to Twitter to confirm they have either cancelled their accounts or are planning to do so. 

Last week, we reported that Adobe had joined Microsoft in facing accusations that its software behaves like spyware.

Since then, Adobe's stock price dropped from roughly $475 at the end of May to a low of $454 on Monday.

Adobe had asked users to accept updates to its Terms of Use, leading them to spot changes made in February 2024 that highlight its right to “access your content through both automated and manual methods.”

In a blog post, Adobe admitted that its decision to roll out a “re-acceptance” of its Terms of Use led “to concerns about what these terms are and what they mean to our customers.”

It is now speaking to customers - directly, in some cases - to tell them it has “never” trained generative AI on their content, taken ownership of their work, or facilitated access to content “beyond legal requirements.” 

It will release a further update on its terms by June 18, 2024. 

“You own your content,” Adobe wrote on a blog published on June 10. “Your content is yours and will never be used to train any generative AI tool. We will make it clear in the license grant section that any license granted to Adobe to operate its services will not supersede your ownership rights.”

Adobe clarified that its generative AI model Firefly is trained only on public domain content whose copyright has expired, or on licensed content such as imagery from Adobe Stock, its stock-photography marketplace.

It also reminded users that “you have a choice to not participate in our product improvement program."

“We may use usage data and content characteristics to improve your product experience and develop features like masking and background removal among others through techniques including machine learning (NOT generative AI),” it continued. “You always have the option of opting out of our desktop product improvement programs.”

The licences Adobe needs to operate its services will now include “plain English examples of what they mean and why they are required”, emphasising that “in no case do these license grants transfer ownership of your content to Adobe."

Adobe reminded users that it automatically scans content they upload to ensure it is not hosting any child sexual abuse material (CSAM).

“In a world where customers are anxious about how their data is used, and how generative AI models are trained, it is the responsibility of companies that host customer data and content to declare their policies not just publicly, but in their legally binding Terms of Use," Adobe wrote.

“Our updated Terms of Use, which we will be releasing next week, will be more precise, will be limited to only the activities we know we need to do now and in the immediate future, and will use more plain language and examples to help customers understand what they mean and why we have them.”

However, the privacy firestorm has not yet abated. Angry artists have threatened to fill up Adobe’s servers with images “poisoned” with Nightshade, a tool which breaks AI models.

Nightshade works by “shading” images down to the pixel level so they appear totally different to an AI while looking unchanged to a human. If shaded images are used to train a model, it will behave unpredictably by, for instance, showing an image of a handbag when asked to generate an image of a cow floating through space.
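Nightshade's actual method is far more sophisticated: it optimises each perturbation against a surrogate model so that a shaded image pushes a concept (say, “cow”) toward an unrelated one (say, “handbag”). As a rough illustration of just the underlying idea, that a change bounded per pixel can be imperceptible to humans while altering what a model ingests, here is a minimal Python sketch using a random, clipped perturbation. The function name and the `epsilon` bound are illustrative assumptions, not part of Nightshade itself.

```python
import numpy as np

def add_poison_perturbation(image, epsilon=8):
    """Add a small, bounded per-pixel perturbation to an 8-bit image.

    Conceptual sketch only: Nightshade optimises its perturbation
    against a surrogate model to shift a concept toward a target.
    Here we just demonstrate the bound: no pixel moves by more than
    `epsilon` intensity levels, so the change is hard to see.
    """
    rng = np.random.default_rng(0)
    # Random integer offsets in [-epsilon, +epsilon] for every channel.
    perturbation = rng.integers(-epsilon, epsilon + 1, size=image.shape)
    # Apply the offsets and clip back into the valid 0-255 range.
    poisoned = np.clip(image.astype(int) + perturbation, 0, 255)
    return poisoned.astype(np.uint8)

# A flat grey 8x8 "image": the poisoned copy looks nearly identical
# to a human, since every pixel stays within epsilon of the original.
original = np.full((8, 8, 3), 128, dtype=np.uint8)
poisoned = add_poison_perturbation(original)
max_shift = int(np.abs(poisoned.astype(int) - original.astype(int)).max())
print(max_shift)  # largest per-pixel change, at most epsilon
```

The point of the sketch is the constraint, not the attack: a real poisoning tool spends its tiny per-pixel budget in a targeted way rather than at random, which is what makes the trained model misbehave.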

Eduardo Valdés-Hevia, an artist with almost 200,000 X followers, tweeted: “I don't use Adobe's Creative Cloud because I like keeping my files locally.

“But seeing how they think they can use anything you upload there for AI training purposes, I'm filling up my entire 20 GB storage with random Nightshade-poisoned images.

“I encourage others [to] do the same!”

Alton Brown, an American television personality, was one of the prominent people who raised the prospect of closing down their Adobe account. 

On X, he told his 4.3 million followers: “My company is suspending use of all @Adobe products until our attorneys have had time to thoroughly review the user agreement changes.”

Many other people tweeted about trying to leave Adobe - only to be charged a cancellation fee.

The controversy has put the spotlight on potentially problematic aspects of Adobe's terms - some of which users may simply not have noticed before.

People were concerned about several aspects of the Adobe conditions, such as the line: “You grant us a non-exclusive, worldwide, royalty-free, sublicensable licence to use, reproduce, publicly display, distribute, modify, create derivative works based on, publicly perform and translate the content.” 

Sasha Yanshin, a YouTuber with a maths degree from the University of Oxford, told his 18.7K X followers: “I just cancelled my Adobe licence after many years as a customer.

“The new terms give Adobe ‘worldwide royalty-free licence to reproduce, display, distribute’ or do whatever they want with any content I produce using their software.

“This is beyond insane. No creator in their right mind can accept this.

“You pay a huge monthly subscription and they want to own your content and your entire business as well.

“Going to have to learn some new tools.”

A similar clause already existed in the terms. A Wayback Machine snapshot showed it was present as far back as three years ago, when Adobe confirmed: “When you upload Content to the Services or Software, you grant us a nonexclusive, worldwide, royalty-free, sublicensable and transferrable licence to use, reproduce, publicly display, distribute, modify (so as to better showcase your Content, for example), publicly perform and translate the Content.”

Get in touch with jasper@thestack.technology to let us know how you're responding to the Adobe privacy crisis.