New Adobe Terms


For the past six months to a year, there have been 'agents' on social media asking users to justify why they haven't yet moved from LrC to LR Cloud. (The usual feature-set arguments ensued; they're not germane to this post.)

The first agents openly admitted they represented Adobe.
tbh, i think Adobe is just trying to figure out why they haven't had more uptake in the cloud stuff
 
They are sitting on a goldmine. Other AI companies scan the interweb. Cloud companies scan their clouds. Adobe is the only one that can scan private images edited with its software and tools (e.g. Generative Fill), as well as its Stock library.
 
After that first wave, it was people claiming to be independent, or not openly mentioning that they represented any entity, but who still seemed to use the same language and ask the same questions.

At the time, I thought, "Adobe is likely behind all of this; it's an attempt to sell more cloud storage," and ignored it, except for the nagging thought that LrC may one day go away (I don't store my content in the cloud, so if they drop LrC, I will drop Adobe ... probably for C1).

But NOW I'm starting to believe that these were possibly attempts to get more content into the cloud (from both recreational and commercial users) for reasons related to this thread. Suffice it to say that I'm concerned. And I will most certainly not "calm down", because I've spent decades in the corporate world and know how they think.

Chris

 
I won't try to explain in detail the terms just issued (then explained 24 hours later) by Adobe; my interpretation is that they are giving themselves the right to access, and scan for AI, everything that's on their clouds, including what we upload to Lightroom, Photoshop, Illustrator, etc., without our consent or ability to opt out. There has been a ton of reporting on the topic in the last 48 hours; Google it.

Practically speaking, I've been instructed by a couple of my agencies to stop using AI for retouching (incl. Generative Fill and that magic erase wand) AND to start shipping some assets to them on disk, not via (any) cloud. The reason for the former is the assumption that using these tools is essentially entrapment (my word), and that those photos are now fair game for AI scanning. The reason for the latter is that they are no longer confident that photos of products that haven't been released yet won't end up somewhere prior to official release.

Over-reaction, maybe. I used to perceive Adobe as "good guys," but that perception is quickly eroding.
That's why I decided a long time ago to never use the cloud. Yes, I have to buy hard drives but I'm okay with that.
 
While there are legitimate reasons for them to access our files, wherever they are, the terms are clearly worded to allow them to use our images to train Firefly. It may not matter to many; my agencies care and are taking steps to minimize this (legal) incursion.
Exactly, and it will get sorted out. If anyone thinks that Adobe's attorneys would let them post this when they are doing it anyway... well...

I certainly don't, having worked in that world. This is as clear as it gets.


  • Adobe does not train Firefly Gen AI models on customer content. Firefly generative AI models are trained on a dataset of licensed content, such as Adobe Stock, and public domain content where copyright has expired. Read more here: https://helpx.adobe.com/firefly/faq.html#training-data
  • Adobe will never assume ownership of a customer's work. Adobe hosts content to enable customers to use our applications and services. Customers own their content and Adobe does not assume any ownership of customer work.
 
Regardless of which version you're on, some of the new tools require an internet connection and are actually processed on Adobe's servers, which requires scanning for the task to be completed. I don't see anything precluding their harvesting of these images.
True. I actually haven't used those tools either, not because I'm dead set against them (I use some of the non-server-reliant AI tools), but because I haven't needed them, and I also hear that the returned results are potentially lower resolution (this last point may not always be true, but I haven't yet found the opportunity to test it).

I have pushed back against some of the more purist folks here regarding image modifications and composites, but on this topic (AI generation) I seem to stand in a safe place with them.

However, I still stand in solidarity with my fellow artists with respect to protecting our property.

Chris
 
Speaking from first-hand observation: if it doesn't say they won't, they will. And even if it does say they won't, they still will.
i don’t think so. adobe has a vested interest in preserving their advantage of having properly licensed content to train their ai. polluting it with unlicensed content at this point would throw away a competitive advantage when everyone else starts getting hit with class action lawsuits
 
i haven’t really looked at it, but i kinda wonder if existing cloud enabled products already had these terms. i’ve wondered why the Content Credentials feature isn’t in LRc, and terms may have been a reason
 
I guess I am just cynical. Someone (Nimi) said above that they are sitting on a gold mine, and I agree. It doesn't need to be a corporate decision. All it takes is one person with the right permissions to seize the opportunity for short-term personal gain.

Their wording is vague; their internal checks and balances are... oh yeah, what are they?
 
Sure, thieves abound everywhere, but they also get caught or end up dead on NCIS.

As for Adobe, they are clear. Before people run to DxO, or Topaz, or whatever, see if you can get that kind of clarity there. @Nimi and others have raised good points, and I am confident the agencies and Adobe will get the clarification needed.

Where does Firefly get its data from?
The current Firefly generative AI models were trained on a dataset of licensed content, such as Adobe Stock, along with public domain content. As Firefly evolves, Adobe is exploring ways for creators to be able to train the machine learning model with their own assets so they can generate content that matches their unique style, branding, and design language without the influence of other creators’ content. Adobe will continue to listen to and work with the creative community to address future developments to the Firefly training models.
As an Adobe customer, is my content automatically used to train Firefly?
No, we don't train on any Creative Cloud subscribers’ personal content. For Adobe Stock contributors, the content is part of Firefly’s training dataset, in accordance with Stock Contributor license agreements.
 
but again, the gold mine isn’t that they handle your photos. the gold mine is they have a comprehensive, end to end strategy including properly licensed content, the technology, a big part of your tool chain, and content credentials to tie it all together.

remember everyone else is just stealing your ip and figuring they’ll be rich enough that in the end they’ll still be ahead or not be held accountable at all.
 

The name of the game for Adobe is having the best text-to-image and text-to-video models to incorporate into the creative suite. For that they need hundreds of billions of parameters. Like everyone else, they're scanning anything that moves, but they are far behind companies like OpenAI/Microsoft and Meta, and Amazon, Google, and Apple haven't really shown their hands. The key is going to be private data. For example, Meta is scanning WhatsApp (which it owns) for text, images, and videos. Adobe is sitting on a pile of private data, and they'll use it. I hear what they are saying, but I don't believe them; it's THAT lucrative, and it's 100% legal. The licensing agreement we all entered into gives them the right, and their "terms" aren't legally binding; plus, they reserve the right to change those. The legal framework isn't moving fast enough to address the scanning of copyrighted material, and no one is spending more money lobbying to maintain the status quo than the names above.

As @BarkingBeans Coffee says, it'll be settled at some point by lawyers and courts and Congress (good luck with China, which runs about half of the 60 LLMs out there). In the meantime, the AI companies are running the show, and while Adobe pretends to care, I don't believe they do.
 
Here is my take:

1. If you post online, your content has been scanned by large AI models. Perfectly legal, and there isn't a way to stop it.

2. Your scanned content is being used to create new content.

3. If you use AI tools that require an upload to the company's servers (e.g. Adobe Generative Fill), there is a very good chance those images are also being used to train models.

This may or may not matter to you.
Nimi, this appears to allow a user to opt out of allowing Adobe to use their cloud-based content for product improvement and development purposes. I'm not a contracts expert, however.

IMG_8372.jpeg

 

I'll try to find the article from April that points out all the loopholes built into this. It had to do with the definition of Creative Cloud, as well as that obvious last sentence. I think it goes back to my first post here; agencies are instructing their pros to eschew using the AI features of LR/PS AND PR.
 
One loophole is Adobe’s liability for storing illegal content (we know what that is). They will have to screen for such content and alert the appropriate law enforcement authorities when they find it.

I know that agency clients do not want their IP scanned, evaluated, and used by 3rd parties for any reason. For that reason my employer is developing its own AI tools (just for text and static images right now) using licensed technology. Those apps run only on secured company servers.
 
These are very confusing times. AI-generated content is also not copyrightable, and now they are worried that tools like Generative Fill render the content public domain. That's why we are not seeing AI-generated music, for example; there is no money in it for the publishers.
 
A point of confusion is that there is the actual, legal agreement - the End User License Agreement, which is full of precise legal language. That is what you are agreeing to.

The problem is that it's technical language (aka legalese) that is in large part incomprehensible to most users.

So the company puts out an "explainer" that tries to summarize what the legal language means for the end user. This is not part of the legal agreement you are accepting - the agreement is solely defined by the EULA. But the explainer is often oversimplified, or even wrong, and can be misleading; people nonetheless think the explainer is the agreement, when it's not.

E.g. in this thread, @Nimi is pointing to the actual agreement language - the EULA - and highlighting loopholes within it. OTOH this link:
https://helpx.adobe.com/manage-account/using/machine-learning-faq.html
is an explainer - it is not legally binding on Adobe. It's trying to inform the user, but it may or may not be the whole story. I think Adobe is trying, in good faith, to tell its customers what its intentions are, but the EULA doesn't appear to match those intentions as of yet. So I expect the EULA to be revised (ok, they are *always* revised, heh) to reflect those explainer statements. Hope I'm right!
 
Maybe - in my 34 years of tax controversy work, if a state explained its own language in an explainer, as you called it, I can't recall a time it could walk that back. I believe Adobe's explainer would be held against them: if they were caught training Firefly with customer images, they would lose. Why? Because in contract law your behavior counts in interpreting a contract, and the explainer is part of that behavior. And they have too much to lose by doing so. For now I continue to give them the benefit of the doubt. I believe they will clear things up with the agencies.
 
If Adobe is doing it, I'm sure C1 and others will follow.
And we dream of our cameras having a live connection…
Imagine Nikon offering special subject-detection tools while connected…
 
Joel, for several reasons I have no desire for an active cellular connection on my cameras while I'm photographing wildlife. I don't want to be interrupted by notifications or have the camera's performance slowed by uploads/downloads, and there are locations I visit where mobile communication is intermittent or nonexistent. I'm fine with the current paradigm of subject-detection intelligence provided via firmware updates.

I realize the type of functionality you mentioned is inevitable, however.
 