AI: the Worst Enemy of Photography


I think what it will come down to is a compromise between a system that is truly difficult to spoof or crack and one that can function with agility across multiple platforms and many different kinds of standalone devices. You could make an extremely difficult-to-beat system, but I'm not so sure you can do so in a way that works within the practical realities working photographers face.
i think it's going to be pretty transparent, but with hardware limitations. from a practical perspective, your camera is going to have a cert, and it's going to mix that with the image and image metadata to produce the needed authentication signature. this should be transparent to the user, but it will require hardware that can compute the fingerprint fast enough not to interfere with performance. that means it's probably only going to be available on high-end cameras with the horsepower to do it, until the technology trickles down.
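a rough sketch of that in-camera signing step, as a toy model only: the key, function names, and use of HMAC here are my stand-ins, since a real scheme such as C2PA/CAI uses public-key certificates rather than a shared secret.

```python
import hashlib
import hmac

# toy model of in-camera signing: the camera mixes its secret with the
# image bytes and the metadata to produce one authentication signature.
# real systems use public-key certificates; the key below is made up.
DEVICE_SECRET = b"per-camera-secret-provisioned-at-factory"

def sign_capture(image_bytes: bytes, metadata: dict) -> str:
    # serialize metadata deterministically so verification is repeatable
    meta = "|".join(f"{k}={metadata[k]}" for k in sorted(metadata)).encode()
    return hmac.new(DEVICE_SECRET, image_bytes + meta, hashlib.sha256).hexdigest()

def verify_capture(image_bytes: bytes, metadata: dict, sig: str) -> bool:
    return hmac.compare_digest(sign_capture(image_bytes, metadata), sig)
```

any change to the pixels or the metadata after signing makes `verify_capture` return False, which is exactly the property the camera's fingerprint is meant to provide.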

The other issue is this: if this is going to work, there will have to be a standard for how these authentication keys (or whatever they are) work. But if there's a standard, then anyone with sufficient coding skills could just write software to create keys that follow that standard.
yes, it's going to be a standard. this is where the CAI (Content Authenticity Initiative) comes in. and in general, transparency only helps cryptographic integrity; most strong encryption technologies are based on open standards.

Another issue here is with social media sites and the way they handle images. Facebook, Instagram, presumably Twitter (or X, or whatever it is now), etc. don't just post the original files people upload; they process them into altogether different files, stripping the metadata along the way. This is probably true of most media sites as well. They could in theory preserve the metadata, but that gets into all sorts of questions about whether any functional authentication system would be able to allow "transfers" of the authentication.
if you look at how Adobe has talked about the CAI, the process should apply to sites like social media companies. my understanding (and admittedly, i've only glanced at news releases on how it works) is that there's basically a chain of custody, sort of like:

original photo (fingerprint1) ----> lightroom made this change (fingerprint2) ----> photoshop made this change (fingerprint3)

and thus, when you look at fingerprint3, you can trace it back to photoshop modifying fingerprint2.

so there is no reason it can't be:

original photo (fingerprint1) ----> lightroom made this change (fingerprint2) ----> photoshop made this change (fingerprint3) ---> fb resized (fingerprint4)

of course, this would require them to do it, but i do think it can happen. i don't know if folks remember, but fb used to strip the copyright metadata, and they caught a lot of flak for that; they no longer do it.
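the chain above can be sketched as a simple hash chain, where each step's fingerprint commits to the previous fingerprint plus a description of the change. the actor names and the hashing scheme here are illustrative, not the actual CAI manifest format.

```python
import hashlib

def fingerprint(previous: str, actor: str, change: str) -> str:
    # each fingerprint commits to the entire history before it
    return hashlib.sha256(f"{previous}|{actor}|{change}".encode()).hexdigest()

fp1 = fingerprint("", "camera", "original photo")
fp2 = fingerprint(fp1, "lightroom", "exposure +0.3")
fp3 = fingerprint(fp2, "photoshop", "spot removal")
fp4 = fingerprint(fp3, "facebook", "resized for feed")
```

because fp4 depends on fp3, which depends on fp2, anyone replaying the recorded steps can confirm the chain, and altering any earlier step changes every fingerprint after it.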

ymmv.
 
this already is, and has been, a problem.

we start out and we compare ourselves to those who are crazy talented, and compare our every photo to their BEST photos.

then add on top of that any editing (let's anchor that to Ansel Adams, just so we don't think that's a "new" thing).

our monkey brain says "ooh, that's great, i want to do THAT". then we try to do that, perhaps not really internalizing that what we're comparing against is both the apex of accomplishment and also, perhaps... not real at all.
I'm concerned about the time when everyone's work looks like they're crazy talented because they're all using AI, not just a few.
 
certainly the bar keeps going up, and this will be another piece of that. that said, if we get a good chain of custody that shows where and how things have been modified, at least there's hope you'll know it (or some of it) was done with AI.
 

I don't know if I was clear about what I said regarding this being a standard, so let me try again.

The issue is that I can't see what stops someone from using software to generate and encode whatever fingerprints they want. The only way around this that I can think of would be some kind of hardware-level encryption, but of course that couldn't apply to Lightroom/Photoshop/etc., and it would also mean the system would only work with new cameras, meaning that for a fairly long time 90% of the cameras in use, even by professionals, wouldn't be able to generate the authentication.
 

There's the rub, though: how many people will actually care about this? Outside of journalism, probably not many.
 
they certainly could. however, the chain would look like:

original photo (unknown content) ----> lightroom made this change (fingerprint1) ----> photoshop made this change (fingerprint2)

so basically, anything with an "unknown"/unauthenticated piece in the chain makes the full result unauthenticated/unknown.
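that rule is easy to sketch: the chain is only as authenticated as its weakest link. a toy illustration, not real CAI verification logic:

```python
def chain_status(links):
    # links: list of (actor, verified) pairs in chain order.
    # one unverified link poisons the whole result.
    return "authenticated" if all(ok for _, ok in links) else "unauthenticated"

trusted = [("camera", True), ("lightroom", True), ("photoshop", True)]
tainted = [("unknown original", False), ("lightroom", True), ("photoshop", True)]
```

so a file whose origin can't be verified stays "unauthenticated" no matter how many legitimate, signed edits come after it.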
 
if we want truth, we're going to have to fight for it
When has photography ever been “truth”? It’s a reflection of an image captured through a lens and recorded on film or a sensor. That image is subject to lens characteristics, limitations of the recording medium, location of the photographer, and perspective of the viewer, among other things. It sometimes lacks context and nuance.

Photography is art. As such it reflects reality, more or less, but it is not the reality it attempts to capture.

I have no problem with photographers using technology to record a scene as accurately as possible or to capture an image and modify it to reflect imagination, emotion, or wonder.

The spectrum of choices available to a photographer can be used for reportage, education, entertainment, or personal fulfillment. Those choices can also be used to lie, mislead, harm, or destroy.

These choices are available in any artistic field of endeavor. Artists are free to make those choices. As artists, we can choose wisely or poorly. The benefit of wise choices can bring enlightenment. The consequences of poor choices can lead to devastation.

Such is life.
 
I think of "AI" in photography as two camps: image enhancement (lighting, sharpness, and so on, extending to Topaz's facial reconstruction feature, which removes wrinkles and improves appearance but stays consistent with the original image), and synthetic photography, where things that were not there are added, subtracted, or created entirely from other content. That helps me clarify where I am comfortable and where I am not.

I'm cool with image enhancement, as long as the image creator is clear on what they did. Not as much with synthetic photography, where we risk mistaking an image for something that didn't exist in the original photo (if an original photo was even used!). In the right context, synthetic is probably fine, but if it's not clear it's synthetic, it's very bad.
 
AI is here to stay and can be used for good or bad purposes. Like handguns: some see them as something evil intended for killing people, while others see them as tools, or as a way to pursue the hobbies of hunting or target shooting.

I was watching some YouTube videos recently, and there was a clip of a tractor with a flatbed carrying a very heavy dozer. The combo was coming around a curve somewhere in a rural area toward a bridge that had been washed out. The clip was all about how the tractor then pulled the loaded trailer through the actual river, with difficulty. It was a beautiful video, with nice green trees and grass all around, and it looked impressive. The next day there was a different clip with a different tractor and flatbed navigating the exact same scene, and the day after, yet another vehicle set in the same scene. That's what made me realize everything was fake.

My fear is that we'll be seeing more and more of this without the disclosure that the video is AI-generated. People with criminal intent can cause serious damage with this.
 
The problem I have with AI and its use in photography (especially wildlife and landscape) is that I really look for the “wow” factor in a photograph. I have no problem with “classic” photo editing efforts, including removing distracting items, playing with color and contrast, sharpening, cropping etc. I do these things all the time.

However, being a photographer, I know what goes into making a really great photograph. It is not just the final image that counts, but the photographer's knowledge, skill, expertise, and technical know-how, as well as the hardship it took to get the shot and the photographer's time and patience. I guess, when I look at a photograph and evaluate it, I not only take the photograph at face value (i.e. composition, lighting, subject matter, etc.), but I take into account all of those other things listed above (either consciously or subconsciously).

If I knew that the "photographer" was just sitting around in his or her pajamas in front of a computer, combining a bunch of different photographs into a single "amazing" composite, even if they were the photographer's own photos, let alone stock photos or someone else's, it would greatly diminish my appreciation for the photograph. To my way of thinking, the AI photograph is not only different in degree but different in kind.

It may be that as AI continues to improve, it will become impossible to distinguish the two, and that to me is a problem, because I am not only interested in the result but in the effort it took to get that result. In a sense, AI diminishes the "real" photographer's effort and allows someone who is good at photo manipulation to achieve the same result without the same level of knowledge, skill, and effort. I think those things have inherent value and worth, and AI diminishes that value, if not eliminating it from the equation entirely.

How to resolve this dilemma, if even possible, is above my pay-grade.
 

Why does the original have to be unknown? If it's an open standard, what prevents someone from generating a Nikon fingerprint? Heck, even if it weren't open it wouldn't be that difficult to generate a Nikon fingerprint.

Back in the 1980s Nintendo tried to make sure that all game developers paid a licensing fee by having the Nintendo Entertainment System check for a specific hardware signature from the game cartridge. This signature was cryptographically provided by a chip that game developers had to purchase from Nintendo for each game they manufactured. It wasn't long before some developers just took a copy of the chip and reverse engineered it so they could make their own chips to convince the NES that their cartridges were legitimate.

This isn't exactly the same thing, but similar principles apply, and the game industry is a pretty good example of just how futile this sort of thing is. Over the past 40 or so years, the game industry has tried every sort of cryptography and genuinely ingenious, hardware-locked-down way to prevent people from simply copying its games, and in almost every case it has been beaten fairly quickly. In the case of photo authentication it would be worse, since it wouldn't only be people looking to play a free video game working on this, but all the resources of intelligence agencies and political powers worldwide. At a minimum, I can't imagine what would prevent someone from taking an authentic Nikon or Canon camera and using it to "inject" fingerprints into whatever image they want.
 
Understand that, but the general population doesn't. They just like stuff because it looks cool or pretty. You can find them all over art shows and/or Facebook lapping up the fauxtography shown there.

I envision a future where sports & wildlife are shot on video and specific frames are pulled to display and print.

Landscapes, fashion and portraiture will largely be photography-based, computer-enhanced art. It's the logical conclusion to the trends that seem to be accelerating today.

You can make the argument that we're already there on several fronts.

Those who choose not to participate will be reduced to a niche group; much like large format film photographers today.

I'm ok with that, because my pleasure is in the pursuit and process of creating the image, but am frustrated by the prospect of my work not being as effective in generating a visceral response because the viewers' minds have been cauterized by an endless stream of "perfect" images.
 
Perhaps I am being redundant, because some time ago I posted a couple of threads here about my dislike – or rather, my rejection – of AI applied to photography.
I remember that the majority of comments I received in response were critical and maintained that the advancement of technology should be accepted and applauded.
In those previous threads I referred to several examples of prizes being awarded to photographs that turned out to have been altered, or created outright, with this technology. Now, with the false and altered photos of British royalty, the issue has taken on global relevance, which is why photographers must take note.
It would be interesting to read what the defenders of AI think now. For me it is the worst enemy of photography, one of the biggest traps of our times: we will never know what is true and what is a lie.
Kind regards to all.
Oh come on, AI opens so many possibilities. Just think back to all the livery operators who opposed cars.

[attached image: IMG_1190.jpeg]
 
because they'd need the secret part of nikon's key to produce an item with a nikon signature. it doesn't mean those keys won't get stolen; that kind of thing does happen. but the process should make it pretty hard to simply say you're nikon and have that work.

of course, you could say you're NIK0N. yeah, stuff in the margins will happen.
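a toy public-key example of why the secret part matters (textbook RSA with tiny, insecure numbers; real signatures use proper key sizes and padding): signing needs the private exponent d, but anyone can verify with just the public pair (n, e), and without d a forger can't produce a signature that checks out.

```python
import hashlib

# tiny textbook RSA parameters, purely illustrative
p, q = 61, 53
n = p * q        # 3233, public modulus
e = 17           # public exponent
d = 413          # private exponent: (e * d) % lcm(p - 1, q - 1) == 1

def digest(msg: bytes) -> int:
    # reduce the hash into the signing group
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    # only the key holder (nikon, in the analogy) knows d
    return pow(digest(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    # anyone can check with just (n, e)
    return pow(sig, e, n) == digest(msg)
```

since the RSA map is a bijection, exactly one signature verifies for a given message, so tampering with the signature (or not knowing d in the first place) makes verification fail.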
 
To give a sense of how the general public thinks about photography, a brief story. I am a high school teacher gradually working on doing some photography on the side. One day a couple of months ago, I was shaving and, as a minor prank on the family, kept a moustache (I don't normally have one), and was inspired to try a "gritty" portrait, something I'd not yet attempted. So, I took this photo.
[attached image: NZ8_9922-Edit.jpg]


Other than setting up the lighting, it's really just a black and white shot with the texture slider pushed up. I shared it on my personal social media, again as part of the joke, and eventually one of my students saw it because his mother is friends with my wife. He asked me to show the class, which I did.

They kept asking what filter I used. I tried to explain that it wasn't a filter, just a black and white photo, and this eventually led to me trying to explain how digital camera sensors work, because they just didn't seem to be getting it. After 20 minutes, I still think they could only conceive of it as a "filter" on Instagram or the iPhone. To many of these kids, even basic photography is essentially some kind of vaguely AI-powered, malleable digital "painting" that becomes whatever you want the computer to make of it. I think most of them would be very, very unlikely to conceptually distinguish between a real photo and one generated entirely by AI.
 
absolutely. if you want a pretty photograph of a dog being pretty, ai is probably going to do that better, easier and cheaper than i can.

but if you want a picture of YOUR dog, i can give you a real picture of your dog, doing things.

if you want a picture of something that really happened, i can give you a picture of that thing.

this really happened. the owner of this dog can look at this photo and be proud of what their dog did. people at that event may want to see that photo of the dog that did that thing. people reading the news may want to see that photo of the dog that did a thing.

ai can make a photo just like it. but the owner doesn't want that photo because it's not of their dog doing that thing. conversely, people who just want a pretty photo of a dog won't care about my photo because they don't care to have a photo of a thing that happened.

 


This is why I think that wedding and event photography will, at least for a much longer time, be significantly less impacted than some other categories like wildlife and portraiture.
 
sorry, i'm thinking more about journalism, news, and people's perception of things that happen.

you're right that photography from an artistic perspective, and art in general, is a lot more abstract and fuzzy.
Even in the realm of reporting and journalism, photography doesn’t always report the full story. Even so, I’m not in favor of AI in that genre, but I’m willing to consider how it might be responsibly used. I’m also aware that it will be misused, too.
 
Actually, the camera companies are involved in the consortium that is developing this. They can be very specific about what takes place in the camera, the changes made, and even the types of changes. But it's very new and evolving, so I would not expect too much immediately; it will take time. I could see the wire services simply not accepting images without authentication that they are original images with no alterations beyond basic exposure and cropping.
 

The point is that regardless of what information the authentication can provide, there is no way to prevent authentications from being generated and encoded into whatever images someone wants. Within a few weeks of the system being fully implemented, anyone willing to learn will be able to take whatever 20-year-old JPEG they want and make it say it was just taken on a brand-new Nikon Z11 or whatever. Taking something generated by Adobe AI, scrubbing it of its Adobe-generated authentication, and making it read as a straight-out-of-camera shot will take a little longer, but it will likely be possible within a year. Even if the authentication is encrypted into the image file in a very low-level "mix" rather than attached as a tag, there will be tools available fairly quickly to copy the visual data from a file and use it to generate a new file with whatever authentication one wants, sort of like a more elaborate version of taking a photo of a photo.
 
The problem started with digital imaging. Twenty years ago I could use a software application to alter images in ways that could not be detected by the viewer of a print. No one complains about a noise reduction app, but it is a form of image manipulation. Autofill in Photoshop is another tool that many photographers use without hesitation. Focus stacking, or removing people from a scene by overlaying multiple shots of it, may distort reality, but I don't find it a problem.

The core of the problem is a lack of critical thinking. A friend's mother was repeatedly scammed because she believed anything in print had to be real, and so would send money off to organizations without any thought. Digital images, and for that matter even analog photographs, are inherently divorced from reality: they capture only what the photographer chose to capture, and only for that instant in time. We see that now with the media censoring images of dead or dying children in Gaza so as not to upset their audience or cause problems for the governments involved.
 