There are telltale signs that an image is generated by AI
Generative AI is getting better and better. Just this week, Adobe released Firefly 2, a huge improvement over its previous AI model, especially when it comes to facial representation.
This makes it more difficult than ever to know if an image is generated by AI. More difficult, but not impossible. Here are some things to look for if you’re trying to work out whether an image was created by AI.
Check image metadata
The most obvious telltale sign that an image is AI-generated is one you can’t see just by looking at the image: metadata.
Metadata is information attached to an image file that tells you details such as the camera used to take a photo, the image resolution, and any copyright information. Metadata also often betrays whether an image was created by AI.
Metadata often survives when an image is uploaded to the Internet. So if you download the image and inspect its metadata, you can often reveal the source of an image.
To view image metadata on Windows:
- Right-click on the image file and select Properties
- Click the Details tab in the window that opens
To view metadata on a Mac:
- Right-click the image file
- Select Get Info
In genuine photos, you should find details such as camera make and model, focal length and exposure time. On AI-generated images, this information will be absent.
To be clear, the absence of metadata does not necessarily mean that an image is AI-generated. Some photographers and websites remove metadata before sharing images. But if an image contains such information, you can be 99% sure that it is not AI-generated. Or, at least, not entirely AI-generated.
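If you prefer to script this check, the metadata inspection above can be sketched in a few lines of Python. This is a minimal example using the third-party Pillow library (installed with `pip install Pillow`); the file name `no_metadata.jpg` and the helper `summarize_exif` are illustrative, not standard names.

```python
# Sketch: inspect EXIF metadata with Pillow (assumes `pip install Pillow`).
# A genuine photo usually carries tags such as camera make, model, and
# capture date; an AI-generated (or stripped) image usually has none.
from PIL import Image
from PIL.ExifTags import TAGS

def summarize_exif(path):
    """Return a dict of human-readable EXIF tags; empty if none survive."""
    exif = Image.open(path).getexif()
    return {TAGS.get(tag_id, tag_id): value for tag_id, value in exif.items()}

# Demo with a freshly generated image, which carries no camera metadata:
Image.new("RGB", (8, 8)).save("no_metadata.jpg")
print(summarize_exif("no_metadata.jpg"))  # → {} (suspicious for a "photo")
```

An empty result is only a red flag, not proof: as noted above, some photographers and websites strip metadata deliberately.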
The name of the image file is another important clue. Images downloaded from Adobe Firefly will start with the word Firefly, for example. AI-generated images from Midjourney include the creator’s username and image prompt in the file name. Again, file names are easy to change, so this isn’t a foolproof way to determine whether an image is the work of AI.
Funny faces
Faces in a crowd are difficult for AI
Although generative AI is getting much better at rendering faces, they remain a weak spot, especially when there are a lot of faces in a single image.
At first glance, the above image of a football crowd looks like a real photo. However, even a relatively quick glance around the crowd reveals strangely distorted faces, like the ones highlighted below:
Distorted faces are easy to spot upon closer inspection
This same rule applies to AI-generated images that resemble paintings, sketches, or other forms of art: mutilated faces in a crowd are a telltale sign of AI involvement.
AI Artifacts
At first glance, the image seems convincing, but…
It’s not just faces that often go wrong in AI images, but other fine details as well. The woman’s face in the image above is actually quite convincing and, again, at first glance you might think it’s a real photo. But zoom in and you’ll see where other details went wrong.
Text is often poorly rendered by AI
The text on the books in the background is just a blurry mush, for example. Yes, this was designed to look like a photo with a shallow depth of field, but the text on these blue books should still be readable.
Look closely at the woman’s wrist: the bracelet/watch strap she is wearing is also deformed. It’s often when you zoom in closely and start inspecting the details that the AI’s involvement becomes evident.
Watermark traces
AI models are often trained on huge libraries of images, many of which are watermarked by photo agencies or photographers. Unlike us, AI models cannot easily distinguish a watermark from the main image. So when you ask an AI service to generate an image of, say, a sports car, it might put what looks like a truncated watermark on the image because it thinks that’s what should be there.
Fake watermarks normally appear at the bottom right of images. They are almost always unreadable, whereas a real watermark almost never is.