Several photographers have shared examples over the past few months, with Meta recently marking a photo that former White House photographer Pete Souza took of a basketball game as AI-generated. In another recent example, Meta incorrectly added the label to an Instagram photo of the Kolkata Knight Riders winning the Indian Premier League Cricket tournament. Interestingly, as with Souza’s photo, the label only shows up when viewing the image on mobile, not on the web.
Souza says he tried to uncheck the label but was unable to. He theorizes that using Adobe’s cropping tool and flattening images before saving them as JPEGs may be triggering Meta’s algorithm.
However, Meta has also incorrectly marked real photos as AI when photographers use generative AI tools like Adobe’s Generative Fill to remove even the smallest of objects, PetaPixel reports. The publication tested this out for itself, using Photoshop’s Generative Fill tool to remove a speck from an image, which Meta then marked as AI-generated on Instagram. Strangely, though, Meta didn’t add the “Made with AI” label when PetaPixel took the same file back into Photoshop, copied and pasted it into a blank document, and saved that version.
Multiple photographers have voiced frustration that photos with such minor edits are being unfairly labeled as AI-generated.
“If ‘retouched’ photos are ‘Made with AI’ then that term effectively has no meaning,” photographer Noah Kalina wrote on Threads. “They might as well auto tag every photograph ‘Not a True Representation of Reality’ if they are serious about protecting people.”
In a statement to The Verge, Meta spokesperson Kate McLaughlin said that the company is aware of the issue and is evaluating its approach “so that [its] labels reflect the amount of AI used in an image.”
“We rely on industry standard indicators that other companies include in content from their tools, so we’re actively working with these companies to improve the process so our labeling approach matches our intent,” added McLaughlin.
In February, Meta announced it would start adding “Made with AI” labels to photos uploaded across Facebook, Instagram, and Threads ahead of this year’s election season. Specifically, the company said it would add the label to AI-generated photos made with tools from Google, OpenAI, Microsoft, Adobe, Midjourney, and Shutterstock.
Meta hasn’t disclosed exactly what triggers the “Made with AI” label, but all of these companies either add, or are working on adding, metadata to image files to signify the use of AI tools, which is one way Meta identifies AI-generated photos. Adobe, for example, started embedding information about an image’s origins in its metadata with the release of its Content Credentials system last year.
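To give a rough sense of what that provenance metadata looks like in practice (this is not Meta’s actual detection logic, and the file name and marker strings below are assumptions for illustration), the following Python sketch scans a JPEG’s raw bytes for markers commonly embedded by such tools, like a C2PA/Content Credentials manifest or the IPTC digital source type values used for AI-generated or AI-edited media:

```python
# Hypothetical sketch: look for provenance markers that tools like Content
# Credentials embed in an image file. Illustrative only, not exhaustive.

from pathlib import Path

# Byte patterns associated with provenance metadata:
# - "c2pa" appears in Content Credentials (C2PA) manifests
# - "compositeWithTrainedAlgorithmicMedia" is the IPTC digital source type
#   for images partly edited with AI tools
# - "trainedAlgorithmicMedia" is the IPTC value for fully AI-generated media
MARKERS = [
    b"c2pa",
    b"compositeWithTrainedAlgorithmicMedia",
    b"trainedAlgorithmicMedia",
]


def find_provenance_markers(path: str) -> list[str]:
    """Return any known provenance markers found in the file's raw bytes."""
    data = Path(path).read_bytes()
    return [marker.decode() for marker in MARKERS if marker in data]


if __name__ == "__main__":
    hits = find_provenance_markers("photo.jpg")  # hypothetical file name
    if hits:
        print("Provenance metadata found:", ", ".join(hits))
    else:
        print("No AI-related provenance markers detected.")
```

A simple byte scan like this only shows whether such markers are present; platforms that read this metadata properly would parse and verify the embedded manifest rather than match strings.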