We’re running out of ways to tell you that Google is releasing more generative AI features, but that’s what’s happening in Google Photos today. The Big G is finally making good on its promise to add its market-leading Nano Banana image-editing model to the app. The model powers a couple of features, and it’s not just for Google’s Android platform. Nano Banana edits are also coming to the iOS version of the app.
Nano Banana started making waves when it appeared earlier this year as an unbranded demo. You simply feed the model an image and tell it what edits you want to see. Google said back in October that Nano Banana was destined for the Photos app, but the rollout is only now beginning. The Photos app already offered conversational editing via the "Help Me Edit" feature, but it was running an older, non-fruit model that produced inferior results. Nano Banana editing will still produce AI slop, yes, but it's better slop.
Nano Banana in Help Me Edit
Google says the updated Help Me Edit feature has access to your private face groups, so you can use names in your instructions. For example, you could type “Remove Riley’s sunglasses,” and Nano Banana will identify Riley in the photo (assuming you have a person of that name saved) and make the edit without further instructions. You can also ask for more fantastical edits in Help Me Edit, changing the style of the image from top to bottom.
A cropped image showing Raw TV’s poster for the Netflix documentary What Jennifer Did, which features a long front tooth that leads critics to believe it was AI-generated.
An executive producer of the Netflix hit What Jennifer Did has responded to accusations that the true crime documentary used AI images when depicting Jennifer Pan, a woman currently imprisoned in Canada for orchestrating a murder-for-hire scheme targeting her parents.
What Jennifer Did shot to the top spot in Netflix’s global top 10 when it debuted in early April, attracting swarms of true crime fans who wanted to know more about why Pan paid hitmen $10,000 to murder her parents. But the documentary quickly became a source of controversy as fans began noticing glaring flaws in images used in the film, from Pan’s weirdly mismatched earrings to her nose appearing to lack nostrils, the Daily Mail reported in a post showing a plethora of examples of images from the film.
Futurism was among the first to point out that these flawed images (around the 28-minute mark of the documentary) “have all the hallmarks of an AI-generated photo, down to mangled hands and fingers, misshapen facial features, morphed objects in the background, and a far-too-long front tooth.” The image with the long front tooth was even used in Netflix’s poster for the movie.
Because the movie’s credits do not disclose any use of AI, critics called out the documentary filmmakers for potentially embellishing a film that’s supposed to be based on real-life events.
But Jeremy Grimaldi—who is also the crime reporter who wrote a book on the case and provided the documentary with research and police footage—told the Toronto Star that the images were not AI-generated.
Grimaldi confirmed that all images of Pan used in the movie were real photos. He said that some of the images were edited, though — not to blur the lines between truth and fiction, but to protect the identity of the source of the images.
“Any filmmaker will use different tools, like Photoshop, in films,” Grimaldi told The Star. “The photos of Jennifer are real photos of her. The foreground is exactly her. The background has been anonymized to protect the source.”
While Grimaldi’s comments provide some assurance that the photos are edited versions of real photos of Pan, they are also vague enough to obscure whether AI was among the “different tools” used to edit the photos.
One photographer, Joe Foley, wrote in a post for Creative Bloq that he thought “documentary makers may have attempted to enhance old low-resolution images using AI-powered upscaling or photo restoration software to try to make them look clearer on a TV screen.”
“The problem is that even the best AI software can only take a poor-quality image so far, and such programs tend to over sharpen certain lines, resulting in strange artifacts,” Foley said.
Foley suggested that Netflix should have “at the very least” clarified that images had been altered “to avoid this kind of backlash,” noting that “any kind of manipulation of photos in a documentary is controversial because the whole point is to present things as they were.”