Apple Intelligence has been available for barely two weeks, and most of the features released so far in iOS 18 and MacOS Sequoia focus on text: generating summaries for notifications and email, helping to correct your writing and delivering smarter Siri responses. But one feature uses generative AI to work with images: the new Clean Up tool in the Photos app.
Clean Up analyzes an image, suggests items you’d likely want removed, such as people or vehicles in the background, and then fills in the deleted area. Sometimes the fix can be invisible to most viewers — and sometimes the results are laughably poor. After running many types of photos through the tool, I’ve come up with a few general guidelines to help you get the best cleaned-up images.
Surprisingly, Photos on the iPhone and iPad has never had a tool like Clean Up for removing small distractions. The Mac version does include a basic Retouch tool that can repair some areas, which is supplanted by Clean Up on compatible Macs.
It’s important to remember that Clean Up is a feature of Apple Intelligence, so you’ll see it only if you’re using a compatible device and you’ve been granted access to the Apple Intelligence beta. That includes iPhones running iOS 18.1, iPads with M-series processors (and the iPad mini with the A17 Pro chip) running iPadOS 18.1, and Macs with M-series processors running MacOS Sequoia 15.1.
For more on Apple Intelligence, see which features I think you’ll use the most and where its notifications need improvement.
How is Clean Up different from other retouching tools?
The repair tools in most photo editing apps work by copying nearby or similar pixels to fill in the space where you’re making a fix. They’re great for removing lens flares or dust spots against a sky, for example.
The Clean Up tool uses generative AI, which analyzes the entire scene and makes guesses about what should fill the area you’ve selected. If you want to remove a dog standing in front of a tree, for example, generative AI creates a replacement based on what it knows about tree texture and foliage in the background, and also takes into account the lighting level and direction in the photo.
The “generative” part of generative AI comes from the way it creates the image. The pixels that fill the area literally come from nothing: The software starts with a random pattern of dots and iterates quickly to create what it determines would appear in the same space.
Keep in mind, retouching tools that use generative AI are the ultimate YMMV, or “your mileage may vary.” I’ve gotten good results in difficult compositions and terrible results in areas I thought would be simple for the app to handle.
How to remove distractions using Apple’s Clean Up tool
The Clean Up tool takes two approaches to repairing photos. Using machine learning, it suggests possible removals, such as people or vehicles in the background. Or, you can drag over what you want to remove and direct Photos to work on that area. The process breaks down like this:
1. Open a photo and tap the Edit button. (On MacOS, click the button labeled Edit, or press the Return key.)
2. Tap Clean Up. The first time you use the tool, Photos needs to download Clean Up resources, which will take a minute or so depending on your Internet connection. Photos analyzes the image and highlights any potential items to be removed with a translucent shimmer.
3. To remove a suggested item, tap it. Or, draw a circle around any item that isn’t glowing.
4. Don’t be surprised if the area isn’t cleaned fully on the first attempt — you may need to draw over remaining areas to do more removal. If you’re not happy with a fix, tap the Undo button.
5. When finished, tap Done. As with all edits in Photos, you can revert to the original if you want to start over: Tap the More (…) button and choose Revert to Original.
An unexpected and cool feature: Safety Filter
Primarily you’ll use the Clean Up tool to get rid of distracting elements in a scene, but it has another trick available: You can hide the identity of someone in the photo.
Draw a circle around their face. You don’t have to fill it in — a general swipe will do the job. Photos applies a blocky mosaic pattern in place of the person’s face to obscure it.
Where you’ll see the most success with Clean Up
Some scenes and areas work better with Clean Up, so it’s good to know where to focus your efforts.
In my testing, I’ve found the most success in these general categories of fixes:
- Small distractions. Items such as litter on the ground or dust and threads on people’s clothing consistently turn out well.
- Background textures. Areas such as tree leaves, grass or stone can be replicated well.
- Lens flare. As long as it’s not too large, lens flare caused by light bouncing between camera lens elements can usually be removed cleanly.
- Bystanders or vehicles in the background that don’t occupy much area.
- Areas with sparse detail or background.
In general, when dragging around an area, make sure you grab any reflections or shadows cast by the item you want removed. Fortunately, Photos often picks up on those and includes them in its selection.
Areas to avoid when trying to use Clean Up
Some Clean Up targets are going to frustrate you when you try to remove them. For example:
- Very large areas. If the selection is too big, Photos either balks and asks you to mark a smaller area, or it makes a mess of the fill. It’s also inconsistent at coming up with what would plausibly appear in such a large space.
- Busy areas with clearly defined features. Tree leaves in the distance generally work well, but not so when there are recognizable structures or items. Removing a prominent leaf from a pile of leaves, or clearing out people from recognizable landmarks, for instance, doesn’t turn out well.
Where Clean Up needs more work
Remember, Clean Up and the other Apple Intelligence features are still technically in beta, even though they’re available to anyone with a compatible device who signs up for the beta program. (I have some thoughts about installing beta software in general.)
And while you can get some good results, there are still a few areas I’m looking forward to Apple improving in future releases. Namely, the quality of the replaced areas is spotty, sometimes looking no better than the output of non-AI repair tools. I would have expected Apple’s algorithms to do a better job of determining what’s in a scene and building replacement areas.
In terms of the user experience, if you don’t like what Clean Up offers for a removal, your only options are to undo or reset the edit. And if you undo, then try again, you get the same already-processed results. Adobe Lightroom, by contrast, offers three possibilities for every fix, with the option to generate another set if you don’t like what it came up with.
Clean Up — and other similar AI-based removal tools — also suffer from the expectations we project onto them: We’ve seen what they can do at their best, which raises the bar for what we think every edit should look like. When the tool gets confused and serves up a mess of disparate pixels, we expect it to do better. Maybe in the next releases.
For more on what Apple Intelligence brings to your Apple devices, get a peek at the visual intelligence feature.