Photo Fix Strategies Using the Apple Intelligence Clean Up Tool



We’ve had it for barely two weeks, and most of the Apple Intelligence features released so far in iOS 18 and MacOS Sequoia focus on text and email: generating summaries for notifications, helping to correct your writing and delivering smarter Siri responses. But one feature uses generative AI to work with images: the new Clean Up tool in the Photos app.

Clean Up analyzes an image, suggests items you’d likely want removed, such as people or vehicles in the background, and then fills in the deleted area. Sometimes the fix can be invisible to most viewers — and sometimes the results are laughably poor. After running many types of photos through the tool, I’ve come up with a few general guidelines to help you get the best cleaned-up images.

Two photos of a brick building along an uphill street. In the first, a series of vertical traffic posts are distracting. In the second, the posts have been removed.

The Clean Up tool can remove distractions.

Jeff Carlson/CNET

Surprisingly, Photos on the iPhone and iPad has never had a tool like Clean Up for removing small distractions. The Mac version does include a basic Retouch tool that can repair some areas, which is supplanted by Clean Up on compatible Macs.

It’s important to remember that Clean Up is a feature of Apple Intelligence, so you will see it only if you’re running a compatible device and you’ve been granted access to the Apple Intelligence beta. That includes the iPhone 15 Pro and iPhone 16 models running iOS 18.1, iPads with M-series processors (and the iPad mini with the A17 Pro chip) running iPadOS 18.1, and Macs with M-series processors running MacOS Sequoia 15.1.

For more on Apple Intelligence, see which features I think you’ll use the most and where its notifications need improvement.

How is Clean Up different from other retouching tools?

The repair tools in most photo editing apps work by copying nearby or similar pixels to fill in the space where you’re making a fix. They’re great for removing lens flares or dust spots against a sky, for example.
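That nearby-pixel approach can be sketched in a few lines of toy code. This is purely illustrative, with a hypothetical `heal_spot` helper working on a grayscale grid; real editors use far more sophisticated patch matching on full-color images.

```python
# Toy illustration of classic (non-AI) retouching: fill a masked "dust spot"
# by averaging the nearest known pixels, roughly the way heal tools do.

def heal_spot(image, mask):
    """Replace masked pixels with the average of their unmasked neighbors.

    image: 2D list of grayscale values; mask: 2D list of booleans
    (True = pixel to repair). Repeats passes until every masked pixel
    that borders known pixels has been filled in.
    """
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    todo = {(y, x) for y in range(h) for x in range(w) if mask[y][x]}
    while todo:
        filled = set()
        for y, x in todo:
            # Only copy from pixels whose values are already known.
            neighbors = [out[ny][nx]
                         for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1))
                         if 0 <= ny < h and 0 <= nx < w and (ny, nx) not in todo]
            if neighbors:
                out[y][x] = sum(neighbors) // len(neighbors)
                filled.add((y, x))
        if not filled:  # nothing left that borders known pixels
            break
        todo -= filled
    return out

# A flat sky (value 200) with a two-pixel dust spot (value 0):
sky = [[200, 200, 200],
       [200,   0,   0],
       [200, 200, 200]]
spot = [[False, False, False],
        [False, True,  True],
        [False, False, False]]
print(heal_spot(sky, spot))  # the spot is filled with the surrounding 200s
```

Because the fill is just copied from neighbors, this works beautifully on flat areas like skies and falls apart on anything with structure, which is exactly the gap generative tools try to close.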

The Clean Up tool uses generative AI, which analyzes the entire scene and makes guesses about what should fill the area you’ve selected. If you want to remove a dog standing in front of a tree, for example, generative AI creates a replacement based on what it knows about tree texture and foliage in the background, and also takes into account the lighting level and direction in the photo.


The “generative” part of generative AI comes from the way it creates the image. The pixels that fill the area literally come from nothing: The software starts with a random pattern of dots and iterates quickly to create what it determines would appear in the same space.
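The "random dots, then iterate" idea can be shown with a heavily simplified sketch. To be clear, this is not how Apple's model works; a real generative fill runs a learned neural network at each refinement step, while the toy "model" below just pulls each noisy pixel partway toward a plausible surrounding value.

```python
import random

# Grossly simplified sketch of generative fill: start from pure random
# noise and iteratively refine it toward something that fits the scene.
# The SURROUNDING value stands in for what a real model would infer from
# the rest of the photo (texture, lighting, and so on).

random.seed(0)
SURROUNDING = 180  # hypothetical fill value implied by neighboring pixels

# 1. Start from random dots (noise).
patch = [random.randint(0, 255) for _ in range(8)]

# 2. Iterate: each step nudges every pixel partway toward the prediction.
for step in range(20):
    patch = [round(0.7 * p + 0.3 * SURROUNDING) for p in patch]

# After enough steps, the noise has converged to a plausible fill.
print(patch)  # every value ends up within a point or two of 180
```

The takeaway is the process, not the math: nothing is copied from elsewhere in the photo, so the quality of the result depends entirely on how good the model's prediction is.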

Keep in mind, retouching tools that use generative AI are the ultimate YMMV, or “your mileage may vary.” I’ve gotten good results in difficult compositions and terrible results in areas I thought would be simple for the app to handle.


How to remove distractions using Apple’s Clean Up tool

The Clean Up tool takes two approaches to repairing photos. Using machine learning, it suggests possible removals, such as people or vehicles in the background. Or you can drag over what you want to remove and direct Photos to work on that area. The process breaks down like this:

1. Open a photo and tap the Edit button. (On MacOS, click the button labeled Edit, or press the Return key.)

2. Tap Clean Up. The first time you use the tool, Photos needs to download Clean Up resources, which will take a minute or so depending on your Internet connection. Photos analyzes the image and highlights any potential items to be removed with a translucent shimmer.

Two iPhone screenshots of a bearded man taking a selfie. In the background are pedestrians and cars. The figure at right shows the Photos app Clean Up interface with arrows marking highlighted items.

Open the Photos edit interface and tap Clean Up. Photos makes suggestions about what to remove.

Screenshot by Jeff Carlson/CNET

3. To remove a suggested item, tap it. Or, draw a circle around any item that isn’t glowing.

4. Don’t be surprised if the area isn’t cleaned fully on the first attempt — you may need to draw over remaining areas to do more removal. If you’re not happy with a fix, tap the Undo button.

Close up of removing people in the background behind a man taking a selfie. In the image at left the people are highlighted except one person's legs. At right, the same image with a selection made to clean up the legs.

If Clean Up doesn’t snag everything — note that the person’s legs in the image on the left are not highlighted — use the tool again to keep cleaning the area.

Screenshot by Jeff Carlson/CNET

5. When finished, tap Done. As with all edits in Photos, you can revert to the original if you want to start over: Tap the More (…) button and choose Revert to Original.

An unexpected and cool feature: Safety Filter

Primarily you’ll use the Clean Up tool to get rid of distracting elements in a scene, but it has another trick available: You can hide the identity of someone in the photo.

Draw a circle around their face. You don’t have to fill it in — a general swipe will do the job. Photos applies a blocky mosaic pattern in place of the person’s face to obscure it.
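The blocky mosaic itself is a simple effect to illustrate. The sketch below pixelates a small grayscale region by averaging 2x2 tiles; Apple doesn't document the Safety Filter's actual implementation, so treat this as a conceptual stand-in only.

```python
# Toy version of a mosaic/pixelation effect: replace each BLOCK x BLOCK
# tile of the selected region with its average value, producing the
# blocky pattern that obscures a face.

BLOCK = 2  # tile size; real mosaics use much larger blocks

def pixelate(region):
    """Average each BLOCK x BLOCK tile of a 2D grayscale region."""
    h, w = len(region), len(region[0])
    out = [row[:] for row in region]
    for ty in range(0, h, BLOCK):
        for tx in range(0, w, BLOCK):
            tile = [region[y][x]
                    for y in range(ty, min(ty + BLOCK, h))
                    for x in range(tx, min(tx + BLOCK, w))]
            avg = sum(tile) // len(tile)
            for y in range(ty, min(ty + BLOCK, h)):
                for x in range(tx, min(tx + BLOCK, w)):
                    out[y][x] = avg
    return out

face = [[10, 20, 30, 40],
        [50, 60, 70, 80],
        [15, 25, 35, 45],
        [55, 65, 75, 85]]
print(pixelate(face))  # four uniform 2x2 blocks
```

Note that averaging destroys detail within each tile, which is why a mosaic hides identity far more reliably than a light blur.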

Two photos of a man taking a selfie. At left is a circular selection around his face. At right, the face is replaced by a mosaic grid.

The Safety Filter is a clever use of the Clean Up tool.

Screenshot by Jeff Carlson/CNET

Where you’ll see the most success with Clean Up

Some scenes and areas work better with Clean Up, so it’s good to know where to focus your efforts. 

In my testing, I’ve found the most success in these general categories of fixes:

  • Small distractions. Items such as litter on the ground or dust and threads on people’s clothing consistently turn out well.
  • Background textures. Areas such as tree leaves, grass or stone can be replicated well.
  • Lens flare. As long as it’s not too large, lens flare caused by light bouncing between camera lens elements can be removed cleanly.
  • Bystanders or vehicles in the background that don’t occupy much area.
  • Areas with sparse detail or background.

Examples of Clean Up at work: removing a lens flare from a photo of a ship at port at sunset; painting out a bag next to two people sitting on giant pumpkins; removing an out of focus dog in the background behind a flower closeup.

Sometimes Clean Up works well — originals on top, edited versions on the bottom.

Jeff Carlson/CNET

In general, when dragging around an area, make sure you grab reflections or shadows cast by the item you want removed. Fortunately, Photos often picks up on those and will include them in its selection.

Three iPhone screens using Clean Up in the Photos app. A couple are taking a photo in front of a rainbow-painted rock wall in a Stockholm subway station. They're highlighted; the software makes its selection; they're removed.

Be sure to select shadows and reflections (left). Clean Up detects what should be removed based on the broad selection (middle). A little reflection is left over (right), but that can be cleaned with one more swipe of the tool.

Screenshot by Jeff Carlson/CNET

Areas to avoid when trying to use Clean Up

Some Clean Up targets are going to frustrate you when you try to remove them. For example:

  • Very large areas. If the selection is too big, Photos either balks and tells you to mark a smaller area, or it makes a mess of the fix. It is also inconsistent about coming up with what would plausibly appear in such a large space.
  • Busy areas with clearly defined features. Tree leaves in the distance generally work well, but not areas with recognizable structures or items. Removing a prominent leaf from a pile of leaves, or clearing people out of a shot of a recognizable landmark, for instance, doesn’t turn out well.

Two photos of a woman and child at an outdoor market. The child is sitting next to an orange traffic cone. The woman is standing facing away from the camera. In the image at right, attempting to remove the woman has resulted in a visual mess.

Removing large objects in the frame becomes a jumble of pixels.

Screenshot by Jeff Carlson/CNET

Where Clean Up needs more work

Remember, Clean Up and the other Apple Intelligence features are still technically in beta, even though they’re available to anyone with a compatible device who signs up for the beta program. (I have some thoughts about installing beta software in general.)

And while you can get some good results, there are still a few areas I’m looking forward to Apple improving in future releases. Namely, the quality of the replaced areas is spotty, sometimes looking no better than the output of non-AI repair tools. I would have expected Apple’s algorithms to do a better job of determining what’s in a scene and building replacement areas.

In terms of the user experience, if you don’t like what Clean Up offers for a removal, your only options are to undo or reset the edit. And if you undo, then try again, you get the same already-processed results. Adobe Lightroom, by contrast, offers three possibilities for every fix, with the option to generate another set if you don’t like what it came up with.

Three screenshots of Lightroom removing a bag sitting next to a giant pumpkin. Each screen shows a different replacement option.

Lightroom (iPhone app shown here) gives you three options for a removed area.

Screenshot by Jeff Carlson/CNET

Clean Up and other similar AI-based removal tools also suffer from the expectations they’ve created: We’ve seen what the technology can do at its best, which raises the bar for what we think every edit should deliver. When the tool gets confused and serves up a mess of disparate pixels, we expect it to do better. Maybe in the next releases.

For more on what Apple Intelligence brings to your Apple devices, get a peek at the visual intelligence feature.
