Apple's long-promised AI-powered photo editing suite is finally landing on iPhones running iOS 18.3 or later, and it's smarter than most people expected. The feature, internally called "ML Compose," uses on-device machine learning to let you select subjects, remove objects, re-light scenes, and even generate missing background elements, all without sending a single pixel to Apple's servers, at least on recent hardware. Privacy-first AI is the pitch, and for once, it actually delivers.
What It Actually Does
Let's be specific, because Apple has a history of vague AI branding. ML Compose is a suite of five tools: Clean Up (removes objects), Relight (adjusts lighting direction and intensity), Depth Edit (adjusts blur after the photo is taken), Style Transfer (applies a consistent aesthetic across a series of photos), and Inpaint (fills in missing or cropped areas with AI-generated content).
Each tool works locally on Apple's Neural Engine, the dedicated machine learning hardware built into the iPhone's chip, so nothing touches the cloud. That's meaningful because it means your photos never leave your phone. Google and Samsung have offered similar features, but Apple's emphasis on on-device processing is a genuine differentiator for privacy-conscious users.
The Experience
Using Clean Up feels almost too easy. Open a photo, tap the edit button, select Clean Up, and tap whatever you want removed: a photobomber, a stray trash can, a power line. The AI erases it and fills the space with something plausible. In testing, it handled complex backgrounds like tree foliage better than Google's Magic Eraser, though it occasionally struggled with reflective surfaces like windows or water.
Relight is more subtle. You can take a photo where someone's face is underexposed and relight it as if the light source came from a different direction. The results are natural rather than dramatic, but for anyone who's ever taken a photo indoors and wished the light hit differently, it's genuinely useful.
Why This Matters for Google and Samsung
Apple was late to AI photo editing. Google's Magic Eraser has been available since 2021, and Samsung's Object Eraser launched around the same time. But Apple's approach — heavy on-device processing, no cloud dependency, no data collection — gives it an edge with the segment of users who've grown suspicious of AI features that require internet connectivity or cloud processing.
More importantly, ML Compose signals that Apple is serious about embedding AI into everyday workflows, not just adding chatbot features. Photo editing is something virtually every smartphone user does. Making it meaningfully better, and private, is the kind of AI integration that actually changes behavior rather than just generating headlines.
The Catch
ML Compose requires a fairly recent chip: iPhone 12 and newer get the full feature set, while the iPhone 11 gets a limited version. If you have an older phone, you'll still get some features through iCloud Photos processing, though that means those edits do leave your device and the experience degrades quickly. So if you've been putting off an upgrade, this might be the feature that makes it feel worth it.
The rollout started this week and will reach all eligible devices by end of month. Check Settings → Photos → ML Compose to see if it's available on your device yet.