In iOS 18’s latest developer preview, Siri gets a glow-up. Like, the whole phone actually glows around the edges when you invoke Siri.

A splash screen reintroduces you to the virtual assistant once you enable Apple Intelligence, an early version of which is now available on the iPhone 15 Pro and Pro Max in a developer beta. You’ll know Siri is listening when the edges of the screen glow, making it pretty obvious that something different is going on.

The big Siri AI update is still months away. This version comes with meaningful improvements to language understanding, but future updates will add features like awareness of what’s on your screen and the ability to take action on your behalf. Meanwhile, the rest of the Apple Intelligence feature set previewed in this update feels like a party waiting for the guest of honor.

That said, Siri’s improvements in this update are useful. Tapping the bottom of the screen twice brings up a new way to interact with the assistant: through text. Siri is also much better at parsing natural language, waiting patiently through hesitations and “um”s as I stumble through questions, and it understands when I’m asking a follow-up question.

Double-tapping the bottom of the screen brings up a text box you can use to talk to Siri.

New Siri understands context in follow-up questions, like this one after I asked for the weather in Olympia.

Outside of Siri, it’s kind of an Easter egg hunt finding bits of Apple Intelligence sprinkled throughout the OS. They’re in the mail app, with a summarize button at the top of each email now. And anywhere you can type and highlight text, you’ll find a new option called “writing tools” with AI proofreading, writing suggestions, and summaries.

“Help me write something” is pretty standard fare for generative AI these days, and Apple Intelligence does it as well as anyone else. You can have it make your text more friendly, professional, or concise. You can also create summaries of text or synthesize it into bulleted lists of key points or a table.

I’m finding these tools most useful in the Notes app, where you can now add voice recordings. In iOS 18, voice recordings finally come with automatic transcriptions, which is not an Apple Intelligence feature since it also works on my iPhone 13 Mini. But Apple Intelligence will let you turn a recording transcript into a summary or a checklist. This is helpful if you want to just free-associate while recording a memo and list a bunch of things you need to pack for an upcoming trip; Apple Intelligence turns it into a list that actually makes sense.

Honestly, this transcript is pretty good.

Apple Intelligence turned my rambling list into a neat little table.

These writing tools are tucked out of the way, and if you weren’t looking for them, you might miss them entirely. The more obvious new AI features are in the mail app. Apple Intelligence surfaces what it deems to be important emails in a card that sits above the rest of your inbox marked as priority. Below that, emails show a brief summary in place of the first line or two of text that you’d normally see.

There’s something charming about AI’s sincere attempt to summarize promotional emails, trying to helpfully pull out bits of detail like “Backpacks and lunch boxes ship FREE” and “Organic white nectarines are sweet and juicy, in season now.” But the descriptions in my inbox were accurate — helpful in a few instances and harmless at worst. And the emails it gave priority status to were genuinely important, which is promising.

The search tool in the Photos app now uses AI to understand more complicated requests. You can ask for pictures of a particular person wearing glasses or all the food you ate in Iceland, all in natural language.

It’s very good. Results come back fast and are generally reliable. It found the photo I had in mind of my kid wearing a pair of goofy glasses, though it also surfaced photos in which he appeared and someone else was wearing some glasses. Still, I think it’s bound to be a feature that people immediately get used to and don’t think twice about using — intuitive and obviously useful.

But despite the light show, Siri is about the same as ever. It mostly remains a “let me Google that for you” machine. The most significant changes are still to come in future updates, when Siri will gain awareness of what’s on your screen and be able to take action in apps for you. In theory, you’ll be able to have Siri grab information from messages and turn it into calendar events, or retrieve details from email without you having to go digging through your inbox.

That’s the stuff I’m most excited about, and all of the pieces of Apple Intelligence available so far could be the building blocks of a better Siri. Apple’s AI is capable of understanding the contents of an email or a photo. Likewise, Siri is better at understanding how humans talk. In order for Apple Intelligence to prove itself, Siri needs to connect the dots.
