Apple Intelligence, in all its fullness, is emerging before our eyes, taking a big leap forward this week with the introduction of iOS 18.2. Across the iPhone landscape (at least those that can support Apple’s brand of artificial intelligence), millions are experiencing their iPhone’s ability to transform them into cartoon-like characters and, through Image Playground, create fantastic mashups and oddball scenarios.
I’m not immune to this allure. I used Image Playground to turn myself into a wizard, an image I shared with my unimpressed wife, who told me, “I don’t need this in my life.” Later, I created a tiny wizard, Genmoji Lance, to use as a message emoji. Perhaps she’ll like that better.
Sometimes, Apple’s commitment to its Apple Intelligence efforts appears almost tentative. Take the Genmoji beta, for instance: it’s hidden in Messages behind first the emoji icon and then an even tinier version of that icon, with a little “+” next to the “Describe an emoji” field. It’s a fun tool; why hide its light under a bushel?
After spending way too much time in Image Playground and with Genmoji, I turned my attention to Siri and remained unimpressed.
Apple’s late entry into the generative AI space means there’s more pressure on Apple to deliver something truly useful, and, if I’m being honest, Genmoji and Image Playground aren’t it. They’re fun, but what excited me most about Apple Intelligence when Apple first introduced it at WWDC 2024 was how your iPhone (as well as the Mac and iPad) might become more self-aware.
In a release on Apple Intelligence, Apple promised that Siri would be “able to take hundreds of new actions in and across Apple and third-party Apps.” Apple claimed that I’d be able to ask Siri to “Send the photos from the barbecue on Saturday to Malia,” and Siri would take care of it.
To be fair, Apple has noted that Apple Intelligence would roll out through late 2024 and into 2025. It’s an incredibly slow schedule in a space where competitors and partners like OpenAI are releasing massive updates almost daily (see 12 Days of OpenAI).
On the other hand, the lack of clarity about exactly what’s enabled in the current version of Siri is frustrating. Siri now looks completely different, and lovely, but it mostly amounts to a facelift: the underlying bone structure is essentially the same.
Siri still struggles to carry on conversations, and when I tried to replicate the photo request, asking Siri to send photos from a recent holiday party to my wife, Siri simply told me, “I can only send screenshots from here.” That’s a long way from being system-aware and truly helpful. At the very least, I’d expect Siri to guess which recent photos I was talking about.
In photos, when I asked Siri to “Open screenshot” because I now struggle to find that album in the redesigned Photos, Siri took a screenshot of the page. Thanks, Siri, for another screenshot I’ll struggle to find later.
There are many things Siri can now do with the system. I can switch to Dark Mode simply by asking Siri. The digital assistant is better at navigating my mumbles. I can open Home through Siri, but Siri can’t help me solve my Home woes, which include looking at my home network to see if there are other smart devices that are not part of the Home system.
Siri is still clever. I asked it what it was doing today, and it told me, “I’m pondering eternity, it’s taking forever.” But when I asked what that meant, Siri had no response; conversation is still not Siri’s strong suit.
Similarly, third-party app integration and on-screen action awareness aren’t really a thing yet. Siri can open Threads, for example, but when I asked it to compose a new post, Siri got stuck in a “to who?” loop. Since Threads is a public social media platform, the answer would be “to Everyone,” but I got nowhere with this request.
Some of Apple’s most powerful AI tricks aren’t even its own. iOS 18.2 brought a Visual Intelligence update to Camera Control (that new “button” on the side of your iPhone). You hard press Camera Control to launch a special image capture window. Press the shutter button, and you’re presented with two options: Ask or Search.
If you choose Ask, your query is sent to OpenAI’s ChatGPT, and if you choose Search it goes, yes, to Google. Both of these work well, but where’s Apple Intelligence in this picture? Where’s Siri? Apple can’t really claim the AI work of others as its own, right?
I understand that Apple Intelligence is a work in progress, but the shiniest object for any generative AI platform is still the chatbot. The voice assistants on platforms like Gemini and ChatGPT can carry on lengthy conversations, see what you’re seeing, understand the context, and take action.
Even after the iOS 18.2 update, Siri is miles away from that. I know how careful Apple likes to be with these things, but if it keeps checking the engine before mashing the gas pedal, the rest of the AI race cars will leave it far behind. You can’t win a race that’s already been run and won.