The Apple Vision Pro reveals the wearer's eyes on a front-facing display.
Apple

Digitally crafted alternative realities can be exciting or discomforting, depending on how you envision them. But Apple, among other companies invested in an AR- and VR-dominant future, clearly wants you to focus on the bright side. That's why Apple spent a substantial chunk of its WWDC 2023 developer conference highlighting the various features of the Apple Vision Pro, its extravagant new mixed reality headset.

As per Apple's surefooted narrative at the event, the Vision Pro ushers in a new era of computing, one that pushes beyond the boundaries of fixed displays. The promotional walkthroughs make an easy case that the headset is both visually and functionally unique in many ways.

From letting you expand your workflow across countless screens spread out in the space around you to immersing you in hyperrealistic nature getaways, the headset sports many fascinating features. And we certainly hope a few of them make their way to other devices, especially the iPhone.

Apple’s new FaceTime avatars look incredible

Someone scanning their face with Apple Vision Pro.
Apple

Apple claims the Vision Pro blends real and virtual scenery so seamlessly that the human mind can forget the boundary between the physical and virtual realms. It achieves this with virtual elements that do not simply float in your physical space but also interact with it, down to minute details such as the shadows those floating windows cast over real-world objects in your surroundings.

Another way Apple looks to amaze (as well as perplex) non-users is with virtual avatars in FaceTime. With the headset, Apple is hailing a new era of "spatial FaceTime," where video of each participant will appear in windows overlaid on the real environment of the Vision Pro wearer. But the people on the other side will not see your face strapped into the headset.

The Vision Pro will instead create artfully realistic, and at the same time weirdly unnerving, virtual representations of the user. These avatars appear in place of a live camera feed and are created using the headset's TrueDepth cameras. Notably, the TrueDepth camera is a specialized camera system originally introduced on the iPhone X. It comprises an infrared flood illuminator, a dot projector, and an infrared camera, which work together to build a 3D map of the user's face; on the iPhone, that same map is what unlocks the device through Face ID.
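If you're curious how that 3D face map surfaces to software, here's a minimal Swift sketch using ARKit's face-tracking API, the route third-party apps take to the TrueDepth camera on today's iPhones. Apple hasn't detailed the Vision Pro's internal avatar pipeline, so treat this as an illustration of the kind of data involved rather than Apple's actual implementation; the class name and logging are our own.

```swift
import ARKit

// Minimal sketch: reading the TrueDepth camera's live 3D face map via ARKit.
// Rendering and error handling are omitted for brevity.
class FaceTrackingSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on devices with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // The face geometry is a continuously updated 3D mesh of the
            // user's face, derived from the infrared dot pattern.
            let vertexCount = face.geometry.vertices.count

            // Blend shapes quantify expressions on a 0-to-1 scale, which is
            // exactly the kind of signal an animated avatar needs.
            let smile = face.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("mesh vertices: \(vertexCount), smile: \(smile)")
        }
    }
}
```

Animoji, Memoji, and (presumably) the Vision Pro's personas all boil down to driving a rigged 3D model with signals like these.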

Virtual persona of a person in spatial FaceTime on Apple Vision Pro.
Apple

The avatar will replicate your facial expressions, eye movements, and gestures to appear more natural. This solves the problem of constantly holding your iPhone in your hand or facing your Mac's screen while on FaceTime, and it spares you from looking like the loony tech enthusiast who takes every call with the latest gadget strapped to their face.

For Apple, this is also a tangible way to attract more people to its nascent ideology of spatial computing, if for no other reason than to avoid the fear of missing out (FOMO). But besides convincing existing Apple users to spend $3,500 on a piece of emerging technology, there's another way Apple can make these hyperrealistic avatars feel more commonplace: by bringing them to the iPhone and iPad.

They shouldn’t be exclusive to Vision Pro

Spatial FaceTime on Apple Vision Pro headset
Apple

There are good reasons why virtual FaceTime avatars are suited to other Apple devices just as well as to the Vision Pro.

Firstly, bringing spatial FaceTime to all devices aligns with Apple's proposed new age of "spatial computing." Until that still-distant future when augmented reality takes over our interactions with computers, and especially replaces our beloved smartphones, Apple can use the opportunity to build familiarity with the technology.

Besides being in line with the goal of eliminating boundaries between the real and virtual worlds, offering a spatialized FaceTime experience sits squarely within Apple's ecosystem principle, where all devices (at least the top-tier ones) work similarly and in congruence with one another. Renowned analyst Ming-Chi Kuo also affirms the need for harmony between the Vision Pro and other devices in the Apple ecosystem, citing it as critical to the headset's initial success.

Furthermore, to drive a converging software experience, Apple will likely bolster the hardware on its other devices, starting with the iPhone; the iPad (at least the Pro variant) should naturally follow suit. That makes the iPhone and the iPad Pro worthy candidates to share experiences with the Vision Pro, and spatial FaceTime tops our list of expectations.

Apple already has a few tricks up its sleeve

Apple Animoji and Memoji on an iPhone.
Digital Trends

Apple has already laid the groundwork for this transition to realistic avatars, putting years of work into creating and improving Animoji. These large animated emojis on Apple devices mimic a person’s facial expressions and head movements.

If you're unfamiliar, Apple also offers a specialized version of Animoji, called Memoji, which creates a 3D caricature of the iPhone user's face. That caricature can then be used for personalized emojis, as animated reactions within iMessage, or even (drumroll) within FaceTime to veil your face.

It's not at all surprising when you consider that Memoji is the harbinger of the Vision Pro's animated avatars. Like the avatar on the Vision Pro, Memoji uses the TrueDepth camera on the iPhone or iPad Pro to track the movements of your face and head.

Besides Face ID and Animoji, Apple uses the TrueDepth camera for additional camera features, such as Portrait Lighting effects on the iPhone and the Center Stage feature on the iPad Pro. It also lets third-party developers pull depth data from the TrueDepth sensors into their applications through an API. It only makes sense for Apple to build on these existing features to support an enhanced, Vision Pro-style FaceTime experience.
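For the curious, here's roughly what that looks like in practice: a short Swift sketch streaming per-pixel depth from the TrueDepth camera through AVFoundation's documented depth-capture classes. The class name is hypothetical, and the usual permission prompts and error handling are left out for brevity.

```swift
import AVFoundation
import CoreVideo

// Minimal sketch: streaming per-pixel depth frames from the TrueDepth camera.
class DepthStreamSketch: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The front-facing TrueDepth camera is exposed as a capture device.
        guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video, position: .front)
        else { return }

        let input = try AVCaptureDeviceInput(device: device)
        session.sessionPreset = .vga640x480
        guard session.canAddInput(input), session.canAddOutput(depthOutput) else { return }
        session.addInput(input)
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: .main)
        session.startRunning()
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Each frame carries a depth map: per-pixel distances that apps use
        // for effects like background blur or 3D face capture.
        let map = depthData.depthDataMap
        print("depth frame: \(CVPixelBufferGetWidth(map)) x \(CVPixelBufferGetHeight(map))")
    }
}
```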

Also in favor of our argument is the fact that Apple hasn't updated the visual appearance of Animoji in recent years. The last notable Animoji update, around the iPhone 14 launch, only added new animations and a few more facial expressions, leaving plenty of room for improvement.

The iPhone and iPad are ready

iPhone 14 Pro Max in hand.
Prakhar Khanna / Digital Trends

Besides the TrueDepth camera, another point in the idea's favor is the hardware inside Apple's mobile devices. Recent iPhone and iPad Pro models are driven by the same (or similarly powerful) hardware as the Vision Pro headset, which packs Apple's custom M2 silicon. The same chip powers the MacBook Air, the 13-inch MacBook Pro, and the 2022 iPad Pro.

Similarly, the A16 Bionic chipset that drives the iPhone 14 Pro and iPhone 14 Pro Max is closely related to the M2, though it offers less processing power owing to a smaller CPU and GPU. Even so, the iPhone's chip is more than capable of rendering a 3D version of you that reenacts facial expressions and hand gestures in real time.

With the A16’s successor, Apple is expected to make structural improvements to the chipset by moving to a much more efficient 3-nanometer (3nm) manufacturing process. Based on a set of rumored benchmark results, the A17 Bionic could further close in on the M2.

Mobile devices are the gateway to AR

Two people playing an augmented reality game on Apple iPad.
Apple

Besides using the TrueDepth camera to create virtual personas for FaceTime, the iPad Pro and iPhone Pro models are well equipped to replicate the way the Vision Pro spreads each participant's FaceTime tile out in 3D space. The iPad Pro (2020 and newer) and the iPhone (12 Pro and newer) feature a lidar scanner alongside the rear cameras.

The lidar (short for Light Detection and Ranging) scanner accurately measures the distance from the iPhone or iPad Pro to physical objects by firing pulses of light and timing how long they take to bounce back. That makes it enormously useful for augmented reality (AR) applications.
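Developers can already read that lidar depth data through ARKit's scene-depth API on supported devices. Here's a minimal Swift sketch of the idea; the class name and logging are our own, and a real app would feed the depth map into its rendering pipeline instead of printing its size.

```swift
import ARKit
import CoreVideo

// Minimal sketch: reading the lidar scanner's per-pixel depth map via ARKit.
class LidarDepthSketch: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth requires a lidar-equipped iPhone or iPad Pro.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Each frame carries a depth map measured by the lidar, letting apps
        // anchor virtual objects (or FaceTime tiles) against real geometry.
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        print("lidar depth: \(CVPixelBufferGetWidth(depthMap)) x \(CVPixelBufferGetHeight(depthMap))")
    }
}
```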

At previous iPhone and iPad launches, Apple has dedicated elaborate demos to AR games and apps. Extending that same technology to Apple's own FaceTime would be the logical next step, and it makes perfect sense for spatial FaceTime to reach beyond the Vision Pro.

Apple might still disappoint

A man wears Apple Vision Pro.
Apple

Despite our wishful thinking, we may not see spatial FaceTime on other devices any time this year. Bringing the Vision Pro's FaceTime experience to other devices before the headset launches, which is slated for "early 2024," would dilute the excitement and novelty associated with it. And it would be entirely in character for Apple to keep these features exclusive to the Vision Pro for the sake of driving sales of the extravagant gadget.

If Apple does decide to bring spatial FaceTime to other devices, the earliest beneficiaries would be the 2024 iPad Pro or the iPhone 16. If that turns out to be true, spatial FaceTime on the iPhone 16 could also benefit from an improved TrueDepth system, which is speculated to be hidden under the display.

I want spatial FaceTime on my iPhone

A person wearing Apple's Vision Pro headset.
Apple

Shared experiences are at the core of Apple's philosophy, which makes us unreasonably hopeful that it will bring this key Vision Pro feature to the iPhone and other supported devices. While our expectation is partially fueled by the sheer desire to experience FaceTime in a novel and exciting way without shelling out $3,500, it is also backed by the fact that recent iPhone and iPad Pro models already have the hardware to make that dream a reality.

However, there's an equal, or probably higher, chance that this does not happen, just so Apple can compel more folks to share its vision (pun intended!) for a surreal, digitally woven future. After all, a single Vision Pro brings Apple roughly three times as much revenue as an iPhone or iPad Pro. Rumors also point to a relatively cheaper visionOS headset, but it will be far from an SE-style version that is significantly more affordable.

Perhaps the best course of action right now is to wait until a few months into 2024. By then, the Vision Pro will (hopefully) have launched, giving Apple a chance, and ample metrics, to assess whether the bleeding-edge headset can sell profitably as a standalone device or needs the support of its older, more familiar siblings.
