At a demo, one game developer showed me a game his company built for the Spectacles. It tracks how far you walk and overlays a gamified grid on your surroundings. As you walk, you collect coins that add up over your route. RPG-style enemies pop up occasionally too, which you can then fight off with an AR sword that you wield by waving your hand around in real life. You have to hold the sword out directly in front of you to keep it within the confines of that narrow field of view, though, which means walking with a stiff, outstretched arm. The pitch is that you can play this game while walking, which seems to me like a good way to accidentally whack somebody else on the sidewalk or get hurt when you chase a coin into traffic.
Snap encourages wearers to avoid using AR that blocks their vision at times when they shouldn’t be distracted, and to pay attention to their surroundings. But the Spectacles currently have no safeguards that flash a pop-up warning when something is in the way, or that stop people from using the glasses while driving or operating heavy machinery.
People have been grievously injured while distractedly playing Pokémon Go, but Snap says this is a different use case. Holding your phone directly in front of you to catch a rare Snorlax is a problem because then you’re blocking your vision with a device. The Spectacles let you see the real world at all times, even through the augmented images in front of you. That said, I found that having a hologram in the middle of my vision can definitely be a distraction. When I tried out the walking game, my eyes focused more on the little cartoon collectibles floating around than the actual ground ahead of me.
This might not be a problem while the Specs are solely in the hands of a few developers. But Snap is moving quickly and wants to appeal to a wider array of buyers, likely in an effort to build up its tech before its rivals can run away with the AR prize.
After all, Meta’s AR efforts seem to be further along than Snap’s: lighter frames, more robust AI on the backend, and an ever-so-slightly less off-putting look. But there are some key differences in how the companies are trying to push their burgeoning tech forward. Meta’s Orion glasses are actually controlled by three devices: the glasses on your face, a gesture-sensing wristband, and a large puck (about the size of a portable charger) that does the bulk of the processing for the software features. Snap’s Spectacles, by contrast, pack everything into a single device. That means they are bigger and heavier than Meta’s glasses, but also that users won’t have to carry around extra pieces of equipment when the glasses finally make their way into the real world.
“We think it’s interesting that one of the biggest players in virtual reality agrees with us that the future is wearable, see-through, immersive AR,” Myers says. “Spectacles are quite different from the Orion prototype. They’re unique in that they are real immersive AR glasses that are available now, and Lens Studio developers are already building amazing experiences. Spectacles are completely standalone, with no extra puck or other devices required, and are built on a foundation of proven, commercialized technology that can be produced at scale.”
Snap’s goal is to make its Spectacles intuitive, easy to use, and easy to wear. It’s going to take a while to get them there, but they’re well on the way. All Snap has to do is shave off some weight. Maybe add some color. And keep people from wandering into traffic.