It’s in the new iPad Pro tablets, it’s rumored to be coming to the iPhone 12, and a form of it has been appearing in Android phones for the last few years. What exactly is LiDAR, and why would you want it in a mobile device? Here we’ll take you through what the tech does, all of the ways it can be deployed, and why it might soon become as common as cameras in smartphones.

If you’ve come across LiDAR before—that’s “light detection and ranging,” or “laser imaging, detection, and ranging,” or just “light and radar,” depending on your preference—it may well have been in connection with self-driving cars. It’s one of the key technologies currently being used to help vehicles work out where they’re going and what’s around them.

The history of LiDAR goes back much further than autonomous cars, though. The technology was properly born in the 1960s, originally intended to track satellites and military targets, and built on the same basic idea that’s behind the LiDAR technology of today: using light to track the position of objects.

By measuring how long light—specifically laser light—takes to hit something and come back again, the position of that object can be determined. That’s how the very first LiDAR systems worked, and it’s what’s happening on the latest tablets from Apple, too.
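
To make that concrete, here’s a minimal sketch in Swift of the underlying arithmetic, assuming an idealized single pulse and ignoring real-world noise and calibration:

```swift
// Time-of-flight ranging in a nutshell: a pulse travels to the object
// and back, so its distance is half the round trip at the speed of light.
let speedOfLight = 299_792_458.0 // meters per second

func distance(roundTripTime: Double) -> Double {
    speedOfLight * roundTripTime / 2.0
}

// A 20-nanosecond round trip puts the object about 3 meters away.
print(distance(roundTripTime: 20e-9)) // ≈ 3.0 meters
```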

The 1980s saw substantial improvements in LiDAR technology, and aircraft-mounted infrared laser systems became a common way to map out buildings and terrain. Those same techniques are in use today, capable of measuring everything from ocean depths to hidden Mayan settlements.

By registering not just the time that laser light takes to return, but also the angle that it’s reflected at, LiDAR data can be combined with other information to produce very accurate 3D maps. Professional LiDAR systems are often combined with GPS units, another technology that’s now commonplace in smartphones.
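
As an illustrative sketch (not any particular scanner’s math), combining one return’s range with the beam’s two pointing angles gives a 3D point via a standard spherical-to-Cartesian conversion; repeat that for millions of returns and you have a map:

```swift
import Foundation

// Hypothetical helper: convert one LiDAR return (range plus the beam's
// azimuth and elevation angles, in radians) into a 3D point. Real systems
// also fold in GPS position and sensor orientation on top of this step.
struct Point3D { let x, y, z: Double }

func point(range: Double, azimuth: Double, elevation: Double) -> Point3D {
    Point3D(
        x: range * cos(elevation) * cos(azimuth),
        y: range * cos(elevation) * sin(azimuth),
        z: range * sin(elevation)
    )
}
```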

LiDAR is also very useful for self-driving cars, as we’ve mentioned (though Elon Musk isn’t a fan). LiDAR scanners atop vehicles can “see” objects in real time, even those that you can’t see yourself, creating what’s known as a point cloud—a three-dimensional map of points in space that an autonomous car’s software can use to spot and avoid obstacles.

Certain LiDAR configurations are even sensitive enough to detect pollutants in the air, or capable of monitoring traffic flow at an airport. What all these applications have in common is that they rely on a delicate balance of data collection and data analysis, which is a useful framework to bear in mind when it comes to tablets and smartphones.

And so back to mobile devices, and to Apple, which is the first to use the word LiDAR to describe its newest depth-sensing hardware. iPhones and iPads have had depth sensors for years now (which is how they can blur backgrounds in photos taken in Portrait mode), but LiDAR takes the idea to a whole new level.

According to Apple, the LiDAR scanner inside the new iPad Pros can work on the level of individual photons of light, at a distance of up to five meters (over 16 feet), and at nanosecond speeds (so a scene can be captured in the blink of an eye). No doubt the A12Z Bionic chipset is doing a lot of heavy lifting too, in terms of interpreting the data—you’re unlikely to see LiDAR make it to any budget phones in the near future.
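
Those nanosecond figures check out with a little back-of-the-envelope math (ours, not Apple’s): at the five-meter maximum range, a pulse covers ten meters round trip.

```swift
// Illustrative sanity check: how long does a 10-meter round trip take?
let roundTrip = 10.0 / 299_792_458.0 // seconds
print(roundTrip * 1e9) // ≈ 33.4 nanoseconds, hence nanosecond-scale timing
```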

The biggest differences are going to be seen in one of Apple’s favorite fields, augmented reality. The Measure app that now comes with iOS and iPadOS, for example, is quicker, more accurate, and more granular once LiDAR is involved—you can use it as a serious measurement tool on the iPad Pro, not just a novelty that gives good approximations of length, depth and height.

LiDAR means that for the first time, an Apple device can map out an environment in detailed 3D, just like planes have been mapping out oceans and mountains for years. We’re not just talking Minecraft on top of a table; we’re talking Minecraft on the table, across the floor, and around your cat, without a glitch.
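
Developers opt into that room-scale meshing through ARKit’s scene reconstruction. A minimal sketch (app and view setup omitted):

```swift
import ARKit

// Scene reconstruction is only offered on LiDAR-equipped devices,
// so check for support before enabling it.
let session = ARSession()
let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}
session.run(config)
// ARKit then delivers ARMeshAnchor updates describing the table,
// the floor, and the cat-shaped bumps in between.
```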

When it comes to AR apps that can drop furniture right into your living room, those virtual objects will look much more like part of the existing space, bumping up against the real objects around them. Or take AR games, which will be able to feature characters appearing from behind corners and over the top of fences in a more realistic way than ever before.
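
In RealityKit, for instance, that realism largely comes down to a couple of scene-understanding flags (a sketch, assuming an ARView running a LiDAR-backed session like the one above):

```swift
import RealityKit

// With scene reconstruction available, RealityKit can hide virtual
// objects behind real ones and let them collide with the room.
let arView = ARView(frame: .zero)
arView.environment.sceneUnderstanding.options.insert(.occlusion) // real objects hide virtual ones
arView.environment.sceneUnderstanding.options.insert(.physics)   // virtual objects bump into the room
```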

Object placement will be more accurate and authentic-looking, real-time motion capture will be more comprehensive, and physical objects in the middle of scenes will be handled better by whatever AR app you’re using (the improvements that LiDAR brings, where available, are automatically rolled into Apple’s ARKit framework).

For now, the processing required for LiDAR and the sensors needed inside a phone are going to restrict the technology to top-end devices, but as with any mobile tech, it should get cheaper and more practical over time. If Apple sticks with the technology, expect the range and accuracy to get better with each passing year.

That’s not to say LiDAR on phones is an inevitability: Other depth-sensing technologies are available, and artificial intelligence will continue to get better at sensing depth using standard camera sensors. LiDAR can operate in all kinds of lighting conditions though, very quickly and very accurately, which is part of the reason Apple is betting on it.

But the form of LiDAR found in Apple’s latest iPad is very different from what many top-end Android phones are using. The depth sensor used by many Android phones is formally called a time-of-flight, or ToF, sensor, which is, for most intents and purposes, a form of LiDAR. Like LiDAR, a ToF sensor uses reflected light to gauge distances for camera effects and AR. But it’s a scannerless system, relying on a single pulse of light to map an entire space, while Apple is using scanning LiDAR, which uses multiple points of light to take these readings much more frequently and with greater accuracy.

Even if smartphones have lost the ability to wow us, 13 years on from the first iPhone, they can still get faster, smarter, and more capable, and LiDAR is proof of that.
