iPhone 12 Pro – Survey device?

There is a buzz going around the geospatial underground: people are tweeting about the lidar in the new iPhone 12 Pro. What really escalated it all was Matterport releasing a new scanning app on the Apple App Store, followed by a few geospatial people suggesting they could scan cities and even map rooms using the lidar built into the phone. Crazy, right? It's a solid-state lidar.

This is where the lidar scanner is located

Reading about these claims got my hackles up, and I immediately set about some research to see if there was any evidence to back up these wild claims.

“NASA is using LiDAR technology for its next Mars landing. iPhone 12 Pro uses a LiDAR scanner to measure how long it takes light to reflect back from objects. So it can create a depth map of whatever space you’re in. Because it’s ultra-fast and accurate, AR apps can now transform a room into a realistic rainforest or show you how a new trainer will fit.” – quote from Apple's iPhone 12 Pro webpage.

Having read many articles, I cannot find any evidence of NASA using lidar in the same form as the iPhone 12 Pro, nor a reference to this form of lidar in any of its other work. The nearest I got was a paper written last year which mentions semiconductor lidar of the same type as the iPhone's, but in relation to orbital measurement (see page 274).

The scanner chip (Image courtesy of SystemPlus Consulting)

This encapsulates the problem: if you go looking for specifications on the iPhone 12 Pro, you will find no detail on the lidar sensor used; in fact, Apple makes it almost impossible to find anything. The only thing you will notice is that there is no Time of Flight (ToF) sensor of the kind found in many of the other smartphones on the market. This is where the intrigue really starts. You see, many modern smartphones, such as the Samsung S20 Ultra and the Huawei P40, use a “flash lidar”, whereby a burst of infrared light is sent out and the time of return gives the depth. You might argue that this isn't truly lidar.
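Whichever flavour of sensor is involved, the depth calculation underneath is the same round-trip timing. Here is a minimal sketch of the arithmetic; the pulse timing below is an illustrative value, not a reading from any real sensor.

```swift
import Foundation

/// Round-trip time-of-flight to depth: the pulse travels to the surface
/// and back, so the one-way distance is half the total path length.
func depthFromTimeOfFlight(roundTripSeconds: Double) -> Double {
    let speedOfLight = 299_792_458.0   // metres per second
    return speedOfLight * roundTripSeconds / 2.0
}

// Illustrative value only: a return after roughly 20 nanoseconds
// corresponds to a surface about 3 metres away.
let depth = depthFromTimeOfFlight(roundTripSeconds: 20e-9)
print(String(format: "Depth ≈ %.2f m", depth))   // ≈ 3.00 m
```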

The iPhone 12 Pro instead uses a CMOS Image Sensor (CIS) to provide something called “direct Time of Flight” (dToF). The laser pulses are spread into an array of 9×64 dots; the single-photon returns are read by the Near Infra-Red (NIR) CIS, and the return times give a depth map. At the time of writing, it is rumoured that the CIS is from Sony and the Vertical Cavity Surface Emitting Laser (VCSEL) is from Lumentum.
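Apple does not expose the raw dot returns to developers, but the fused per-pixel depth map the sensor produces can be read through ARKit's scene-depth frame semantics. A minimal sketch, with error handling and display omitted:

```swift
import ARKit
import CoreVideo

final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only offered on lidar-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else {
            print("Scene depth not supported on this device")
            return
        }
        let configuration = ARWorldTrackingConfiguration()
        configuration.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // depthMap is a CVPixelBuffer of per-pixel distances in metres,
        // fused from the sparse lidar returns and the camera image.
        guard let depth = frame.sceneDepth else { return }
        let width = CVPixelBufferGetWidth(depth.depthMap)
        let height = CVPixelBufferGetHeight(depth.depthMap)
        print("Depth map: \(width) x \(height) pixels")
    }
}
```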

VCSEL unit from Lumentum (Image courtesy of SystemPlus Consulting)

At present, although Apple is shouting about having lidar in its phone, the company is very coy about its capability. Apart from saying that it helps enhance image focus in low light, it says little more.

Where this is aimed is quite obvious: a slew of Augmented Reality apps is already flooding the App Store. With this new lidar capability, occlusion (where virtual objects hide behind real ones) and the matching of augmented objects to surfaces can be calculated far more precisely, even with the few points of reference provided. This makes for an exciting time for the geospatial expert, as Google has already released an augmented function within its own Android Auto, and rumour has it that both Samsung and Google are looking to employ this kind of lidar in their own camera systems.
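For developers, the occlusion and surface matching described above are exposed through ARKit's scene reconstruction and RealityKit's scene-understanding options. A minimal sketch, assuming you already have an ARView on screen:

```swift
import ARKit
import RealityKit

/// Minimal sketch: turn on lidar-backed meshing and occlusion for an ARView.
func enableSceneUnderstanding(for arView: ARView) {
    arView.automaticallyConfigureSession = false
    let configuration = ARWorldTrackingConfiguration()

    // Lidar-equipped devices can build a rough triangle mesh of the room.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    // Let virtual content be hidden behind, and collide with, real surfaces.
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)

    arView.session.run(configuration)
}
```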

It certainly opens the door to more realistic and convincing AR, especially when combined with the higher-accuracy GNSS units being incorporated into new smartphones.
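Apple's own pairing of GNSS with the camera system is ARKit's geo-tracking, which anchors content to real-world coordinates where the service is available. A minimal availability check, again only as a sketch (location permission must already be granted):

```swift
import ARKit

// Geo-tracking fuses GNSS, the compass and the camera feed, and is only
// offered on supported hardware in supported regions.
func startGeoTrackingIfAvailable(in session: ARSession) {
    guard ARGeoTrackingConfiguration.isSupported else {
        print("Geo tracking not supported on this device")
        return
    }
    ARGeoTrackingConfiguration.checkAvailability { available, error in
        guard available else {
            print("Geo tracking unavailable here: \(error?.localizedDescription ?? "unknown reason")")
            return
        }
        DispatchQueue.main.async {
            session.run(ARGeoTrackingConfiguration())
        }
    }
}
```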

Can you use it for lidar scanning, though? Probably not. Although the phone employs impressive machine learning and a 50 percent faster chip than some of its rivals, there would not be enough points over enough range to provide the detail many would wish for. But at this moment there is not enough information to be absolute; it is predominantly designed to support the best possible photography.
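A back-of-envelope comparison shows why. Taking the 9×64 dot grid mentioned earlier and an assumed read-out rate (the frame rate and the survey-scanner figure below are illustrative assumptions, not published specifications):

```swift
import Foundation

// Back-of-envelope only: the figures below are assumptions for illustration.
let dotsPerEmission = 9 * 64          // the 9×64 dot grid mentioned above = 576 points
let framesPerSecond = 60.0            // assumed emission/read-out rate
let rawPointsPerSecond = Double(dotsPerEmission) * framesPerSecond

// A terrestrial survey scanner is often quoted in the region of a million
// points per second (illustrative figure, not a specific model).
let surveyScannerPointsPerSecond = 1_000_000.0

print("iPhone raw dots per second (assumed): \(Int(rawPointsPerSecond))")   // 34560
print("Ratio vs. a survey scanner: ~1:\(Int(surveyScannerPointsPerSecond / rawPointsPerSecond))")
```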

How the lidar references surfaces

One thing is for sure: a lidar war has started, with all the mobile phone makers looking to create the best solution. This can only mean that over the next few years the technology will move from assisting the camera and providing an AR mesh to actually capturing the detailed 3D environments that we so dearly wish for.
