iPhone X - it will be interesting
A few years ago I spent a lot of time playing with the Microsoft Kinect. I even wrote a book about it. The Kinect contains a special “depth camera” where pixels don’t give you light intensity, they give you distance. The depth camera works by viewing an array of dots which are projected onto the scene in front of the sensor. Software works out the distance between the dots that the camera sees. The further apart the dots appear, the further away that part of the scene is.
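Just to make the idea concrete, here is a tiny sketch of the relationship described above. It is not the Kinect’s actual algorithm; it simply assumes dot spacing grows roughly linearly with distance and calibrates that from one known reference reading. All the names and numbers are made up for illustration.

```python
# Illustrative only: depth assumed proportional to the spacing between
# neighbouring projected dots, calibrated from one known measurement.

def calibrate(reference_spacing_px: float, reference_distance_m: float) -> float:
    """Return metres of depth per pixel of dot spacing (hypothetical model)."""
    return reference_distance_m / reference_spacing_px

def estimate_depth(observed_spacing_px: float, metres_per_pixel: float) -> float:
    """Estimate how far away a patch of the scene is from the dot spacing seen there."""
    return observed_spacing_px * metres_per_pixel

# Example: dots 10 px apart at a known 1.0 m. Dots seen 25 px apart would
# then suggest that part of the scene is about 2.5 m from the sensor.
scale = calibrate(reference_spacing_px=10.0, reference_distance_m=1.0)
print(estimate_depth(observed_spacing_px=25.0, metres_per_pixel=scale))
```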
As far as I know, Apple have put a Kinect-style sensor on the front of the iPhone X which works in the same way. The depth information returned by the camera is a big chunk of how the phone recognises its owner.
I really hope it works.
The Kinect worked great for middle distance readings, but if you got too close, all the little dots smudged together into one big dot and the camera stopped working. It would struggle a bit in very bright conditions, and I never tried to use it to recognise my face when I was wearing glasses.
I’m always tempted to get the very latest in technology. I think it stems from my much younger days, when I used to enjoy owning my own copy of the current number one single, but I’ll sit this one out until I’ve seen it work. With glasses.