Therefore, the dual-camera system in the iPhone 7 Plus is likely based on LinX array camera technology. It is worth noting that this technology can not only take higher-quality pictures but also perceive depth information (there are several ways to achieve depth perception; RGB dual-camera stereo is a common one, while both Intel and Microsoft use infrared cameras instead). Depth perception is often used for 3D scanning and for scene recognition in AR.
When a phone camera can easily measure the depth of objects and construct a three-dimensional map (the figure below shows a LinX example of recognizing the 3D structure of a face with a phone camera), Apple can use the iPhone to overlay computer-generated imagery onto the real scene, which is the goal of Google's Project Tango.
Pokémon GO, which was hugely popular not long ago, is an AR mobile game, but because it runs on an ordinary smartphone, the Pokémon it renders cannot truly interact with the scene on screen (the phone cannot perceive the environment's 3D structure). Now, with dual-camera technology, a large number of iOS developers can apply their ingenuity to build more AR software for the iOS platform. This time the software would no longer be the gimmick it once was, but genuinely fun and practical augmented reality. So what applications could this technology make possible?
The ability to measure the distance (depth) of objects in the real environment is very close to what our own eyes do. A person with normal vision knows how far away the pillars and walls ahead are and subconsciously steers around these obstacles. But people with visual impairments, and computers until now, cannot do this. So we could build an application that helps visually impaired users avoid obstacles in a room and locate specific items through audio cues.
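As a thought experiment, such a navigation aid could reduce the dual-camera depth map to a per-direction "nearest obstacle" profile and turn it into audio guidance. The sketch below is entirely hypothetical; the depth values, thresholds, and function names are illustrative assumptions, not any real app's API:

```python
# Hypothetical sketch: guide a user from a per-direction depth profile.
# Each entry is the distance (in metres) to the nearest obstacle in one
# viewing direction, as might be summarized from a dual-camera depth map.

def clearest_direction(column_depths_m: list[float]) -> int:
    """Index of the most open direction (farthest nearest-obstacle)."""
    return max(range(len(column_depths_m)), key=lambda i: column_depths_m[i])

def obstacle_warnings(column_depths_m: list[float], danger_m: float = 0.5) -> list[int]:
    """Directions whose nearest obstacle is closer than danger_m metres."""
    return [i for i, d in enumerate(column_depths_m) if d < danger_m]

# Example: five viewing directions across the field of view
depths = [0.4, 1.2, 3.5, 0.9, 0.3]
print(clearest_direction(depths))   # most open direction -> 2
print(obstacle_warnings(depths))    # too-close directions -> [0, 4]
```

A real app would of course refresh this profile continuously and render the warnings as spatialized audio, but the core decision logic is this simple.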
Another application is scanning houses and building three-dimensional maps of them. Dual cameras can accurately measure the distance between objects through parallax (disparity) calculation. By sweeping an iPhone around a room, we could reconstruct a complete indoor 3D space. Imagine previewing where a new sofa would sit in your room from your phone while shopping online, or slotting the iPhone into a phone-box headset and stepping into a virtual space that looks just like your room.
This technology also has many applications in entertainment and social interaction. Pokémon, for example, would no longer simply float over the phone's camera feed: they could hide under the table, bump into its corner, and jump onto the sofa. In social apps, you could invite friends into your virtual room.
Because dual cameras can measure the distance of objects, they can also be used for gesture recognition. The iPhone could thus gain hand-tracking ability similar to HoloLens or Leap Motion, which is essential for interacting with virtual objects in VR and AR.
For head-mounted displays and mobile phones alike, VR and AR are related but fundamentally different technologies. VR pursues complete immersion in a virtual environment where everything is computer-generated, while AR superimposes virtual objects onto the real environment and pursues interaction between the two.
Although, in the end, all of this is still speculation, dual cameras genuinely have this potential. Perhaps Apple never intended to give the iPhone depth perception at this stage, but it has set the trend, and more and more devices will adopt dual cameras. Once the phones all around us carry this capability, AR will truly take off, just as Cardboard's cheap ubiquity did for VR.