Apple's granted patent describes technology for using eye gaze to interact with a head-mounted display (HMD). According to some embodiments, the user can select a text input field shown on the HMD's display with their eyes.
The technology provides a more natural and efficient interface and, in some exemplary embodiments, lets the user indicate where text should be entered primarily with eye gaze.
The technology benefits virtual reality, augmented reality, and mixed reality devices and applications, and it can also be applied to traditional user interfaces such as desktop computers, laptops, tablets, and smartphones.
Fig. 2 of the Apple patent shows a top view of a user (#200) whose eyes are focused on an object (#210). The user's line of sight is defined by the visual axis of each of the user's eyes (shown as rays 201A and 201B). The direction of the visual axes defines the user's gaze direction, while the distance at which the axes converge defines the gaze depth.
The gaze direction can also be called the gaze vector or line of sight. In Fig. 2, the gaze direction points toward the object, and the gaze depth is the distance d from the user. Gaze direction and/or gaze depth are the characteristics used to determine the gaze position.
In some embodiments, the center of the user's cornea, the center of the pupil, and/or the center of rotation of the eyeball are used to determine the position of the visual axis of each of the user's eyes, so these data can also be used to determine the user's gaze direction and/or gaze depth.
In some embodiments, the gaze depth is determined from the point where the visual axes of the user's eyes intersect (or from the point of minimum distance between the visual axes), or from some other measurement of the focus of the user's eyes; the gaze depth thus estimates the distance at which the user's eyes are focused.
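To make the geometry concrete, here is a minimal sketch (not taken from the patent) of how gaze depth could be estimated from the convergence of the two visual axes. It assumes each axis is already available as an origin (for example, the eyeball's center of rotation) and a direction vector, and it takes the gaze point to be the midpoint of the shortest segment between the two axes, since real visual axes rarely intersect exactly.

```python
import numpy as np

def estimate_gaze_point(origin_l, axis_l, origin_r, axis_r):
    """Return the 3D gaze point as the midpoint of the shortest segment
    between the two visual axes (which rarely intersect exactly)."""
    d1 = axis_l / np.linalg.norm(axis_l)
    d2 = axis_r / np.linalg.norm(axis_r)
    w0 = origin_l - origin_r
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # axes (nearly) parallel: gaze is "at infinity"
        return None
    t = (b * e - c * d) / denom      # parameter along the left axis
    s = (a * e - b * d) / denom      # parameter along the right axis
    p_left = origin_l + t * d1
    p_right = origin_r + s * d2
    return (p_left + p_right) / 2.0

# Example: eye centers ~63 mm apart, both axes converging 0.5 m in front of the user.
left_eye  = np.array([-0.0315, 0.0, 0.0])
right_eye = np.array([ 0.0315, 0.0, 0.0])
target    = np.array([ 0.0,    0.0, 0.5])
gaze_point = estimate_gaze_point(left_eye, target - left_eye,
                                 right_eye, target - right_eye)
gaze_depth = np.linalg.norm(gaze_point - (left_eye + right_eye) / 2.0)
print(round(gaze_depth, 3))  # 0.5 (metres)
```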
Figure 4 of Apple's patent illustrates a head-mounted display (HMD) with a built-in gaze sensor. Users can view a form in the VR world or in the real-world environment and direct their text input to a specific area of that form, because the gaze sensor detects where the user is focusing. The technology is precise enough that the small shift in gaze from the "name" field to the adjacent "surname" field can be detected accurately, so users can fill in the form without a mouse.
Apple points out that the gaze sensor (#410) faces the user and, during operation, captures characteristics of the user's gaze, such as image data of the user's eyes.
In some embodiments, the gaze sensor includes an event camera that detects event data from the user (e.g., from the user's eyes) based on changes in detected light intensity over time, and that event data is used to determine gaze direction and/or gaze depth.
The HMD uses the image data and event data to determine gaze direction and/or gaze depth. Alternatively, the HMD uses ray casting and/or cone casting to determine the gaze position. In some embodiments, multiple gaze sensors are used.
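As an illustration of how ray casting could map a gaze ray onto on-screen text fields, here is a minimal sketch. It is not Apple's implementation: the UI plane distance, the field layout, and the function name are hypothetical. The gaze ray is simply intersected with the plane and hit-tested against each field's rectangle.

```python
import numpy as np

# Hypothetical layout of form fields on a UI plane 0.5 m in front of the user,
# given as (name, x_min, x_max, y_min, y_max) in metres on that plane.
FIELDS = [
    ("name",    -0.10, 0.10,  0.02, 0.05),
    ("surname", -0.10, 0.10, -0.03, 0.00),
]

def cast_gaze_ray(origin, direction, plane_z=0.5):
    """Intersect the gaze ray with the UI plane z = plane_z and return
    the name of the text field (if any) that the ray lands in."""
    direction = direction / np.linalg.norm(direction)
    if direction[2] <= 1e-9:                 # ray never reaches the plane
        return None
    t = (plane_z - origin[2]) / direction[2]
    hit = origin + t * direction             # 3D point on the UI plane
    for name, x0, x1, y0, y1 in FIELDS:
        if x0 <= hit[0] <= x1 and y0 <= hit[1] <= y1:
            return name
    return None

# A small downward shift in gaze direction moves focus from "name" to "surname".
eye = np.array([0.0, 0.0, 0.0])
print(cast_gaze_ray(eye, np.array([0.0,  0.035, 0.5])))  # name
print(cast_gaze_ray(eye, np.array([0.0, -0.015, 0.5])))  # surname
```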
Other major companies also hold AR patents.
Last weekend, a new patent for Google smart glasses surfaced. Google notes in the patent documents that smart glasses can add information alongside what the wearer sees through the glasses. Information (such as digital images) can be superimposed on the user's field of view by intelligent optics such as an optical head-mounted display (OHMD), or by a transparent head-up display (HUD) or augmented reality (AR) device embedded in wireless glasses. Modern smart glasses are effectively wearable computers that can run standalone mobile applications. Some can be operated hands-free through natural-language voice commands, while others use touch buttons.
In addition to Google, the World Intellectual Property Organization published a patent for Samsung's AR smart glasses last Thursday. That patent focuses on how the glasses will integrate communication antennas.
Generally speaking, AR companies are pushing hard right now; we can expect more breakthroughs and surprises, and hopefully these products will reach ordinary consumers as soon as possible.