Last week the U.S. Patent and Trademark Office officially published a patent application from Apple that relates to future smartglasses and, more specifically, to methods and techniques for managing smartglasses having additive displays for use in computer-generated reality environments.
To conserve battery life, Apple has designed a pair of smartglasses with a primary and a secondary display that relay different kinds of information. There's also a third, tertiary display in the form of LEDs surrounding the inner frame of the glasses that acts as an indicator system, which will become more evident in some of the patent figures presented further below in our report.
Apple notes that the HMD device will include a primary display that extends across a field of view at a first resolution, and a secondary display, physically and electronically coupled to the primary display, with a second resolution that is lower than the first. A first portion of a virtual object is displayed via the secondary display and is then displayed in the primary display when the object (e.g., an app) is called up to be interacted with.
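To make the two-display arrangement concrete, here is a minimal Swift sketch of the behavior as we read it: idle content stays on the lower-resolution secondary display and is promoted to the high-resolution primary display once the user calls it up. The type and function names are our own illustration and are not taken from the patent.

```swift
// Hypothetical display targets; the names are illustrative, not from the patent.
enum DisplayTarget {
    case primary    // high-resolution display spanning the field of view
    case secondary  // lower-resolution display coupled to the primary
}

struct VirtualObject {
    let name: String
    var isActive: Bool   // true once the user calls the object up to interact with it
}

// Idle objects stay on the low-power secondary display; an object the user
// is interacting with is promoted to the high-resolution primary display.
func displayTarget(for object: VirtualObject) -> DisplayTarget {
    object.isActive ? .primary : .secondary
}

// Example: a Messages preview sits on the secondary display until the user opens it.
var messages = VirtualObject(name: "Messages", isActive: false)
print(displayTarget(for: messages))   // secondary
messages.isActive = true              // e.g., the user taps a frame sensor or asks Siri
print(displayTarget(for: messages))   // primary
```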
Apple's patent FIGS. 2A and 2B below depict an exemplary system that includes a device with additive displays for use in various computer-generated reality environments.
In the next series of patent figures, Apple illustrates how the tertiary display, the LEDs surrounding the inner frame of the glasses, acts as an indicator system. In the example below, the user is looking for his keys. The LED display lights up a portion of the LEDs to indicate the direction in which his keys can be found. This could only work if the keys are attached to an AirTag device; while the patent doesn't spell that out, there's no other way for the smartglasses to know where the sought item could be found.
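As a rough illustration of how a ring of frame LEDs could point toward a tracked item, the short Swift sketch below maps a compass-style bearing to a few LEDs on that side of the frame. The LED count and function names are assumptions for illustration, not details from the filing.

```swift
let ledCount = 12  // assumed number of LEDs around the inner frame

/// Returns the indices of the LEDs to light for a target at `bearingDegrees`,
/// measured clockwise from straight ahead (0 degrees).
func ledsToLight(bearingDegrees: Double, spread: Int = 1) -> [Int] {
    // Normalize the bearing into 0..<360.
    let normalized = (bearingDegrees.truncatingRemainder(dividingBy: 360) + 360)
        .truncatingRemainder(dividingBy: 360)
    // Find the LED closest to that bearing.
    let center = Int((normalized / 360 * Double(ledCount)).rounded()) % ledCount
    // Light that LED plus `spread` neighbors on each side.
    return (-spread...spread).map { (center + $0 + ledCount) % ledCount }
}

// Keys reported (e.g., by an AirTag) roughly to the user's right:
print(ledsToLight(bearingDegrees: 90))   // [2, 3, 4]
```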
In patent figure 17A below we see how apps could be called up when needed in the lower-resolution display area. When the user decides to check new messages or emails, the first message will appear in the higher-resolution display for easy reading. Users would touch a sensor to place a message in the primary display, or could work with Siri. If a health app is present, the LED display could light up when the app determines that the user has a heart problem that needs attention; for instance, the LEDs would light up red.
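For the health-alert scenario, the patent only implies a mapping from the app's assessment to an LED color. A minimal Swift sketch of that mapping, using hypothetical alert levels and an LED state type of our own, might look like this:

```swift
// Hypothetical alert levels and LED state; only the red "needs attention"
// example comes from the patent text.
enum HeartAlert {
    case normal
    case elevated
    case urgent
}

struct LEDState {
    let color: String   // stand-in for a real color/brightness type
    let pulsing: Bool
}

// Map the health app's assessment to the frame LEDs.
func frameLEDs(for alert: HeartAlert) -> LEDState? {
    switch alert {
    case .normal:   return nil                                    // LEDs stay off
    case .elevated: return LEDState(color: "amber", pulsing: false)
    case .urgent:   return LEDState(color: "red", pulsing: true)  // red, as in Apple's example
    }
}

if let state = frameLEDs(for: .urgent) {
    print(state)   // LEDState(color: "red", pulsing: true)
}
```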
In patent FIG. 19B above we see the glasses illustrate a vehicle with an outline around it. The outline points out to the user which car happens to be their Uber ride. As a visual aid, the outlined car could pulsate or enlarge to indicate which one is your ride.
In patent FIGS. 23A and 23B we see the secondary (outer) display being used as a stopwatch or timer, which is handy when working out or cooking.
In patent FIG. 22B above you'll see a FaceTime call. The patent then describes how the LED display could be used to indicate the mood of the caller (happy, sad, depressed) with different colored lights. It's difficult to know why that would be of assistance. Apple describes algorithms that are used to determine a mood, but I think most adults can tell someone's mood by their demeanor, though it's an option for those who are completely daft.
Translated from: patentlyapple