
Recently, the U.S. Patent and Trademark Office officially published a series of six patent applications from Apple relating to Vision Pro, covering Spatially Aware Camera Adjustments, Depth Conflict Mitigation in a 3D Environment, Gaze Trackers, and more.

In the main patent application that we're covering today, Apple notes that electronic devices such as head-mounted devices may have sensors. For example, a head-mounted device may have a camera for capturing images of the environment surrounding the head-mounted device. The camera can operate in an automatic exposure mode that automatically adjusts the exposure settings of the camera depending on the current light levels in the captured images.

Because the device is worn on the user's head, the lighting level of the captured images can change quickly as the user turns their head. Consider a scenario in which the user is initially facing a dimly lit desk and then turns to face a bright sunlit window. Upon detecting the much brighter illumination of the sunlit window, the camera can automatically adjust its exposure settings to compensate for the change in brightness. There can, however, be some time lag between when the user turns their head and when the exposure adjustments take place. That time lag, along with any resulting flicker, can cause visual discomfort for the user. Apple's patent remedies this.
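To make the time-lag problem concrete, here is a minimal, purely illustrative Python sketch (the names and numbers are ours, not Apple's) of a conventional reactive auto-exposure loop: the exposure is corrected only after a bright frame has already been captured, so several over-exposed frames reach the user before the settings converge.

```python
TARGET_BRIGHTNESS = 0.18   # mid-grey target, a common auto-exposure heuristic
SMOOTHING = 0.25           # fraction of the correction applied per frame

def reactive_auto_exposure(exposure, frame_brightness):
    """Return the next exposure value based only on the frame just captured."""
    error = TARGET_BRIGHTNESS / max(frame_brightness, 1e-6)
    # Move part of the way toward the ideal exposure each frame.
    return exposure * (error ** SMOOTHING)

# Simulate turning from a dimly lit desk (scene brightness 0.05)
# to a bright sunlit window (scene brightness 0.9).
exposure = 1.0
for scene in [0.05] * 5 + [0.9] * 5:
    captured = min(scene * exposure, 1.0)   # what the sensor records (clips at 1.0)
    exposure = reactive_auto_exposure(exposure, captured)
    print(f"scene={scene:.2f} captured={captured:.2f} next_exposure={exposure:.2f}")
```

The captured values that sit at the clipping limit right after the switch are the lagging, over-exposed frames the user would briefly see.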

Head-Mounted Device With Spatially Aware Camera Adjustments

Displays may be used for presenting a user with visual content. Cameras may be used to capture visible light and infrared images. Cameras may be used, for example, to provide a user with real-time pass-through video, to track hand gestures and other body movements, and to track movement of the head-mounted device relative to the environment surrounding the head-mounted device.

The camera providing the real-time pass-through video to the user is sometimes referred to as the main camera or the pass-through camera. The cameras used for tracking movement of body parts or the movement of the head-mounted device relative to the environment can be referred to as secondary (supplemental) cameras or tracking cameras.

A head-mounted device may be operated in a real-world environment on a user's head. The real-world environment can have areas with different illumination characteristics. For example, when the head-mounted device is facing a first area of the environment that is dimly lit by an incandescent lamp, the pass-through camera can be operated using first camera settings suitable for the first area with the first lighting characteristics.

When the head-mounted device is facing a second area of the environment that is brightly lit by daylight, the pass-through camera can be operated using second camera settings suitable for the second area with the second lighting characteristics.

Images captured using the main camera and/or other supplemental cameras can be used to construct a spatial (3-dimensional) map of the illumination characteristics of the real-world environment. This spatial map can be gathered over time using one or more cameras within the head-mounted device as the user looks around a 3-dimensional environment (e.g., to build a persistent lighting model of the environment based on historical data).
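As a rough illustration of what such a spatial lighting model could look like, the sketch below (hypothetical names, not taken from the patent) bins luminance samples from the headset's cameras by viewing direction and keeps a running average per bin, so the brightness of each region of the room can be recalled later.

```python
from collections import defaultdict

BIN_DEG = 15  # angular size of each bin in the spherical lighting map

def direction_bin(azimuth_deg, elevation_deg):
    """Quantize a viewing direction into a coarse spherical bin."""
    return (int(azimuth_deg // BIN_DEG), int(elevation_deg // BIN_DEG))

class SpatialLightingMap:
    """Running per-direction luminance averages gathered over time."""
    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def add_sample(self, azimuth_deg, elevation_deg, luminance):
        b = direction_bin(azimuth_deg, elevation_deg)
        self._sums[b] += luminance
        self._counts[b] += 1

    def luminance_at(self, azimuth_deg, elevation_deg, default=0.18):
        b = direction_bin(azimuth_deg, elevation_deg)
        if self._counts[b] == 0:
            return default              # no historical data for this direction yet
        return self._sums[b] / self._counts[b]

# Build the map over time: dim desk straight ahead, bright window 90 degrees to the right.
lighting = SpatialLightingMap()
lighting.add_sample(0, 0, 0.05)      # dimly lit desk
lighting.add_sample(90, 0, 0.90)     # sunlit window
print(lighting.luminance_at(92, 3))  # same bin as the window -> 0.9
```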

The secondary (tracking) cameras and any 6 DoF (degrees of freedom) tracking systems (e.g., tracking systems that can be used to monitor both rotational movement such as roll, pitch, and yaw and positional movement along the X, Y, and Z axes in a 3D environment) can be used to determine the user's current head pose and therefore where the main camera is currently pointing or will be pointing.
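One simple way to picture the "will be pointing" step is to extrapolate the current head orientation using the measured angular velocity. The sketch below is a hypothetical illustration of that idea, not Apple's tracking pipeline.

```python
def predict_camera_direction(yaw_deg, pitch_deg, yaw_rate_dps, pitch_rate_dps,
                             lookahead_s=0.1):
    """Extrapolate the camera's yaw/pitch `lookahead_s` seconds into the future."""
    predicted_yaw = (yaw_deg + yaw_rate_dps * lookahead_s) % 360.0
    predicted_pitch = max(-90.0, min(90.0, pitch_deg + pitch_rate_dps * lookahead_s))
    return predicted_yaw, predicted_pitch

# The user is turning their head to the right at 200 deg/s while facing the desk.
print(predict_camera_direction(yaw_deg=0.0, pitch_deg=0.0,
                               yaw_rate_dps=200.0, pitch_rate_dps=0.0))
# -> (20.0, 0.0): in 100 ms the camera will be 20 degrees closer to the bright window.
```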

These positional tracking systems can include inertial measurement units, simultaneous localization and mapping (SLAM) units, time-of-flight measurement units, stereo cameras, and other motion sensors. To ensure minimal delay (lag) between the adjustment from the first camera settings to the second camera settings, information about where the camera is pointing within the 3D environment and information about the illumination levels and lighting characteristics in the environment can be used to obtain control signals for adjusting an exposure setting, a color balance setting, a tone curve mapping setting, and/or other settings of the main camera.
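Putting the two pieces together, the sketch below (again hypothetical, with a plain dictionary standing in for the spatial lighting map) shows how a predicted viewing direction plus stored illumination data could be turned into exposure and color-balance control signals before the camera actually faces the bright area.

```python
TARGET_BRIGHTNESS = 0.18

# Coarse lighting model gathered over time: yaw bin (degrees) -> average luminance.
lighting_by_yaw = {0: 0.05, 90: 0.90}   # dim desk straight ahead, sunlit window to the right

def nearest_bin(yaw_deg):
    """Pick the stored direction closest to the predicted yaw."""
    return min(lighting_by_yaw, key=lambda b: abs(b - yaw_deg))

def proactive_camera_settings(predicted_yaw_deg):
    """Derive exposure (and a crude white-balance hint) for where the camera will point."""
    luminance = lighting_by_yaw[nearest_bin(predicted_yaw_deg)]
    exposure = TARGET_BRIGHTNESS / max(luminance, 1e-6)
    white_balance = "daylight" if luminance > 0.5 else "incandescent"
    return {"exposure": exposure, "white_balance": white_balance}

# Head tracking predicts the camera will be at roughly yaw 85 degrees in a moment:
print(proactive_camera_settings(85.0))
# -> exposure is cut to about 0.2 and white balance switches to "daylight"
#    before the first bright frame arrives, avoiding the lag and flicker described above.
```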

The information about the illumination levels and lighting characteristics in the environment can be gathered over time (e.g., to build a lighting model based on historical data) or can be determined in real time using the secondary cameras, one or more ambient light sensors, flicker sensors, and/or other optical sensors, which can have wider field(s) of view than the main pass-through camera and can therefore detect lighting characteristics outside its field of view.

Proactively adjusting the camera settings in this way can help minimize the delay in those adjustments and reduce flicker when the user looks at areas of the environment with different lighting characteristics, and can thus help reduce visual discomfort for the user.

Apple's patent FIG. 3 below is a cross-sectional side view of a head-mounted device #10 (such as Vision Pro) in an illustrative configuration in which the device includes different types of sensors for monitoring the environment surrounding the headset.

Vision Pro may also include other cameras that can be used to track the positions and movements of external objects. As an example, tracking cameras may track a user's hand, torso, or other body parts. Hand gesture input can be used to control the operation of Vision Pro, and body part monitoring may be used to allow a user's body motions to be replicated by content displayed in a virtual environment.

If desired, Vision Pro may also include cameras that are used to track the position of external accessories (e.g., the position and movement of controllers that are moved by a user to control aspects of the headset). In some scenarios, visual inertial odometry (VIO) systems or other systems may determine the position, movement, and/or orientation of Vision Pro relative to the environment. Cameras may perform dedicated functions (tracking, visual inertial odometry, scene capture, ranging, three-dimensional image capture for facial recognition and environment mapping, etc.), or two or more of these operations may be performed by a shared camera.

(Apple patent figures: Spatially Aware Camera Adjustments, FIGS. 3 and 5)

Apple's patent FIG. 5 above is a block diagram showing different hardware and/or software components within Vision Pro for adjusting one or more image settings for the pass-through camera #50.

Translated from: patentlyapple
