SPIE AR | VR | MR is the most important conference on optical architectures (optics, displays, sensors) for the next generation of smart glasses and head-mounted displays. We look forward to seeing your research on optical systems, subsystems, and the technological building blocks that will enhance virtual reality (VR), augmented reality (AR), and mixed reality (MR) experiences. We invite researchers and engineers from academia and industry to present and discuss recent developments in this rapidly advancing field.

Papers will be accepted in these areas:


SPIE AR | VR | MR STUDENT OPTICAL DESIGN CHALLENGE
See details here: Optical Design Challenge

Abstracts submitted to the Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) V conference may be eligible to apply to the 2024 SPIE AR | VR | MR Student Optical Design Challenge. The challenge is designed to encourage and acknowledge excellence in oral and poster student presentations and bridge the gap between traditional academic optics and industry expectations for today’s immersive display products. The challenge primarily consists of an adjudicated presentation “pitch” in which the participants present their design with an explanation of how it best overcomes the selected design challenge.

Eligibility
The SPIE AR | VR | MR Student Optical Design Challenge is open to any full-time student registered and accepted for presentation at SPIE AR | VR | MR who is enrolled at an academic or research institute and performing their work either in an academic lab or research institute, or as an intern in an external company.

How To Apply:
If you qualify and would like to participate in the 2024 SPIE AR | VR | MR Student Optical Design Challenge, enter "Challenge" in the Custom Tracking Field when you submit your technical conference abstract. SPIE will reach out to you after submissions are closed to confirm your participation and give you more information on how to participate.

Frequently Asked Questions:
  • Will I be required to present in the conference or can I just participate in the challenge?
    In order to be eligible for the SPIE AR | VR | MR Student Optical Design Challenge, the presenter must also register, attend, and give a technical talk in the SPIE Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) V conference.
  • What software can I use?
    The choice of optical design software is up to the student. However, participating sponsors may offer their optical design software for use by any student registered in this challenge. More details will be announced in the instructions sent to participants.
  • Can I have co-authors?
    Yes, any number of them, from either academia or industry, but the primary author/presenter must be a registered full-time student.


Conference 12913

Optical Architectures for Displays and Sensing in Augmented, Virtual, and Mixed Reality (AR, VR, MR) V

29 January 2024 | Moscone Center, Room 3010 (Level 3 West)
  • 1: Diffractive Combiner Technologies
  • 2: Multiplexed and Switchable Displays
  • 3: Visual System Centric Displays
  • 4: Combiner Design and Simulation Techniques
  • 5: Vergence Accommodation and Perception Improvements
  • 6: Eye, Hand, and World Tracking
  • 7: Next Gen Micro Displays I
  • 8: Next Gen Micro Displays II
  • 9: Fabrication and Process for XR Technology
  • 11: Image Quality and Perception
  • 12: Metrology and Measurement Tools I
  • 13: Metrology and Measurement Tools II
  • Optical Design Challenge
  • Posters
Session 1: Diffractive Combiner Technologies
29 January 2024 • 8:00 AM - 10:00 AM PST | Moscone Center, Room 3010 (Level 3 West)
Session Chair: Hong Hua, Wyant College of Optical Sciences (United States)
Session 1 will run concurrently with Sessions 5 and 9
12913-1
Author(s): Wei-Ting Hsu, Yu-Chieh Cheng, National Taipei Univ. of Technology (Taiwan)
On demand | Presented live 29 January 2024
This study introduces a Maxwellian-view system for AR near-eye displays using a laser beam scanning (LBS) light engine and a TiO2 metasurface waveguide. It overcomes the accommodation-vergence conflict, offering benefits such as a large depth of field, high definition, a compact structure, a wide FOV, high brightness, low power consumption, high contrast, and high resolution. The research utilizes a lightweight TiO2 metasurface to address bulkiness while ensuring long-term wear comfort. By combining LBS with the metasurface, collimated light within a 40-degree FOV is efficiently redirected to the pupils, resulting in clear retinal imaging. The study considers the deflection angles required of light rays from nine different screen positions in the LBS system when incident on the metasurface, achieving a maximum correction angle of 60 degrees through metasurface optimization. The overall design, encompassing the LBS optics and metasurface, successfully demonstrates an FOV of 40 x 22 degrees.
12913-2
Author(s): Norbert Leister, Tim Wagner, Tobias Schuster, Martin Teich, Steffen Zozgornik, Hagen Stolle, SeeReal Technologies GmbH (Germany); Sara Frances-Gonzalez, Peter Dürr, Fraunhofer-Institut für Photonische Mikrosysteme IPMS (Germany)
On demand | Presented live 29 January 2024
Market penetration of automotive head-up displays (HUD) is increasing, extending from higher-end models to mid-range and even compact cars. New HUD use cases will be motivated by autonomous driving as well as improved HUD abilities. Holographic 3D (H3D) HUDs will contribute to further improving usability and attractiveness, specifically by presenting images with full consistency of all depth cues. Based on results from H3D-HUD implementations, it is explained how H3D images are created and how viewers benefit from the unique 3D solution, which eliminates ambiguity of the presented information and improves safety by properly overlaying virtual 3D objects with the real world. The difference between possible SLM options is explained, including references to ongoing development of high-resolution phase-modulating micro-mirror-based SLMs (micro mirror arrays, MMA). A comparison shows how using MMA versus LCoS can improve selected features and the user experience. Ongoing development of MMA in a consortium of expert companies and institutions, along with results from already manufactured MMA, is presented.
PC12913-4
Author(s): Zhexin Zhao, Xiayu Feng, Zhenye Li, Junren Wang, Mengfei Wang, Xinyue Zhang, Zheyi Han, Stephen Choi, Lu Lu, Barry Silverstein, Meta (United States)
29 January 2024 • 8:40 AM - 9:00 AM PST | Moscone Center, Room 3010 (Level 3 West)
To make augmented reality (AR) eyewear widely available, it is necessary to provide a cost-effective, high-quality AR waveguide combiner solution. Polarization volume hologram (PVH) gratings made of liquid crystal (LC) polymer are a promising candidate with unique polarization properties. In this presentation, we review the physical properties of PVHs and provide a thorough discussion of how waveguide combiner performance is influenced by PVH characteristics and fabrication capabilities. Our study provides guidance for the development of PVH waveguide technology and promotes scalable solutions.
PC12913-5
Author(s): Daniel Bacon-Brown, Matthew C. George, Rumyana Petrova, Stuart Johnson, Bradley R. Williams, MOXTEK, Inc. (United States)
29 January 2024 • 9:00 AM - 9:20 AM PST | Moscone Center, Room 3010 (Level 3 West)
AR/VR headsets operating at visible wavelengths require extremely compact and lightweight optics to allow all-day wearable comfort and functionality, which makes visible metalenses ideal for this application. Visible metalenses have not been available for volume production until now. With visible metalens production proven, adaptation into applications such as AR/VR is the next step. At Moxtek, we have an established baseline visible meta-optic process line that provides greater than 90% total efficiency on a lot-to-lot average. Moxtek is uniquely positioned to support application development with design validation and production-ready nanoimprint masters, all processed on high-volume tool sets.
12913-6
Author(s): Zhenyi Luo, Yannanqi Li, John Semmen, Shin-Tson Wu, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
On demand | Presented live 29 January 2024
Offering an ultrathin form factor and light weight, diffractive liquid-crystal optics are a promising tool for designing more compact VR display systems. However, the severe chromatic aberrations of these diffractive elements must be solved before further practical application. Here, we present an achromatic system consisting of three diffractive liquid-crystal components to address this longstanding color issue. The phase and spectral response of each element are specifically designed to manipulate the polarization states of light and compensate for the chromatic aberrations. A significant improvement in color performance is demonstrated in both our simulations and experiments. Applications are foreseeable in the metaverse, spatial computing, and digital twins, which have found widespread use in smart tourism, education, healthcare, manufacturing, and construction.
12913-131
Author(s): Nicole Tadros, Aaron Krieg, Jeremy Cade, Shubhangi Khadtare, Adeola Oyeyinka, Peter Guschl, Neil Pschirer, Serpil Gonen Williams, Pixelligent Technologies LLC (United States)
On demand | Presented live 29 January 2024
Pixelligent’s high-refractive-index PixNIL® nanocomposite formulations incorporating ZrO2, TiO2, or TiO2-ZrO2 core-shell nanoparticles showcase robust nanoimprinting capabilities for surface-relief gratings tailored for extended reality (XR) applications. Thin films exhibit refractive indices ranging from 1.7 to 2.0 at 589 nm while delivering more than 95% transmission and less than 0.1% haze, attributed to the excellent control over particle size, shape, and surface of the proprietary PixClear® and PixCor™ nanocrystals. PixNIL® ST13A provides a large soft-bake process window for thin films less than 300 nm thick while extending the usable lifetime of working stamps to over 25 imprints. This paper presents a thorough evaluation of the novel nanoimprint formulation, ST13A, demonstrating significant advancements and introducing a new approach to address key challenges in nanoimprint technologies. The analysis encompasses crucial aspects of the nanoimprinting process, including replication fidelity and the soft-bake process window.
Break
Coffee Break 10:00 AM - 10:30 AM
Session 2: Multiplexed and Switchable Displays
29 January 2024 • 10:30 AM - 12:10 PM PST | Moscone Center, Room 3010 (Level 3 West)
Session Chair: Guanjun Tan, Apple Inc. (United States)
Session 2 will run concurrently with Sessions 6, 10, and 11
12913-7
Author(s): Minseong Kim, Jae-Hyeung Park, Inha Univ. (Korea, Republic of)
On demand | Presented live 29 January 2024
This paper introduces a mixed reality near-eye display configuration designed to concurrently achieve a wide field of view (FoV) and a dual-sided display. The proposed system incorporates a dihedral corner reflector array (DCRA), a virtual image display unit, and an eye display unit. The eyepiece within the virtual image display unit is directly imaged onto the eye pupil plane by the DCRA, enabling the simultaneous realization of a wide FoV and a large eyebox. Furthermore, the DCRA projects the eye display unit toward the external environment, accurately presenting the user's eye image at its appropriate depth to other people facing the user. The system's configuration and principles are validated through optical experiments.
12913-8
Author(s): Tianyao Zhang, Wyant College of Optical Sciences, The Univ. of Arizona (United States); Yushi Kaneda, Yuzuru Takashima, Wyant College of Optical Sciences (United States)
On demand | Presented live 29 January 2024
Near-eye displays (NEDs) are widely used in augmented reality (AR) technology. Among the candidate architectures, the holographic waveguide is one of the most promising for accommodating both field of view (FOV) and eyebox. However, the FOV is severely restricted (to around 5 degrees) when the holographic material is used as the in- and out-coupler. Here, we demonstrate FOV expansion from 5 degrees to 42 degrees using a polychromatic light source, closely approaching the FOV limit of a glass-based waveguide. This is achieved by utilizing multi-layer couplers or waveguides, where each pair supports a portion of the entire FOV.
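As background for the waveguide FOV limit this abstract refers to, a rough back-of-envelope sketch of how the glass refractive index bounds the FOV a diffractive waveguide can carry. This is a generic textbook-style estimate with assumed values (e.g. a 75-degree practical upper propagation angle), not the authors' model:

```python
import math

def waveguide_fov_deg(n, theta_max_deg=75.0):
    """Estimate the FOV (in air) a diffractive waveguide can carry.

    Guided rays must propagate between the TIR critical angle and a
    practical upper bound theta_max (assumed here as 75 degrees). The
    in-coupling grating maps the air-side sine range onto this in-glass
    sine range, so the supported sine-space width is
    n * (sin(theta_max) - sin(theta_c)).
    """
    theta_c = math.asin(1.0 / n)  # TIR critical angle inside the glass
    sine_range = n * (math.sin(math.radians(theta_max_deg)) - math.sin(theta_c))
    return 2.0 * math.degrees(math.asin(sine_range / 2.0))

# Ordinary glass supports only a modest FOV; high-index glass supports more.
print(waveguide_fov_deg(1.5))
print(waveguide_fov_deg(2.0))
```

The sketch makes the abstract's point concrete: a single-layer glass waveguide has a hard FOV ceiling set by its index, which motivates multi-layer couplers that each carry a slice of the total FOV.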
12913-9
Author(s): Munkh-Uchral Erdenebat, Ki-Chul Kwon, Shariar Md Imtiaz, Chungbuk National Univ. (Korea, Republic of); Hak-Rin Kim, Kyungpook National Univ. (Korea, Republic of); Jong-Rae Jung, Suwon Science College (Korea, Republic of); Nam Kim, Chungbuk National Univ. (Korea, Republic of)
On demand | Presented live 29 January 2024
We propose an advanced holographic see-through display system with switchable 3D/2D modes based on a liquid-crystalline lens array and a one-shot learning model. The liquid-crystalline lens array acts as either a lens array or plain glass, according to the state of an electrical polarizer. When the electrical polarizer is in the on-state, the camera captures an image of a real-world object, and a one-shot learning model estimates depth data from the captured image. The 3D model is then regenerated from both the color and depth images, and an elemental image array is generated and displayed on the microdisplay while the liquid-crystalline lens array reconstructs it as a 3D image. When the electrical polarizer is in the off-state, the captured real-world image is displayed directly on the microdisplay, and the liquid-crystalline lens array simply transmits it to the holographic combiner. Experimental results confirm that the proposed system is an advantageous way to implement a 3D/2D switchable holographic see-through display.
12913-11
Author(s): Yuqiang Ding, Zhenyi Luo, Garimagai Borjigin, Shin-Tson Wu, Univ. of Central Florida (United States)
On demand | Presented live 29 January 2024
Pancake lenses are widely used in mixed reality (MR) due to their compact form factor. However, using a half mirror to fold the optical path results in tremendous optical loss. To break this optical efficiency limit while keeping a compact form factor, we present a new folded optical system incorporating a nonreciprocal polarization rotator. In a proof-of-concept experiment using a commercial Faraday rotator, the theoretically predicted 100% efficiency is validated. Meanwhile, ghost images can be suppressed to an undetectable level if the optics have anti-reflection coatings. Our novel pancake optical system holds great potential for revolutionizing next-generation MR displays.
12913-10
Author(s): Qian Yang, Yuqiang Ding, Shin-Tson Wu, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
On demand | Presented live 29 January 2024
We present a novel design for a full-color, wide Field-of-View (FoV) Augmented Reality (AR) display, which ingeniously employs a single waveguide along with advanced polarization multiplexing reflective polarization holograms. This innovative approach surmounts the narrow FoV limitation of the present single-waveguide, full-color AR displays. The employed reflective polarization holograms are devised to operate with high efficiency across the entire visible spectrum, accommodating the incident angles required by the expanded FoV. This novel design represents a significant leap forward in AR technology, laying a foundation for immersive and vivid AR experiences.
Break
Lunch Break 12:10 PM - 1:30 PM
Session 3: Visual System Centric Displays
29 January 2024 • 1:30 PM - 2:50 PM PST | Moscone Center, Room 3010 (Level 3 West)
Session Chair: Guohua Wei, Meta (United States)
Session 3 will run concurrently with Sessions 7 and 12
12913-13
Author(s): Woongseob Han, Myeong-Ho Choi, Jae-Hyeung Park, Inha Univ. (Korea, Republic of)
On demand | Presented live 29 January 2024
We propose a novel design for a video see-through super multi-view near-eye display (VST-SMV-NED) using a waveguide-type light source and a spatial light modulator (SLM). The proposed method provides a monocular depth cue by projecting multi-view images into the eye pupil. The temporal multiplexing scheme applied in the proposed method enables our system to present multi-view images within a very short time (>60 Hz). The multiple views of the real scene are captured with a deep depth of field, fused with the corresponding views of the virtual scene, and displayed by the SLM.
12913-14
Author(s): Myeong-Ho Choi, Kwang-Soo Shin, Jae-Hyeung Park, Inha Univ. (Korea, Republic of)
On demand | Presented live 29 January 2024
We introduce a novel holographic foveated near-eye display. Two phase-modulating spatial light modulators (SLMs) are magnified by different amounts by polarization optics to provide peripheral and foveal holographic 3D images. The main optical component of the proposed display is a geometric phase (GP) lens, which applies different optical power to the incident light according to its circular polarization state. However, because commercial GP lenses introduce light leakage and image deterioration, camera feedback is used for phase-pattern optimization, and the optimized pattern is loaded onto the SLM. A detailed explanation of the system configuration, optimization algorithm, and results will be given in the presentation.
12913-15
Author(s): Alba M. Paniagua-Diaz, Juan Mompeán, Univ. de Murcia (Spain); Shoaib Shomroo, Voptica S.L. (Spain); Pablo Artal, Univ. de Murcia (Spain)
On demand | Presented live 29 January 2024
Spatial light modulators for phase modulation have been responsible for important advances in vision science. However, due to their traditionally large dimensions and elevated costs, their use has been mainly limited to research, until recent approaches made the technology compact and affordable. In this work we demonstrate the potential of a wearable visual corrector and simulator in which low- and high-order aberrations are reproduced and corrected. Thanks to its see-through, compact, and wearable configuration, this approach opens new possibilities for non-traditional visual correction and simulation in smart glasses.
12913-70
Author(s): Rob Stevens, Prashanthan Ganeswaran, Oxford Optical Labs. (United Kingdom)
On demand | Presented live 29 January 2024
Optical systems for AR/VR headsets are designed for those who do not require vision correction. The reality, however, is that around 60% of the population have an ophthalmic prescription. This results in the user experience being compromised optically, even when the device has been designed to allow for glasses to be worn. Improvements to the optical performance can be taken further if the power of the objective lens is split over the two lenses, allowing greater control over aberrations and optimisation of the headset weight. These advantages are achieved for all users – including those who do not require vision correction.
Break
Coffee Break 2:50 PM - 3:10 PM
Session 4: Combiner Design and Simulation Techniques
29 January 2024 • 3:10 PM - 4:30 PM PST | Moscone Center, Room 3010 (Level 3 West)
Session Chair: Zhujun Shi, Meta (United States)
Session 4 will run concurrently with Sessions 8 and 13
12913-16
Author(s): Kalle Ventola, Hanna Lajunen, Toni Saastamoinen, Ismo Vartiainen, Juuso Olkkonen, Dispelix Oy (Finland)
On demand | Presented live 29 January 2024
Exit pupil expansion (EPE) is one of the three main grating functionalities in a diffractive AR waveguide combiner. The EPE grating produces multiple consecutive diffraction events for the in-coupled image signal, thereby creating an array of replicated pupils at the out-coupling region of the waveguide combiner. An inherent problem with the diffractive EPE system is the generation of interference, which can significantly reduce the image quality of the waveguide combiner. We will present a qualitative analysis of EPE interference, simulated examples using the Dispelix in-house simulation software, and our strategies for solving the problem. Through qualitative analysis and simulation, we demonstrate how the interference is created through a network of pupil propagation paths formed by the EPE grating, and briefly show how this qualitative understanding can be used to mitigate the interference problem. Our in-house simulation tools enable powerful multi-variable optimization for finding novel EPE grating shapes and structures.
12913-17
Author(s): Katsumoto Ikeda, Yihua Hsiao, Michael Cheng, Ansys Japan K.K. (Japan)
On demand | Presented live 29 January 2024
Diffractive optical devices are essential in developing compact and thin augmented reality (AR) devices. Surface-relief-gratings (SRG) and volume-holographic gratings (VHG) are common gratings with periodic material changes. VHG is relatively easy to manufacture, making it a popular choice for R&D teams developing AR exit pupil expander (EPE) applications. In the past, the Kogelnik algorithm was combined with the Ansys Zemax OpticStudio ray tracing engine to simulate VHG for AR applications. However, due to its more approximate calculations, the accuracy of this method is lower than that of the rigorous coupled wave analysis (RCWA) method. This study aims to investigate the theoretical differences between the Kogelnik and RCWA methods, implement their algorithms in practice, and compare the accuracy of the two methods for AR EPE applications using the Zemax OpticStudio ray tracing engine.
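For context on the Kogelnik method this abstract compares against RCWA, a minimal sketch of the standard Kogelnik two-wave efficiency formula for a lossless transmission volume grating. This is the generic textbook expression, not the paper's implementation, and all parameter values below are assumed for illustration:

```python
import math

def kogelnik_efficiency(delta_n, thickness, wavelength, theta=0.0, xi=0.0):
    """Kogelnik two-wave diffraction efficiency, lossless transmission VHG.

    nu = pi * delta_n * d / (lambda * cos(theta)) is the coupling strength;
    xi is the Bragg-detuning parameter (xi = 0 at exact Bragg incidence,
    where the formula reduces to eta = sin^2(nu)).
    """
    nu = math.pi * delta_n * thickness / (wavelength * math.cos(theta))
    s = math.sqrt(nu * nu + xi * xi)
    return math.sin(s) ** 2 / (1.0 + (xi / nu) ** 2)

# A 10 um grating at 532 nm reaches 100% efficiency when delta_n * d = lambda / 2.
d, lam = 10e-6, 532e-9
dn = lam / (2 * d)
print(kogelnik_efficiency(dn, d, lam))          # Bragg-matched
print(kogelnik_efficiency(dn, d, lam, xi=1.0))  # detuned: efficiency drops
```

The closed-form nature of this expression is exactly why Kogelnik is fast but approximate: RCWA instead solves the coupled-wave equations rigorously, which is what the abstract's accuracy comparison is about.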
PC12913-18
Author(s): Wan-Pin Tsai, Yang-Kuan Tseng, Tsung-Xian Lee, National Taiwan Univ. of Science and Technology (Taiwan); Ching-Cherng Sun, National Central Univ. (Taiwan)
29 January 2024 • 3:50 PM - 4:10 PM PST | Moscone Center, Room 3010 (Level 3 West)
Our study introduces a novel approach that combines Kogelnik coupled wave theory with BSDF to accurately characterize the diffraction phenomena exhibited by VHOE. By integrating this theoretical framework with ray tracing simulations, we enable the optimization of the system through a synergistic combination of simulated wave propagation and geometric optics. As a result, we have successfully identified and implemented viable solutions, leading to the development of an enhanced AR near-eye system that boasts improved efficiency, compactness, and superior image quality.
12913-19
Author(s): Shan-Ling Chen, Li-Wei Fu, Jiun-Woei Huang, National Taiwan Univ. (Taiwan); Kuang-Tsu Shih, PetaRay (Taiwan); Homer H. Chen, National Taiwan Univ. (Taiwan), PetaRay (Taiwan)
On demand | Presented live 29 January 2024
This paper considers a near-eye light field display comprising a light field generator, a collimator, and a geometric waveguide as the three main components. It takes 4-D light field data in the form of an array of 2-D subview images as input and generates a light field as output. The partially reflective mirrors of the waveguide replicate the optical path to achieve exit pupil expansion (EPE) and a large eyebox. In this work, we find that the light fields replicated by the partially reflective mirrors cannot perfectly overlap on the user’s retina, resulting in the appearance of multiple repetitive images—a phenomenon we call the “ghost artifact”. This paper delves into the cause of this artifact and develops a solution for applications that require short-range interaction with virtual objects, such as surgical procedures. We define a working range devoid of noticeable ghost artifact based on the angular resolution characteristics of the human eye and optimize the orientation of the array of partially reflective mirrors in the waveguide to meet the image quality requirement for short-range interaction. With the optimized waveguide, the ghost artifact is significantly reduced.
Session 5: Vergence Accommodation and Perception Improvements
29 January 2024 • 8:20 AM - 10:00 AM PST | Moscone Center, Room 3008 (Level 3 West)
Session Chair: Michael P. Browne, Vision Products LLC (United States)
Session 5 will run concurrently with Sessions 1 and 9
12913-20
Author(s): Yicheng Zhan, Univ. College London (United Kingdom); Koray Kavaklı, Hakan Ürey, Koç Univ. (Turkey); Qi Sun, New York Univ. (United States); Kaan Akşit, Univ. College London (United Kingdom)
On demand | Presented live 29 January 2024
Multi-color holograms rely on simultaneous illumination from multiple light sources for reconstructing brighter 3D images than conventional single-color holograms. Our work introduces AutoColor, the first learned method for estimating the optimal light source powers required for illuminating multi-color holograms. For this purpose, we establish the first multi-color hologram dataset using synthetic images and their depth information. We generate these synthetic images using a trending pipeline combining generative, large language, and monocular depth estimation models. Finally, we train our learned model using our dataset and experimentally demonstrate that AutoColor significantly decreases the number of steps required to optimize multi-color holograms from >1000 to 70 iteration steps without compromising image quality.
PC12913-21
CANCELED: ChromaCorrect: prescription correction in virtual reality headsets through perceptual guidance
Author(s): Ahmet H. Güzel, Jeanne Beyazian, Univ. College London (United Kingdom); Praneeth Chakravarthula, Princeton Univ. (United States); Kaan Aksit, Univ. College London (United Kingdom)
29 January 2024 • 8:40 AM - 9:00 AM PST | Moscone Center, Room 3008 (Level 3 West)
Globally, many individuals use prescription glasses to correct visual impairments. Using prescription glasses with augmented and virtual reality (AR/VR) headsets can cause discomfort and inconvenience due to their bulk. Our study explores replacing the optical complexity of physical lenses with prescription-correcting algorithms. The key insight of our method lies in deriving a perceptually accurate model that factors in display parameters, color, visual acuity, and user-specific refractive errors. Leveraging this perceptually accurate model, we use gradient-descent algorithms to optimize the displayed images. In this way, our method improves on prior software prescription correction without requiring prescription glasses. We tested our method on various displays, including VR headsets, and verified image quality and contrast enhancements for simulated vision-impaired users.
12913-22
Author(s): Prashanthan Ganeswaran, Rob Stevens, Oxford Optical Labs. (United Kingdom)
On demand | Presented live 29 January 2024
Vergence accommodation conflict (VAC) is the mismatch between the convergence of the eyes and the focus of the crystalline lens within the eye (accommodation), which is experienced in the usual VR/AR device configuration. Focus-tuneable lenses can be used to move the focal plane of the display to match the plane of accommodation. Adjustable-power lens designs, including Alvarez lens pairs, have been proposed for this. This presentation will assess the approaches, outline their best-suited applications, and detail the key requirements of an adjustable lens system for integration into VR and AR headsets.
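As a back-of-envelope illustration of the focal adjustment such a tunable lens must supply, the following sketch uses generic diopter arithmetic with assumed distances; it is illustrative only, not the presenters' design:

```python
def focus_power_change(nominal_m, target_m):
    """Diopter change a tunable lens must supply to move the display's
    virtual image from the headset's fixed focal plane (nominal_m, in
    meters) to the distance the eyes are converging on (target_m).

    Optical power in diopters is the reciprocal of distance in meters,
    so the required change is simply the difference of the two powers.
    """
    return 1.0 / target_m - 1.0 / nominal_m

# Fixed focal plane at 2 m (0.5 D); user converges on an object at 0.5 m (2 D):
print(focus_power_change(2.0, 0.5))  # the lens must add 1.5 D
```

The small magnitude of this range (a few diopters) is why compact adjustable elements such as Alvarez pairs are plausible candidates for headset integration.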
PC12913-23
Author(s): Xuan Wang, HsienHui Cheng, Li-Min Chang, Lu Lu, Meta (United States)
29 January 2024 • 9:20 AM - 9:40 AM PST | Moscone Center, Room 3008 (Level 3 West)
Liquid crystal photonic technology has been widely explored in VR systems given its unique polarization response. In this paper, we propose a new time-multiplexed structure for a foveated VR display using liquid crystal components. Unlike most other foveated display methods, in which several displays are needed, the proposed architecture uses a single display panel, and the viewing optics can be switched between a high-optical-power path that provides a wide FOV and a low-optical-power path that achieves high resolution, via a switchable half-wave plate (sHWP).
PC12913-24
Author(s): Nick Heijne, Cambridge Mechatronics Ltd. (United Kingdom)
29 January 2024 • 9:40 AM - 10:00 AM PST | Moscone Center, Room 3008 (Level 3 West)
The vergence-accommodation conflict (VAC) presents a well-documented and significant challenge in XR, causing visual discomfort and fatigue. It has been shown that dynamically changing the focus distance of the display enables the eye’s vergence and accommodation to agree, with significant user benefits. Bringing focus adjustment to a product has numerous challenges, including power consumption, space constraints, and added mass. We will present how shape memory alloy (SMA) can be utilised to enable dynamic focal adjustment, and how CML’s SMA actuators are particularly well-suited to achieving this with low power and low added mass, in a manner suited to high-volume manufacture.
Break
Coffee Break 10:00 AM - 10:20 AM
Session 6: Eye, Hand, and World Tracking
29 January 2024 • 10:20 AM - 12:20 PM PST | Moscone Center, Room 3008 (Level 3 West)
Session Chair: Naamah Argaman, Meta (United States)
Session 6 will run concurrently with Sessions 2, 10, and 11
12913-25
Author(s): Johannes Meyer, Michael Mühlbauer, Robert Bosch GmbH (Germany)
On demand | Presented live 29 January 2024
High-speed eye-tracking is a key requirement for upcoming XR devices, as it enables novel applications such as display enhancement through gaze-contingent rendering or saccadic endpoint prediction. The update rate of state-of-the-art mobile video oculography sensors is limited by system-level power consumption, which is caused by the exponentially increasing computational power requirements for pupil detection w.r.t. update rate. To overcome this limitation and enable unconstrained high-speed eye-tracking with low power consumption, we propose to fuse eye movement velocity data acquired by laser feedback interferometry sensors at an outstanding sampling rate of 1 kHz with camera images acquired at lower sampling rates.
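The camera-plus-interferometry fusion described above can be illustrated with a toy 1-D dead-reckoning sketch: integrate high-rate velocity samples between low-rate absolute camera fixes, resetting drift at each fix. The interface, rates, and 1-D gaze coordinate are all assumptions for illustration, not the authors' implementation:

```python
def fuse_gaze(cam_positions, cam_rate_hz, velocities, vel_rate_hz):
    """Combine low-rate absolute gaze fixes with high-rate velocity samples.

    Between camera fixes the gaze position is dead-reckoned by integrating
    the velocity stream; each new camera fix resets accumulated drift.
    Returns positions at the velocity sampling rate.
    """
    ratio = int(vel_rate_hz // cam_rate_hz)  # velocity samples per camera fix
    dt = 1.0 / vel_rate_hz
    fused = []
    for i, fix in enumerate(cam_positions):
        pos = fix                            # drift reset at each camera fix
        fused.append(pos)
        # integrate the velocity samples that fall before the next fix
        for v in velocities[i * ratio:(i + 1) * ratio - 1]:
            pos += v * dt
            fused.append(pos)
    return fused

# 100 Hz camera, 1 kHz velocity sensor, constant 2 deg/s gaze drift:
out = fuse_gaze([0.0, 0.02], 100, [2.0] * 20, 1000)
print(len(out))  # one fused sample per velocity sample
```

The point of the design is that the expensive per-frame pupil detection runs only at the camera rate, while the cheap velocity integration supplies the 1 kHz updates.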
PC12913-26
Author(s): Yuchen Ma, Yunhui Gao, Jiachen Wu, Liangcai Cao, Zhengzhong Huang, Tsinghua Univ. (China)
29 January 2024 • 10:40 AM - 11:00 AM PST | Moscone Center, Room 3008 (Level 3 West)
As the basis of virtual content creation, cameras are integral to augmented reality (AR) applications. However, the opaque appearance of a camera can prevent it from being integrated into a transparent AR display. Here we introduce an integrated, compact, and flexible see-through camera, which enables crucial functionalities such as eye gaze tracking and eye-position perspective photography, enhancing the immersive experience and interaction possibilities.
PC12913-27
Author(s): Pawel E. Malinowski, Vladimir Pejovic, imec (Belgium); Abu Bakar Siddik, Mahmoud Hamamou, imec (Belgium), KU Leuven (Belgium); Quazi Mohd Arman Uz Zaman, Itai Lieberman, Joo Hyoung Kim, Myung jin Lim, Luis Moreno Hagelsieb, Isabel Pintor Monroy, Wenya Song, Nikolas Papadopoulos, Zohreh Zahiri, Jonas Bentell, Steven Thijs, Naresh Chandrasekaran, Gauri Karve, imec (Belgium); Paul Heremans, imec (Belgium), KU Leuven (Belgium); Jiwon Lee, Hanyang Univ. (Korea, Republic of); David Cheyns, imec (Belgium)
29 January 2024 • 11:00 AM - 11:20 AM PST | Moscone Center, Room 3008 (Level 3 West)
In this paper, we present thin-film photodetector (TFPD) image sensors for the short-wave infrared (SWIR) range. Monolithic integration of quantum dot (QD) absorbers enables quantum efficiency of 70% at 1400 nm and pixel pitches below 2 μm. We present image sensors on custom CMOS readouts fabricated in a 130 nm node. We review the latest advancements in the photodiode stack and the pixel engine, including the thin-film pinned-photodiode architecture. Furthermore, we study manufacturing flows to realize full-wafer capability for volume processing. QD image sensors are paving the way to adding augmented vision with extra functionalities to future XR systems.
12913-28
Author(s): Chan-Yuan Huang, Chen-Han Lin, Homer H. Chen, National Taiwan Univ. (Taiwan)
On demand | Presented live 29 January 2024
Hand tracking algorithms relying on a single camera as the sensing device can only provide relative depth information, resulting in limited practicality. This limitation underscores the necessity for effective and accurate estimation of the absolute distances between hand joints and the camera in the real world. We respond to this pressing need by introducing a methodology that exploits the autofocus functionality of a camera for hand tracking. It takes advantage of the unutilized potential of a camera and removes the need for additional power-demanding and costly depth sensors to accurately estimate the absolute distances of hand joints. Our methodology undergoes rigorous experimental validation and consistently outperforms traditional methods across different lens positions.
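The core geometric idea behind autofocus-based absolute depth can be sketched with the thin-lens equation (a simplification of the paper's methodology; the function name and the example focal length below are illustrative assumptions):

```python
def object_distance_mm(f_mm: float, image_dist_mm: float) -> float:
    """Thin-lens equation, 1/f = 1/d_o + 1/d_i, solved for the object
    distance d_o given the in-focus lens-to-sensor distance d_i.
    An autofocus module reports the lens position at best focus, from
    which d_i, and hence the absolute depth d_o, follows."""
    return f_mm * image_dist_mm / (image_dist_mm - f_mm)
```

For a 4 mm lens focused with the sensor 4.04 mm behind it, the object sits roughly 0.4 m away; small lens travel thus encodes large depth changes, which is why validation across many lens positions matters.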
12913-29
Author(s): Sehoon Tak, Keunhee Cho, Jae-Sang Hyun, Yonsei Univ. (Korea, Republic of)
On demand | Presented live 29 January 2024
Estimating camera pose is one of the main obstacles to implementing simultaneous localization and mapping (SLAM) technology. In this paper, we propose a new phase-based SLAM technique for a single-camera, single-projector digital fringe projection (DFP) system. The method incorporates a second, fixed global projector that projects fringe images and Gray codes across a larger field of view. Using conventional algorithms, the camera's position relative to the global projector is accurately calculated, compensating for the error-accumulation issues of traditional visual odometry (VO) methods. This eliminates the need for additional hardware or computationally expensive self-correction algorithms. Experiments show significant improvement in the accuracy of camera pose estimation while minimizing downtime. The proposed technique has the potential to replace VO as the primary method for camera pose estimation, allowing seamless stitching of multiple images to create a precise 3D map.
PC12913-128
Author(s): Ori Weitz, IMMERSIX (Israel)
29 January 2024 • 12:00 PM - 12:20 PM PST | Moscone Center, Room 3008 (Level 3 West)
Eye-tracking offers vast potential for enhancing the XR experience. Despite the maturity of the technology, challenges persist in achieving the desired level of accuracy, robustness, and population coverage. The challenges arise from the variability of external features of the eye and corneal reflections as well as the many intermediates between them and the fovea. Retinal eye-tracking is a fundamentally different approach, not bound by these constraints. We developed a compact, power-efficient retinal eye-tracking system designed for XR HMDs, circumventing classic limitations and unlocking new possibilities for immersive experiences.
Break
Lunch Break 12:20 PM - 1:30 PM
Session 7: Next Gen Micro Displays I
29 January 2024 • 1:30 PM - 2:50 PM PST | Moscone Center, Room 3008 (Level 3 West)
Session Chair: Naamah Argaman, Meta (United States)
Session 7 will run concurrently with Sessions 3 and 12
12913-30
Author(s): Philipp Wartenberg, Andreas Fritscher, Bernd Richter, Steffen Damnik, Florian Schuster, Dirk Schlebusch, Martin Rolle, Stephan Brenner, Johannes Zeltner, Uwe Vogel, Fraunhofer-Institut für Organische Elektronik, Elektronenstrahl- und Plasmatechnik FEP (Germany)
On demand | Presented live 29 January 2024
It is well known that digital logic integrated circuits scale strongly with process-node shrinks, enabling the integration of more functionality. Until now, however, this has not held for OLED microdisplays, which for several reasons have remained at 90 nm to 250 nm process nodes. This paper presents a new approach to a microdisplay architecture with extended functionality, along with a first implementation in 28 nm featuring very small 2.5 µm pixels, a flexible interface architecture, frame rates up to 10 kHz, and entirely new driving schemes to meet the power and form-factor requirements of upcoming near-to-eye systems in AR/VR/MR applications.
12913-31
Author(s): Aravind Lakshminarayanan, Jesse Richuso, Texas Instruments Inc. (United States)
On demand | Presented live 29 January 2024
AR glasses display systems need to be smaller, lower power, and higher performance to enable longer battery life, small lightweight products, and compelling user experiences. The next generation of DLP Pico chipsets will be optimized for low power and small size while improving upon DLP technology's industry-leading performance. They will also include support for higher resolutions and video frame rates, reduced display latency, dynamic display dimming, and panel self-refresh. This paper introduces the next generation of Texas Instruments DLP Pico technology and showcases how it is well positioned to enable low-power AR displays.
12913-32
Author(s): Nikolay Primerov, Callan Jobson, José Rios, Stefan Gloor, Nicolai Matuschek, Tim von Niederhäusern, Marco Rossetti, Antonino Castiglia, Marco Malinverni, Marcus Duelk, Christian Vélez, EXALOS AG (Switzerland)
On demand | Presented live 29 January 2024
We demonstrate a new generation of miniaturized, full-color RGB light-source modules with collimated beam output for near-to-eye display systems, incorporating semiconductor laser diodes (LDs) for blue and green emission and a broadband superluminescent diode (SLED) for red emission. This new hybrid SLED-LD RGB light-source module supports output power levels ranging from a few milliwatts up to 50 mW per color, combined with collimated beams of high circularization (better than 80%) and large 1/e2 beam diameters around 1.1 mm, supporting high efficiencies and high resolution for waveguide-based AR glasses.
PC12913-33
Author(s): Tongtong Zhu, Kunal Kashyap, Poro Technologies Ltd. (United Kingdom)
29 January 2024 • 2:30 PM - 2:50 PM PST | Moscone Center, Room 3008 (Level 3 West)
Porotech has developed a single pixel that can create a full range of colours simply by changing how the pixel is driven, a technology called DynamicPixelTuning® (DPT®). This technology can be seamlessly incorporated into an already established market. Porotech has developed several products including full-colour projectors, microdisplay panels, AR glasses, and epitaxy wafers. The company's panels offer higher resolutions because they are sub-pixel-free, and a longer lifespan due to the use of inorganic materials. Porotech's projectors and AR glasses are compatible with currently available optical systems and waveguides, their tuneable wavelength gives them high adaptability to the current market, and the full-colour projector is not limited by the type of waveguide used. Compared with the three monochrome panels traditionally required, which impede smooth mass manufacturing, only a single panel is needed, simplifying manufacturing and enabling smaller projectors.
Break
Coffee Break 2:50 PM - 3:10 PM
Session 8: Next Gen Micro Displays II
29 January 2024 • 3:10 PM - 4:10 PM PST | Moscone Center, Room 3008 (Level 3 West)
Session Chair: Christophe Peroz, Google (Switzerland)
Session 8 will run concurrently with Sessions 4 and 13
12913-34
Author(s): Matan Naftali, Gaddi Yearim, Ran Gabai, Meni Yehiel, Gil Cahana, Maradin Ltd. (Israel)
On demand | Presented live 29 January 2024
AR see-through applications challenge the display design to be bright enough for outdoor use on the one hand and to draw minimal power for prolonged usage on the other. Laser beam scanning (LBS) displays are well suited to the task, exploiting the brightness of lasers. However, lasers are known to be the "power-hungry" components of such displays. Maradin presents a novel laser modulation scheme that reduces laser power consumption for see-through LBS displays, saving up to 70% of laser power and enabling high-resolution use of AR displays in outdoor, all-day applications.
12913-36
Author(s): Douwe H. Geuzebroek, Brilliance B.V. (Netherlands); Raimond Frentrop, Ronald Dekker, Edwin J. Klein, Floris H. Falke, LioniX International BV (Netherlands); Anne Leenstra, PHIX Photonics Assembly (Netherlands)
On demand | Presented live 29 January 2024
A laser engine based on a silicon-nitride PIC is demonstrated. It uses a flip-chip assembly process with passive alignment for hybrid integration of the red, green, and blue laser diodes into etched recesses in the silicon-nitride PIC. The TriPleX® waveguide technology transforms the ellipticity of the modes from 1:4 at the laser side to almost circular (1:1) at the output side. Output powers of >1 mW are measured, depending on the laser diodes used. The described assembly approach facilitates wafer-level laser diode integration and wafer-level hermetic packaging for scalable volume manufacturing.
12913-37
Author(s): Marco Rossetti, Antonino Castiglia, Marco Malinverni, Adin Ferhatovic, Denis Martin, Marcus Duelk, Christian Vélez, EXALOS AG (Switzerland)
On demand | Presented live 29 January 2024
Laser beam scanners are of strong interest, as they offer a compact, low-power solution for head-worn AR displays. In this framework, EXALOS has been leading the research and development of active light sources, focusing on optimizing red, green, and blue edge-emitting devices, including superluminescent diodes (SLEDs) and LDs. This work provides an overview of the performance achieved by single-emitter SLEDs and low-threshold laser diodes based on GaN as well as AlGaInP III-V semiconductors. Furthermore, we report on the development of arrays: in particular, narrow-pitch devices with an emitter-to-emitter spacing smaller than 15 µm, and a novel device design featuring both anodes and cathodes on the top chip surface. The individual cathodes are fully independent and electrically insulated, ensuring an emitter-to-emitter resistance greater than 1 MOhm. Compared to conventional arrays with a common cathode, the new architecture allows integration with industry-standard current-sink drivers for efficient multi-LD modulation.
Session 9: Fabrication and process for XR technology
29 January 2024 • 8:00 AM - 11:10 AM PST | Moscone Center, Room 3006 (Level 3 West)
Session Chair: Daniel K. Nikolov, Univ. of Rochester (United States)
Session 9 will run concurrently with Sessions 1 and 5
12913-38
Author(s): John C. Robinson, Alessandro Vaglio Pret, KLA Corp. (United States)
On demand | Presented live 29 January 2024
Anticipating high-volume manufacturing (HVM) of waveguide combiners, such as surface relief gratings (SRGs), we take what has been learned from technology development and present the metrology, inspection, and process-control challenges that must be overcome to achieve high yields in high-volume production. Starting with incoming transparent substrates, defect inspection and wafer shape/flatness are critical to establish a quality starting point and to minimize haze and pupil swim. Patterned substrates can include a variety of grating patterns, and their fidelity impacts key metrics including MTF and contrast. Profilometry and scatterometry critical dimension (SCD) provide profile information and high spatial coverage for many critical parameters such as grating height, width, residual layer thickness (RLT), and sidewall angle (SWA). Patterned waveguides also require defect inspection at high throughput to identify excursions and to screen eyepieces. Material properties including film thickness, n&k, and elasticity are also critical, with unique challenges for nano-imprint lithography (NIL). Diced eyepiece components in carriers require inspection for cracks, chips, and edge-treatment integrity.
PC12913-39
Author(s): James J. Watkins, Dae Eon Jung, Vincent Einck, Lucas Verrastro, Amir Arbabi, Univ. of Massachusetts Amherst (United States)
29 January 2024 • 8:20 AM - 8:40 AM PST | Moscone Center, Room 3006 (Level 3 West)
We fabricate all-inorganic, high-refractive-index optics, including metalenses, waveguides, and diffractive optical elements, via nanoimprint lithography with TiO2 nanoparticle dispersion inks, and report full-wafer, high-throughput fabrication of waveguides and visible-wavelength metalenses with absolute efficiencies greater than 75% (>90% of design efficiency). We employ atomic layer deposition (ALD) as a post-imprint treatment that enables tuning of the refractive index from 1.9 to 2.25 using fewer than 20 cycles, which improves lens efficiency. Tuning the refractive index of the imprinted optics to match that of the substrate removes concerns about residual layer thickness, resolving a critical issue for some applications. Additional ALD cycles enable precise tuning of feature dimensions and spacings. Finally, we demonstrate the excellent optical and material stability of the all-inorganic imprinted optics.
PC12913-40
Author(s): Brian Bilenberg, Tobias Hedegaard Bro, Ankit Bisht, Srdjan Acimovic, Vladimir Miljkovic, NIL Technology ApS (Denmark)
29 January 2024 • 8:40 AM - 9:00 AM PST | Moscone Center, Room 3006 (Level 3 West)
Blazed gratings are widely used in surface relief grating (SRG) waveguides for augmented reality displays. To increase the efficiency and design freedom of blazed gratings, control of the anti-blaze angle has gained attention lately. We will demonstrate mastering processes that realize blazed gratings with positive, vertical, and negative anti-blaze angles on masters for the replication of SRG waveguides.
12913-41
Author(s): Friedrich-Karl Bruder, Johannes Frank, Sven Hansen, Mira Holzheimer, Alexander Lorenz, Christel Manecke, Covestro Deutschland AG (Germany); Jack Mills, Covestro LLC (United States); Lena Nault, Igor Pochorovski, Thomas Rölle, Brita Wewer, Covestro Deutschland AG (Germany)
On demand | Presented live 29 January 2024
Bayfol® HX photopolymer films have proven themselves as easy-to-process, full-color recording materials for volume holographic optical elements (vHOEs) and are available at industrial scale. Bayfol® HX is compatible with plastic-processing techniques like thermoforming, film insert molding, and casting. See-through applications such as HMDs and HUDs place demanding performance requirements on combiner and imaging technologies, including efficiency, optical function, and clarity. The properties of Bayfol® HX make it well suited to solving these challenges in primary display and near-infrared imaging applications such as eye-tracking, while maintaining the requirements on optical performance. Using our novel free-space Bragg matching technique introduced two years ago, we demonstrate the recording of a complete set of grating couplers to achieve 2-D eyebox expansion in a representative waveguide-based near-eye display. In addition, using our novel NIR-sensitized Bayfol® HX grade, we demonstrate a free-space holographic grating combiner working at 940 nm but exposed at 850 nm.
PC12913-42
Author(s): Michael Weinstein, Samarth Bhargava, Applied Materials, Inc. (United States)
29 January 2024 • 9:20 AM - 9:40 AM PST | Moscone Center, Room 3006 (Level 3 West)
To enable all-day, outdoor-compatible smart glasses that wow consumers, waveguide displays cannot be limited by the traditional performance trade-offs of the low-index waveguide design space. Waveguide displays must simultaneously achieve great cosmetics, no distracting rainbow artifacts, high efficiencies of >1000 nits/lm, and excellent image quality. The Photonics Platforms Business from Applied Materials will present advances in design and materials co-optimization that enable a step-function improvement in display performance, leveraging the already proven and scalable 300 mm wafer fabrication equipment platform.
12913-43
Author(s): Taigo Akasaki, Risa Tanaka, Takeshi Osaki, Toyo Gosei Co., Ltd. (Japan)
On demand | Presented live 29 January 2024
Nanoimprint technology is one option for producing surface relief grating (SRG) waveguides today. The mold used in this process is known as a master mold, and master molds are too expensive to be used directly as production stamps. Working stamps made of resin, copies of the master mold, are used repeatedly to prevent damage to the master molds and to reduce process costs. We evaluated the durability of working stamps under repeated use, which determines the productivity of SRG waveguide fabrication, and examined the influence of the stamp material on that durability. The results showed no defects or significant pattern deformation for any of the materials after more than 50 imprints with a single working stamp. This suggests that working stamps can improve the productivity of SRG waveguides.
Break
Coffee Break 10:00 AM - 10:30 AM
PC12913-126
Author(s): Peter Abbamonte, Univ. of Illinois (United States); Jonathan Manton, Subhalakshmi Kumar, Samuel Gleason, Cody Jensen, Nick Toombs, Inprentus, Inc. (United States)
29 January 2024 • 10:30 AM - 10:50 AM PST | Moscone Center, Room 3006 (Level 3 West)
The oldest and most proven technique for fabricating diffraction gratings is mechanical ruling, which has been used for mass-market manufacturing since the early 1960s. In this talk I will give the latest update on Inprentus' use of contact-mode lithography for fabricating aperiodic grating-coupled waveguides for AR. I will show data demonstrating 20-picometer pitch uniformity, low stray-light levels, aperiodic modulation results, and successful replication in a high-index (n=1.9) resin using commercially available nanoimprint lithography.
12913-52
Author(s): Carlos Pina-Hernandez, Kaito Yamada, Adam Legacy, Keiko Munechika, HighRI Optics, Inc. (United States)
On demand | Presented live 29 January 2024
For mass fabrication of light-enabled devices, high-refractive-index optical polymers offer attractive advantages not only in optical performance and manufacturability but also enable new applications that are not possible with existing optical polymers. The availability of NIL-compatible high-refractive-index materials is critical to supporting the growing photonics industry. HighRI Optics reports on the advancement of UV-curable 1.8-RI NIL materials without metal oxide fillers, the highest refractive index amongst filler-free, NIL-compatible materials. NIL performance using high-volume manufacturing tools, NIL repeatability, near-zero residual layer thickness (RLT), and environmental reliability validations are presented.
Session 11: Image Quality and Perception
29 January 2024 • 11:10 AM - 12:30 PM PST | Moscone Center, Room 3006 (Level 3 West)
Session Chair: Jannick P. Rolland, Univ. of Rochester (United States)
Session 11 will run concurrently with Sessions 2 and 6
12913-53
Author(s): Yong Fang, William J. Cassarly, Synopsys, Inc. (United States)
On demand | Presented live 29 January 2024
Non-sequential Monte Carlo ray tracing is used to simulate light propagation through complex optical systems. When a ray is traced, it splits at surfaces and generates multiple outgoing ray segments. A large power threshold or a probabilistic split approach is used to limit the number of segments. These approaches reduce accuracy by either losing low-power segments or reducing the number of segments in low-power paths. In this paper we examine the use of sequential ray tracing to generate stray light distributions. Sequences that define how rays are traced and what types of rays are propagated are discussed. Automatic sequence generation to set up the system for stray light analysis is explored. Multiple sequences are traced simultaneously without limiting the power threshold or using probabilistic splitting. In typical cases, we see greater than 10X speed improvements with no loss in accuracy. Examples illustrating typical use cases will be shown.
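The two segment-limiting strategies mentioned in the abstract can be sketched as follows (an illustrative toy model, not the Synopsys implementation): a power threshold silently drops low-power children, while a probabilistic split follows a single child chosen in proportion to its power, keeping the estimate unbiased at the cost of higher variance in low-power paths.

```python
import random

def split_threshold(power, r, threshold=1e-3):
    """Deterministic split at a surface with reflectance r:
    keep both children, but drop any whose power falls below the threshold
    (this is where low-power stray-light paths get lost)."""
    children = [("reflect", power * r), ("transmit", power * (1 - r))]
    return [(kind, p) for kind, p in children if p >= threshold]

def split_probabilistic(power, r, rng=random.random):
    """Probabilistic split: follow exactly one child, chosen with probability
    equal to its power fraction; the survivor carries the full parent power
    so the Monte Carlo estimate stays unbiased."""
    if rng() < r:
        return [("reflect", power)]
    return [("transmit", power)]
```

Either way the ray tree stays bounded, which is exactly the accuracy-versus-cost trade-off the paper's sequential approach sidesteps.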
12913-54
Author(s): Jeremy Goodsell, Jannick Rolland, Daniel K. Nikolov, Nick Vamivakas, The Institute of Optics, Univ. of Rochester (United States)
On demand | Presented live 29 January 2024
Waveguide displays are a popular architecture for optical see-through augmented reality devices because they provide a large eyebox while maintaining a compact form factor. However, they typically suffer from drawbacks including low optical efficiency and limited image quality, especially toward the edges of the field of view. Moreover, efficiency and image quality often work against each other in waveguide displays, requiring tradeoffs in the design process. In this presentation, we investigate how these parameters are impacted by the waveguide geometry and present tools that help waveguide designers visualize the tradeoffs and assess the optimization.
12913-55
Author(s): Amir Sharghi, Wladi Beloglazov, Tobias Schneider, Günther Leschhorn, Thomas Limmer, Instrument Systems GmbH (Germany)
On demand | Presented live 29 January 2024
VR devices widely use multiple near-infrared LEDs for eye tracking. However, NIR illumination is potentially harmful to the human eye and skin. The international photobiological safety standard IEC 62471 provides guidelines for evaluating the photobiological hazards of incoherent broadband light sources. Following this standard, a measurement setup was designed and the classification scheme was used to classify a VR module. The number of LEDs, their geometric orientation, optical power, exposure duration, and proximity to the eye are among the important parameters that can increase damage to the cornea, lens, and retina. Our safety assessment was performed on a VR module and opens up the possibility of extending it to other eye-tracking modules used in AR, VR, and MR.
12913-130
Author(s): Nataliya Kosmyna, MIT Media Lab. (United States); Eugene Hauptmann, Reactive Lions Inc. (United States)
On demand | Presented live 29 January 2024
Very few research works featuring AR and brain-computer interface (BCI) systems have considered integrating the cognitive state of the user into their applications. Even fewer projects have explored wearable, everyday solutions that help users in their daily tasks beyond the lab setup. In this project, we propose a wearable, wireless pair of glasses with a monocle-like, heads-up, single-lens display that incorporates information about the wearer's current attentional state by adapting their environment accordingly. We first introduce our solution for AR-BCI integration. An application was designed that adapted to the user's state of attention, measured via electroencephalography (EEG) and electrooculography (EOG); the system responded only if the attentional orientation was classified as "internal". Fourteen users tested the attention-aware system; we show that the adaptation of the interface improved the usability of the system. We conclude that more systems would benefit from awareness of the user's ongoing attentional state, as well as from further efficient integration of AR and BCI headsets.
Break
Lunch Break 12:30 PM - 1:50 PM
Session 12: Metrology and Measurement Tools I
29 January 2024 • 1:50 PM - 2:50 PM PST | Moscone Center, Room 3006 (Level 3 West)
Session Chair: Hong Hua, Wyant College of Optical Sciences (United States)
Session 12 will run concurrently with Sessions 3 and 7
12913-44
Author(s): Thomas Kerst, Jesper Leppinen, Mikael Jokinen, OptoFidelity Oy (Finland)
On demand | Presented live 29 January 2024
AR optical waveguide metrology traditionally requires co-positioning projection and imaging optics on the same side of the waveguide, which complicates system referencing. Current mechanical solutions, which require physical relocation and rotation of the imaging optics, are expensive and error-prone, increasing the risks and costs of AR waveguide testing. Our research introduces an optical technique that eliminates such physical adjustments, leveraging existing tester hardware and targeted optical vignetting. This method provides a cost-effective, reliable, and user-friendly referencing system for AR metrology test stations, drastically reducing cost and complexity. We show that the new optical method provides referencing equal to current mechanical approaches, and we detail the expected cost reductions and improvements in system reliability and longevity. This technique has the potential to significantly simplify AR waveguide metrology, providing a more cost-effective, risk-reduced method for quality-control metrology.
12913-45
Author(s): Tobias Steinel, Sascha Reinhardt, Roland Schanz, Instrument Systems GmbH (Germany)
On demand | Presented live 29 January 2024
The production of a virtual reality headset is a complex process with several steps, and at each step various measurements are taken to verify quality or to characterize a component for later use. The visual system is the key component of a VR headset and generally consists of a display and an optical lens, which together form the optical module. In a first step, properties of the display and optical lens, such as luminance and chromaticity, are determined. In a second step, properties of the optical module (the combination of display and lens), such as distortion, are measured. In a last step, the final VR headset is characterized for calibration and quality control. Exemplary measurements are shown for each step using imaging and spot-light measurement devices.
PC12913-46
Author(s): Daniel Winters, Jan-Hinrich Eggers, TRIOPTICS GmbH (Germany)
29 January 2024 • 2:30 PM - 2:50 PM PST | Moscone Center, Room 3006 (Level 3 West)
Optical system metrology is important for ensuring the product quality of XR headsets and their components. The test technology used to qualify these components and sub-modules originates from two different fields: display metrology and optical system testing. Display-test-derived systems use wide-field optics and test low-frequency "macro"-scale parameters such as ISO contrast and color and brightness homogeneity. Because of their design choices, these systems cannot test "micro"-scale higher-frequency features such as the contrast of projected text or chromatic aberrations, and their typically non-diffraction-limited optical designs cannot report correct values for, e.g., MTF for fundamental physical reasons. Higher spatial frequencies corresponding to finer details ("micro"-scale) are the domain of optics-testing-derived technology; however, the design choices typically used there make testing at the "macro" level difficult. In this paper, we demonstrate that neither of the two technologies alone is sufficient to support the requirements of upcoming headset generations. Instead, we describe a new generation of test equipment that integrates both "micro"- and "macro"-scale test capabilities in one instrument.
Break
Coffee Break 2:50 PM - 3:10 PM
Session 13: Metrology and Measurement Tools II
29 January 2024 • 3:10 PM - 4:10 PM PST | Moscone Center, Room 3006 (Level 3 West)
Session Chair: Scott Carden, Magic Leap, Inc. (United States)
Session 13 will run concurrently with Sessions 4 and 8
12913-47
Author(s): Bobby Foote, Roque Martinez, Justin Tran, Carlos Alatorre, Mike Browne, Vision Products LLC (United States); Bruce Pixton, U.S. Army Night Vision & Electronic Sensors Directorate (United States); Rupal Varshneya, DEVCOM C5ISR (United States); Logan Williams, Air Force Materiel Command (United States); Charles Bullock, Steven Hadley, Marc Winterbottom, Air Force Materiel Command, U.S. Air Force (United States)
On demand | Presented live 29 January 2024
A new, deployable, ruggedized Helmet-Mounted Display Binocular Alignment Measurement Tool (HMD BAMT) was developed that provides alignment measurements accurate to 15 microns. The tool was developed for measurement of any HWD/HMD in use in the field. Validation was accomplished using a commercially available binocular HMD, which was provided to the U.S. Army C5ISR Center Research and Technology Integration Directorate at Fort Belvoir, VA for additional validation against the Near-Eye Display Test Station (NDTS), a larger and more expensive system designed for laboratory measurement of HWDs. Measurements showed the BAMT has very good accuracy and repeatability (i.e., milliradian accuracy). Cleared for public release (AFRL-2022-4336).
12913-48
Author(s): Thomas L. R. Davenport, William J. Cassarly, Blake Crowther, Synopsys, Inc. (United States)
On demand | Presented live 29 January 2024
When designing camera systems, one important source of image degradation is stray light. This can produce ghost images caused by unwanted reflections off internal lens surfaces, or by out-of-field light scattered by other surfaces. We demonstrate a method to characterize the glare spread function (GSF) of an optical system using ray-trace simulations. Interpolated GSFs are applied to in-field image points as well as out-of-field points and then combined with the primary imaging path. The result accurately predicts the detector illuminance.
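Conceptually, applying a GSF amounts to spreading each image point's power over the detector and adding the result to the primary image. Below is a minimal sketch of that idea (the uniform kernel and array handling are invented for illustration; the paper interpolates field-dependent GSFs from ray-trace data):

```python
import numpy as np

def apply_gsf(image, gsf):
    """Spread each pixel's power with the GSF kernel ('same'-size sliding
    window; identical to convolution for the symmetric kernels typical of
    glare) and add the resulting veiling glare to the primary image."""
    ih, iw = image.shape
    kh, kw = gsf.shape
    padded = np.pad(image, ((kh // 2, kh // 2), (kw // 2, kw // 2)))
    glare = np.zeros_like(image, dtype=float)
    for dy in range(kh):
        for dx in range(kw):
            glare += gsf[dy, dx] * padded[dy:dy + ih, dx:dx + iw]
    return image + glare
```

A point source then acquires a low-level halo whose total power equals the source power times the kernel sum, which is the detector-illuminance prediction the abstract describes.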
12913-127
Author(s): William J. Hall, Opto-Alignment Technology, Inc. (United States)
On demand | Presented live 29 January 2024
Multi-layer bonded structures play an important role in AR/VR systems. Parallelism between layers and the consistency of thin gaps is critical for the function of these devices. Opto-Alignment has developed a system that provides rapid, non-contact measurement of these types of structures. In this presentation, we will share measurements of components from a commercially available AR headset. Thickness maps of the different layers as well as overall profile data will be presented.
Optical Design Challenge
29 January 2024 • 4:30 PM - 5:20 PM PST | Moscone Center, Room 3006 (Level 3 West)
See students apply their creativity and university optics education to challenging, tangible industry specifications for today's immersive display products.
PC12913-801
29 January 2024 • 4:30 PM - 4:40 PM PST | Moscone Center, Room 3006 (Level 3 West)
12913-702
Author(s): Yuqiang Ding, Zhenyi Luo, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States); Garimagai Borjigin, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States), Univ. of Tsukuba (Japan), Japan Society for the Promotion of Science (Japan); Shin-Tson Wu, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
On demand | Presented live 29 January 2024
Pancake lenses have been widely used in mixed reality (MR) due to their compact form factor. However, using a half mirror to fold the optical path results in tremendous optical loss. To break this optical efficiency limit while keeping a compact form factor, we present a new folded optical system incorporating a nonreciprocal polarization rotator. In a proof-of-concept experiment using a commercial Faraday rotator, the theoretically predicted 100% efficiency is validated. Meanwhile, ghost images can be suppressed to an undetectable level when the optics carry anti-reflection coatings. Our novel pancake optical system holds great potential for revolutionizing next-generation MR displays.
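The efficiency limit the abstract refers to follows from a quick back-of-envelope calculation (idealized values, not the paper's measurements): in a conventional pancake, light must transmit through the 50/50 half mirror once and reflect off it once.

```python
# Idealized throughput comparison for a half-mirror pancake vs. a lossless
# nonreciprocal-rotator design. Numbers are textbook ideals, not measured data.
t = 0.5   # half-mirror transmittance
r = 0.5   # half-mirror reflectance
conventional = t * r    # one transmission + one reflection -> at most 25%
nonreciprocal = 1.0     # ideal nonreciprocal rotator: no beam-splitting loss
improvement = nonreciprocal / conventional
```

Real systems add polarizer and coating losses on top of this, but the factor-of-four gap is the fundamental limit the nonreciprocal design removes.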
PC12913-703
Author(s): Myeong-Ho Choi, Inha Univ. (Korea, Republic of)
29 January 2024 • 4:50 PM - 4:55 PM PST | Moscone Center, Room 3006 (Level 3 West)
PC12913-704
Author(s): Wan-Pin Tsai, National Taiwan Univ. of Science and Technology (Taiwan)
29 January 2024 • 4:55 PM - 5:00 PM PST | Moscone Center, Room 3006 (Level 3 West)
PC12913-705
Author(s): Zhenyi Luo, CREOL, The College of Optics and Photonics, Univ. of Central Florida (United States)
29 January 2024 • 5:00 PM - 5:05 PM PST | Moscone Center, Room 3006 (Level 3 West)
PC12913-706
Author(s): Tianyao Zhang, Wyant College of Optical Sciences (United States)
29 January 2024 • 5:05 PM - 5:10 PM PST | Moscone Center, Room 3006 (Level 3 West)
PC12913-802
29 January 2024 • 5:10 PM - 5:20 PM PST | Moscone Center, Room 3006 (Level 3 West)
Posters
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
Conference attendees are invited to attend the AR | VR | MR poster session on Monday evening. Come view the posters, enjoy light refreshments, ask questions, and network with colleagues in your field. Authors of poster papers will be present to answer questions concerning their papers. Attendees are required to wear their conference registration badges to the poster sessions.

Poster setup: Monday 10:00 AM - 4:30 PM
Poster authors: View poster presentation guidelines and set-up instructions.
12913-56
Author(s): Yuki Kawashima, Yuji Kamei, Topcon Technohouse Corp. (Japan)
On demand | Presented live 29 January 2024
We perform 2D spectrometry measurements on various AR/VR display devices using the SR-5100HM 2D spectroradiometer with a C-mount lens. The objective lens attached to the SR-5100HM is chosen to match the viewing angle of each device: AR devices use a lens with a 40° angle of view, and VR devices use a lens with an 80° angle of view. From the measurements of each device, the luminance uniformity, chromaticity uniformity, contrast ratio, and other quantities defined by IEC 63145-20-10 are calculated and compared.
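As a rough sketch of the kind of metrics computed: IEC 63145-20-10 prescribes the exact measurement points and procedures, so the min/max uniformity definition and the readings below are illustrative assumptions, not the standard's text.

```python
def uniformity(values):
    """Display uniformity as min/max in percent -- a common multi-point
    definition; the applicable standard fixes the sampling points."""
    return 100.0 * min(values) / max(values)

def contrast_ratio(white_luminance, black_luminance):
    """Full-on/full-off contrast ratio."""
    return white_luminance / black_luminance

# Illustrative 5-point luminance readings in cd/m^2 (assumed values).
readings = [480.0, 500.0, 495.0, 470.0, 490.0]
u = uniformity(readings)
cr = contrast_ratio(500.0, 0.5)
```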
12913-57
Author(s): Yoshifumi Sudoh, Takemasa Tsutsui, Masahiro Itoh, Ricoh Co., Ltd. (Japan)
On demand | Presented live 29 January 2024
We have developed augmented reality (AR) glasses that can be used outdoors for long periods of time. To enable this outdoor usage, both high light utilization efficiency and high transmittance are required. Polarizing beam splitter coatings with characteristics that correspond to the visible wavelength range are used to achieve 75% transmittance and 2% light utilization efficiency. In addition, it is necessary to reduce the burden on the wearer’s nose to allow them to use the device for extended periods of time. The combiner and the lenses are made from plastic, and an anamorphic optical system is used in which the position of the pupil of the incident optical system differs in two cross-sections oriented perpendicular to the optical axis. As a result, the weight of the entire optical system is only 20 g for both eyes, despite the full-color imaging capability.
12913-59
Author(s): Michael Cheng, Ansys Japan K.K. (Japan); Thibault Leportier, Alexandra Christophe, Jens Niegemann, Ansys Canada Ltd. (Canada)
On demand | Presented live 29 January 2024
This paper explores the application of non-hexagonal 2D gratings in exit pupil expanders (EPEs) based on surface relief gratings on a waveguide. It explains the concept of non-orthogonal lattice vectors in rigorous coupled-wave analysis, highlighting their important role in simulating non-hexagonal 2D gratings. An EPE design is then built with non-hexagonal 2D gratings, and their impact on system performance and potential advantages are presented. Initial simulation results are discussed, and an optimization workflow that treats the lattice vector angles as variables is established, performed, and evaluated.
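The non-orthogonal lattice vectors mentioned here fix the reciprocal vectors that define the diffraction orders in a 2D RCWA simulation; a minimal sketch of that relationship, with an assumed pitch and lattice angle (not the paper's design values):

```python
import math

def reciprocal_2d(a1, a2):
    """Reciprocal lattice vectors b1, b2 for a 2D lattice with (possibly
    non-orthogonal) direct vectors a1, a2, so that a_i . b_j = 2*pi*delta_ij."""
    det = a1[0] * a2[1] - a1[1] * a2[0]
    b1 = (2 * math.pi * a2[1] / det, -2 * math.pi * a2[0] / det)
    b2 = (-2 * math.pi * a1[1] / det, 2 * math.pi * a1[0] / det)
    return b1, b2

# A sheared, non-hexagonal lattice: 70 degrees between vectors instead of the
# 60 degrees of a hexagonal grating. Pitch in micrometers (assumed).
p = 0.38
a1 = (p, 0.0)
a2 = (p * math.cos(math.radians(70)), p * math.sin(math.radians(70)))
b1, b2 = reciprocal_2d(a1, a2)
```

Every diffraction order (m, n) then carries the in-plane wavevector m*b1 + n*b2, which is why the lattice angle can be treated as an optimization variable.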
12913-60
Author(s): Paulo Lima, Sebastien Pochon, David Pearson, Sung-Jin Cho, Oxford Instruments Plasma Technology Ltd. (United Kingdom)
On demand | Presented live 29 January 2024
The roughness of the sidewalls of slanted etched diffraction gratings must be minimized to avoid compromising their optical efficiency. The difficulty in performing these roughness measurements is accessing the slanted sidewall with AFM tips on a typical etched feature with a 100 nm CD, 200 nm pitch, and steep 45° slant angle. We present a simple method to measure the gratings' sidewall roughness that requires no physical or chemical processing and no optical analysis before or after the measurement, and that runs on a standard AFM with standard AFM tips. The diffraction gratings were processed with Oxford Instruments Plasma Technology's large (30 cm) ion beam etch source. Sidewall roughness of patterned silicon and SiO2 slanted gratings was measured with an in-house conventional Asylum Research Jupiter AFM, allowing speedy sidewall roughness assessment and development. Sidewall roughness measurement of slanted diffraction gratings with a standard AFM was thus demonstrated; ion-beam-processed SiO2 grating sidewalls showed a typical roughness of 1.3 nm.
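A minimal sketch of the RMS (Rq) roughness computation typically applied to AFM line scans; the profile values below are invented for illustration and are not the paper's data.

```python
def rms_roughness(heights):
    """RMS (Rq) roughness of a height profile after removing the mean line."""
    mean = sum(heights) / len(heights)
    return (sum((h - mean) ** 2 for h in heights) / len(heights)) ** 0.5

# Illustrative AFM line-scan heights in nm (assumed values).
profile = [0.0, 1.0, -1.0, 2.0, -2.0, 0.0]
rq = rms_roughness(profile)
```

In practice a tilt/plane fit replaces the simple mean subtraction for slanted sidewalls, but the Rq definition is the same.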
12913-61
Author(s): Nien-Jung Chiang, National Taipei Univ. of Technology (Taiwan); Ting-Wei Huang, Bo-Kai Zhang, Ji-Ping Sheng, Wen-Chang Hung, ASUSTeK Computer Inc. (Taiwan); Yu-Chieh Cheng, National Taipei Univ. of Technology (Taiwan)
On demand | Presented live 29 January 2024
Current augmented reality (AR) waveguide display technology faces a challenging trade-off between essential factors such as eyebox, efficiency, and image uniformity. Surface relief gratings have been commonly used as exit pupil expanders (EPEs) in waveguides, but they exhibit limitations in efficiency and uniformity. To overcome these shortcomings, polarized volume gratings (PVGs) with ultra-wide angular bandwidth and high efficiency have been employed as EPEs. However, a challenge arises when attempting to achieve high uniformity within the eyebox using nematic liquid crystal to control each expansion: the polarization changes of the diffracted light produce varying diffraction efficiencies, which makes uniformity difficult to maintain across the entire field of view. This article introduces an approach that incorporates high/low cover layers on top of the PVG to control the path difference between light rays at various angles. This design modification mitigates unnecessary out-coupling and ensures a remarkably high level of uniformity across the entire AR system, providing a more efficient and uniform expanded eyebox.
12913-62
Author(s): Markus Zimmermann, Stephan Reichelt, Institut für Technische Optik, Univ. Stuttgart (Germany)
On demand | Presented live 29 January 2024
The advantages of holographic displays over stereoscopic displays are that they both provide real motion parallax and solve the vergence-accommodation conflict, without varifocal lenses or gaze tracking. For large holographic displays, pupil tracking and beam steering are required to ensure that the pupil stays within the so-called viewing window or eyebox between the diffraction maxima in the Fourier plane of the display system. We present a simulation study of the possibilities when the pupil is precisely tracked and can be considered in an iterative optimization process for hologram computation. Our study focuses on different initial phases and the resulting speckle noise.
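The abstract does not name its hologram-computation algorithm; Gerchberg–Saxton is a standard iterative phase-retrieval method in this setting, so here is a minimal 1D sketch under that assumption. The choice of initial phase (uniform here) is exactly the degree of freedom the study says it varies to examine speckle.

```python
import cmath

def dft(x):
    """Naive discrete Fourier transform (fine for a tiny demo)."""
    n = len(x)
    return [sum(x[k] * cmath.exp(-2j * cmath.pi * j * k / n) for k in range(n))
            for j in range(n)]

def idft(x):
    n = len(x)
    return [sum(x[j] * cmath.exp(2j * cmath.pi * j * k / n) for j in range(n)) / n
            for k in range(n)]

def gerchberg_saxton(target_amp, iters=20):
    """Find a phase-only hologram whose far-field amplitude approaches target_amp."""
    n = len(target_amp)
    field = [1.0 + 0j] * n   # uniform initial phase -- a free, speckle-relevant choice
    for _ in range(iters):
        far = dft(field)
        # Enforce the target amplitude in the Fourier plane, keep the phase.
        far = [t * cmath.exp(1j * cmath.phase(f)) for t, f in zip(target_amp, far)]
        near = idft(far)
        # Phase-only constraint in the hologram plane.
        field = [cmath.exp(1j * cmath.phase(v)) for v in near]
    return field

target = [1.0, 0.0, 1.0, 0.0]   # toy far-field amplitude pattern
holo = gerchberg_saxton(target)
```

A pupil-tracked system would additionally restrict the far-field constraint to the tracked eyebox region, which is what gives the optimizer extra freedom to reduce speckle.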
12913-63
Author(s): Stefan Steiner, LightTrans International GmbH (Germany); Frederik Bachhuber, SCHOTT AG (Germany); Brian Bilenberg, NIL Technology ApS (Denmark); Jan Matthijs ter Meulen, Erhan Ercan, Mariana Ballottin, Morphotonics B.V. (Netherlands); Janne Simonen, Leo Peltomaa, Murat Deveci, OptoFidelity Oy (Finland)
On demand | Presented live 29 January 2024
Amidst the mixed news surrounding the feasibility of Augmented Reality (AR) smart glasses, the demand for commercially viable mass production of industry-standard optical waveguide combiners remains unwavering. Over the past two years, our consortium of companies has proposed a cost-effective and scalable manufacturing process for Surface Relief Grating (SRG) based waveguides, offering a comprehensive path from concept to fabrication through large-area nanoimprinting. This approach has garnered significant interest from both customers and partners associated with the participating companies. Our aim is to push beyond the established limits of large-area nanoimprinting. In this work we address the obstacles and latest advancements in maintaining imprint quality, fidelity and uniformity during large-area nanoimprinting. We demonstrate various building blocks that are crucial to manufacture high quality and cost-effective AR waveguides, such as the replication of slanted gratings and the possibility of low residual layer thickness using large-area nanoimprint lithography.
12913-64
Author(s): Yunyan Wang, Runhui Huang, Sean Simmons, Megan Bennett, Gu Xu, Brewer Science, Inc. (United States)
On demand | Presented live 29 January 2024
High-refractive-index organic coatings play a crucial role in driving innovation within the optical and photonic industries. We have developed three polymer platforms that demonstrate both a high refractive index (1.6-1.8) and exceptional transparency (>98%). These coatings can be efficiently applied to organic substrates through spin coating, offering a wide processing window that encompasses high-/low-temperature thermal curing as well as UV curing methods. Our research encompassed comprehensive investigations into the optical properties, thermal stability, and photo-imageability of these coatings.
12913-65
Author(s): Yun-Hsiang Chang, National Taipei Univ. of Technology (Taiwan); Chuan-Hui Liu, Tzu-Yao Lin, Shih-Chieh Yen, POLYVISIONS (Taiwan); Yu-Wei Cheng, Yu-Chieh Cheng, National Taipei Univ. of Technology (Taiwan)
On demand | Presented live 29 January 2024
Metalenses are optical components that can replace traditional multi-element lens designs, offering aberration elimination, high numerical aperture, and wide field-of-view imaging capabilities. However, engineering the nano-antennas within a metalens, especially in wide-angle configurations at the millimeter scale, presents challenges. These nanostructures must effectively control phase and amplitude to meet the required phase for different angles of incidence, while also achieving focal spot sizes close to the diffraction limit. This study uses the Synopsys simulation tool "MetaOptic Designer" to design and optimize a metalens with a 1 mm diameter and a numerical aperture (NA) of 0.37, operating at a 940 nm wavelength. Through systematic optimization, the results demonstrate excellent imaging performance across multiple angles. At a spatial resolution of 50 cycles/mm, the modulation transfer function (MTF) contrast exceeds 0.35, indicating the design's capability to achieve high-quality imaging across various angles.
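For an aberration-free on-axis focus, metalens design commonly starts from the hyperbolic target phase profile; the sketch below uses the abstract's stated parameters, but the profile choice itself is an assumption (the paper optimizes numerically with MetaOptic Designer rather than imposing this closed form).

```python
import math

def metalens_phase(r, f, wavelength):
    """Ideal hyperbolic phase profile phi(r) = -(2*pi/lambda)*(sqrt(r^2+f^2) - f)."""
    return -2 * math.pi / wavelength * (math.sqrt(r**2 + f**2) - f)

# Parameters from the abstract: 1 mm diameter, NA = 0.37, 940 nm wavelength.
wavelength = 0.94e-6        # m
radius = 0.5e-3             # m (half of the 1 mm diameter)
na = 0.37
f = radius / math.tan(math.asin(na))   # focal length implied by the NA
phi_edge = metalens_phase(radius, f, wavelength)   # phase at the lens edge
```

Each nano-antenna is then chosen so its local phase delay (mod 2π) matches this target at its position.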
PC12913-66
Author(s): Maryam Souri, Vincent Ip, Matthias Falmbigl, Meng Lee, Mark Campo, Robert Caldwell, Veeco Instruments Inc. (United States)
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
Despite several competing technologies for augmented reality (AR) displays, surface relief gratings are among the most promising solutions. While reactive ion etching (RIE) fails to produce slanted trenches due to its lack of directional control, reactive ion beam etching (RIBE) is well suited to fabricating these gratings through a combination of physical and chemical etching and substrate tilting. Despite the advantages of RIBE over RIE, such as enhanced control of the slant angle, achieving full-wafer etching with superior uniformity under static, off-normal-incidence process conditions has remained a challenge. Veeco's latest-generation IBE source technology with multi-zone electromagnets is a proven solution for fabricating highly uniform blanket and patterned 200 mm wafers under static, off-angle conditions. Combining this technology with RIBE/IBE, we have created uniform off-angle featured wafers, such as slanted gratings with slant angles of 0-60°, a critical capability for AR applications.
PC12913-67
Author(s): Murat Deveci, OptoFidelity Inc. (United States); Thomas Kerst, Roosa Mäkitalo, Janne Simonen, Pekka Laiho, OptoFidelity Oy (Finland)
On demand | Presented live 29 January 2024
Defects in master and replica waveguide gratings can cause image quality issues on AR displays. Characterizing these gratings is difficult due to their small features, which can be smaller than the wavelength of visible light. Microscopy is unsuitable for production testing, as it lacks resolution and can be destructive. The authors propose an optical metrology setup using the Littrow configuration to measure diffraction grating pitch and orientation at picometer and arcsecond scales, accurately enough to identify defects. The authors will also demonstrate the impact of grating analysis on the image quality of diffractive waveguides.
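In the Littrow configuration the chosen diffraction order retro-reflects back along the incident beam, so the pitch follows directly from the measured angle via the grating equation m·λ = 2d·sin θ. The wavelength and angle below are illustrative assumptions, not the authors' instrument parameters.

```python
import math

def littrow_pitch(wavelength, littrow_angle_deg, order=1):
    """Grating pitch d from the Littrow condition m*lambda = 2*d*sin(theta)."""
    return order * wavelength / (2 * math.sin(math.radians(littrow_angle_deg)))

def littrow_angle(wavelength, pitch, order=1):
    """Inverse relation: Littrow angle in degrees for a given pitch."""
    return math.degrees(math.asin(order * wavelength / (2 * pitch)))

# Illustrative numbers: a 532 nm probe retro-reflected near 41.3 degrees
# corresponds to a pitch of roughly 400 nm.
d = littrow_pitch(532e-9, 41.3)
```

Because d depends on sin θ, a picometer-scale pitch deviation maps to an arcsecond-scale angle shift, which is what makes the angular measurement so sensitive.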
PC12913-69
Author(s): Marc A. Verschuuren, Rob Voorkamp, Jeroen Visser, Gert-Jan Hurxkens, SCIL Nanoimprint Solutions (Netherlands); Mohammad Ramezani, TeraNova B.V. (Netherlands)
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
Applications such as diffractive waveguides and meta-lenses place challenging demands on patterning methods and materials. Feature size and shape need to be reproducible to an absolute target with variations of less than 1-2 nm. Furthermore, the materials used need to have a high refractive index, preferably above n=2.0. In this contribution we will demonstrate the capabilities of our 150/200/300 mm NIL cluster with integrated scatterometry-based metrology for in-line inspection and quality control of the fabricated patterns. The direct replication of inorganic patterns in our NanoGlass sol-gel based materials, with an index of up to n=1.98, finds application in diffractive optical elements, high-aspect-ratio pillars, waveguide combiners composed of slanted gratings, and more. Furthermore, this method allows nano-patterns to be imprinted on both sides of the substrate, with the second print aligned directly to the imprinted patterns on the backside at an overlay accuracy below 1 µm over a 300 mm wafer.
12913-72
Author(s): Maria di Summa, Angelo Cardellicchio, Nicola Mosca, Sistemi e Tecnologie Industriali Intelligenti per il Manifattuiero Avanzato, Consiglio Nazionale delle Ricerche (Italy); Pietro Ferraro, Vittorio Bianco, Istituto di Scienze Applicate e Sistemi Intelligenti "Eduardo Caianiello", Consiglio Nazionale delle Ricerche (Italy); Ettore Stella, Sistemi e Tecnologie Industriali Intelligenti per il Manifattuiero Avanzato, Consiglio Nazionale delle Ricerche (Italy)
On demand | Presented live 29 January 2024
Maintenance in the railway sector has today reached very high safety standards. Still, despite these high standards, the sector's goal remains to apply resources and technologies toward a total absence of accidents. Our study proposes an integrated monitoring system to support the awareness of a planning operator. The system consists of three blocks that collect data from the field, process them, and identify anomalies. These data are then displayed interactively in a virtual environment that realistically reproduces the section of railway line under analysis. The planning operator can navigate the virtual environment with full awareness and plan maintenance interventions. Finally, the prepared maintenance cards are made available in augmented reality to help maintainers locate the intervention area and execute the task.
PC12913-73
Author(s): Eric Eisenberg, Erin Brown, Radiant Vision Systems, LLC (United States)
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
Roughly 65% of people globally wear prescription lenses of some kind. To enable AR/MR devices to work for these wearers, makers of smart glasses must provide customizable optics that match each user’s unique prescription. Quality testing these devices is an additional challenge for manufacturers. An effective display measurement system must compensate for the vast range of vision permutations for near-sightedness, far-sightedness, and/or astigmatism. One approach to address this optical variability relies on mechanical methods such as reverse compensation optics or adding ‘glasses’ to the metrology system to counteract the effect of the prescription. But a mechanical solution can require multiple moving parts, lenses, and foreknowledge of the user’s prescription, along with the cost and increased system size of incorporating new hardware. This presentation will review current methods for measuring prescription AR/MR devices and present a new method for prescription compensation that does not depend on mechanical devices, and instead allows compatibility with a wide range of AR/MR device form factors and sizes.
12913-74
Author(s): Doga Cagdas Demirkan, Ava D. Segal, Abhidipta Mallik, Sebnem Duzgun, Andrew Petruska, Colorado School of Mines (United States)
On demand | Presented live 29 January 2024
Despite advancements in global positioning systems (GPS) and related sensor technologies, indoor navigation remains a significant challenge. Due to the unique nature of underground spaces, search and rescue operations necessitate specific assistive technologies for mapping, positioning, and navigation. Augmented reality (AR)-assisted navigation holds significant potential for enhancing search and rescue efforts during emergency evacuations in underground spaces, by providing situational awareness when visual perception is occluded by smoke or other particles. This study aimed to assess the performance of AR-assisted navigation in an underground mine by projecting LIDAR and thermal camera information to users through the Microsoft HoloLens. Performance was evaluated by measuring the completion time to navigate into a no-light portion of the mine. The initial results showed AR assistance could provide invaluable support to first responders and significantly enhance search and rescue operations in emergency situations.
12913-75
Author(s): Yexin Pei, Gregory Nero, Tianyao Zhang, Jeff Chan, Xianyue Deng, Ted Lee, Parker Liu, Yuzuru Takashima, Wyant College of Optical Sciences (United States)
On demand | Presented live 29 January 2024
We introduce a new concept, two-dimensional multi-domain field-of-view expansion, utilizing a pulsed RGB multi-color laser source and dual DLP3000 digital micromirror devices (DMDs) for high-resolution near-to-eye (NTE) displays. The system successfully expands the field of view (FOV) from 6 x 6 degrees to 12 x 30 degrees, maintaining color fidelity across the enlarged FOV. The dual DMDs, performing both illumination and diffractive beam steering, overcome traditional resolution and FOV constraints. Our compact design holds potential for wearable tech, AR, and VR applications, offering a new FOV expansion solution for the next generation of NTE display technologies.
PC12913-76
Author(s): Sonika Obheroi, Josh Lucas, Richard Austin, Minh Tong, Gamma Scientific (United States)
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
We explore the significant impact of alignment on the optical performance data and resulting technical specifications of augmented reality (AR) and virtual reality (VR) devices, providing insights into the considerations necessary for optimal alignment strategies. The relevance of true eye geometry is highlighted by comparing pupil rotation in instantaneous/wide field-of-view cameras with the true eye rotation of the human eye, as represented by an active gaze-tracking and imaging platform.
PC12913-79
Author(s): Kelsey Wooley, Eulitha US, Inc. (United States); Andrew M. C. Dawes, Synopsys, Inc. (United States); Zhixin Wang, Eulitha US, Inc. (United States); Maryvonne Chalony, Synopsys, Inc. (United States); Harun Solak, Eulitha US, Inc. (United States); Lawrence S. Melvin, Synopsys, Inc. (United States)
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
Displacement Talbot Lithography (DTL) has emerged as a viable technology that relies on the proven photolithography approach of the semiconductor industry while offering a specific, low-cost solution for large area printing of periodic structures of the kind required on waveguides. Electronic Design Automation (EDA) tools are necessary for standard projection lithography approaches, and now can be used to understand DTL interactions with waveguide designs. Here, we present the first results from the development of a complete design and optimization approach that facilitates fabrication of waveguide devices with optimum processing conditions and targeted device performance.
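DTL builds on the Talbot self-imaging effect, in which a periodic pattern reproduces itself at regular distances behind the mask; displacing the wafer through one Talbot period during exposure averages the aerial image along the axis. A sketch of the classic paraxial Talbot distance, with assumed numbers (not the paper's process values):

```python
def talbot_length(pitch, wavelength):
    """Classic paraxial Talbot self-imaging distance z_T = 2*p**2 / lambda."""
    return 2 * pitch**2 / wavelength

# Illustrative DTL-scale numbers: a 400 nm period grating exposed at 365 nm.
z_t = talbot_length(400e-9, 365e-9)   # on the order of a micrometer
```

At periods comparable to the wavelength the paraxial formula is only approximate, but it shows why the displacement stroke is so small and must be controlled precisely.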
PC12913-80
Author(s): Hung-Shan Chen, Chia-Ming Chang, Ming-Syuan Chen, Guo-Lin Hu, Chien-Chung Chen, Sung-Nan Chen, Yi Hung, Liqxtal Technology Inc. (Taiwan)
On demand | Presented live 29 January 2024
In this paper, we present the world's first TFT liquid crystal glasses (TFT-LCG), which display images to passersby while minimally obstructing the wearer's vision. By manipulating the polarization state of incident light with the liquid crystal, the appearance on the world side of the glasses is electrically controlled. The glasses display colored images with a resolution of 36 pixels per inch and a contrast ratio over 15. The wearer's viewing angle is almost the same as with existing glasses, and high MTF performance is maintained despite the TFT pixel traces. The driving power of the TFT-LCG is below 100 mW at a frame rate of 10 fps. Integrated with a virtual platform and wireless control, the glasses can serve as a novel interaction medium between individuals, opening a new path for how people connect with each other.
12913-81
Author(s): Pengfei Li, Biaoli Tao, Ze Yuan, Yongjiang Lab. (China); Jiayan Zhuang, Ningbo Institute of Industrial Technology (China); Chaohao Wang, Shuang Wang, Yongjiang Lab. (China)
On demand | Presented live 29 January 2024
This paper proposes a no-reference image quality evaluation model for accurately assessing the quality of real-world images displayed on head-mounted display (HMD) devices. The proposed model employs a simulation of the human visual system, providing a reliable measure of image quality. Initially, an efficient convolutional neural network (CNN), specifically designed for noise characteristics, is used to obtain a near-perfectly noise-reduced image. The difference between this image and the target image is then calculated in the linear domain. To emulate the contrast sensitivity and masking effects inherent in the human visual system, we introduce a sophisticated frequency-domain filter model in a uniform color space. The resulting multidimensional data from the filters are aggregated and corrected based on the average brightness. Our model's performance is validated against peak signal-to-noise ratio (PSNR) and structural similarity index (SSIM) metrics using the TID2013 dataset, revealing superior correlation coefficients.
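A minimal sketch of the PSNR baseline the model is validated against; the flattened "images" here are illustrative values, and SSIM, being considerably more involved, is omitted.

```python
import math

def psnr(reference, test, max_value=255.0):
    """Peak signal-to-noise ratio in dB between two equally sized images,
    given here as flat lists of pixel values."""
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")   # identical images
    return 10 * math.log10(max_value**2 / mse)

ref = [100.0, 120.0, 130.0, 90.0]     # illustrative pixel values
noisy = [102.0, 118.0, 131.0, 89.0]
score = psnr(ref, noisy)
```

PSNR's weakness, and the motivation for perceptual models like the paper's, is that it treats every pixel error identically regardless of visual masking or contrast sensitivity.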
PC12913-82
Author(s): Yi-Ming Li, Tsung-Xian Lee, Wan-Pin Tsai, Yang-Kuan Tseng, National Taiwan Univ. of Science and Technology (Taiwan); Ching-Cherng Sun, National Central Univ. (Taiwan)
29 January 2024 • 5:30 PM - 7:00 PM PST | Moscone Center, Lobby (Level 3 West)
This study explores the intricacies of scattered light phenomena within near-eye Augmented Reality (AR) systems employing Volume Holographic Optical Elements (VHOEs) as waveguide combiners. We employ a comprehensive hybrid optical simulation methodology that amalgamates ray and wave optics principles to dissect the potential origins of scattered light. Our investigation culminates in the discernment that while VHOEs exhibit specific scattering attributes, the predominant source of scattered light emanates from the contrast ratio of the micro-display panel. Consequently, the paramount pursuit lies in enhancing the contrast ratio to mitigate the effects of scattered light and enhance the overall quality of the AR user experience.
12913-129
Author(s): Yoshihiko Hirai, Yusei Kunitou, Masaaki Yasuda, Osaka Metropolitan Univ. (Japan)
On demand | Presented live 29 January 2024
To fabricate waveguides for AR glasses, applying a diffraction grating with a tilted structure is a promising way to achieve a wider viewing angle, and nanoimprinting is promising as a low-cost mass-production method. However, it is difficult to release tilted gratings without defects on both the right and left sides at the same time. Here, we discuss various mold release methods, such as peeling or roll-to-roll releasing. We investigate these methods by computer simulation and discuss the optimum release process that minimizes defects.
Conference Chair
Meta (United States)
Conference Chair
College of Optical Sciences, The Univ. of Arizona (United States)
Conference Chair
Univ. of Rochester (United States)
Program Committee
Magic Leap, Inc. (United States)
Program Committee
Vision Products LLC (United States)
Program Committee
Magic Leap, Inc. (United States)
Program Committee
Facebook Technologies, LLC (United States)
Program Committee
Microsoft Research Cambridge (United Kingdom)
Program Committee
National Yang Ming Chiao Tung Univ. (Taiwan)
Program Committee
Sony Group Corp. (Japan)
Program Committee
Samsung Display America Lab (United States)
Program Committee
The Univ. of Hong Kong (Hong Kong, China)
Program Committee
The Institute of Optics (United States)
Program Committee
Meta (United States)
Program Committee
Apple Inc. (United States)
Program Committee
Meta (United States)
Program Committee
Stanford Univ. (United States)