Related Projects
You may also be interested in related projects from our group on holographic near-eye displays:
- S. Choi et al. “Neural 3D Holography: Learning Accurate Wave Propagation Models for 3D Holographic Virtual and Augmented Reality Displays”, ACM SIGGRAPH Asia 2021 (link)
- S. Choi et al. “Michelson Holography”, Optica 2021 (link)
- N. Padmanaban et al. “Holographic Near-Eye Displays Based on Overlap-Add Stereograms”, ACM SIGGRAPH Asia 2019 (link)
and on other next-generation near-eye display and wearable technologies:
- R. Konrad et al. “Gaze-contingent Ocular Parallax Rendering for Virtual Reality”, ACM Transactions on Graphics 2020 (link)
- B. Krajancich et al. “Optimizing Depth Perception in Virtual and Augmented Reality through Gaze-contingent Stereo Rendering”, ACM SIGGRAPH Asia 2020 (link)
- B. Krajancich et al. “Factored Occlusion: Single Spatial Light Modulator Occlusion-capable Optical See-through Augmented Reality Display”, IEEE TVCG 2020 (link)
- N. Padmanaban et al. “Autofocals: Evaluating Gaze-Contingent Eyeglasses for Presbyopes”, Science Advances 2019 (link)
- K. Rathinavel et al. “Varifocal Occlusion-Capable Optical See-through Augmented Reality Display based on Focus-tunable Optics”, IEEE TVCG 2019 (link)
- N. Padmanaban et al. “Optimizing virtual reality for all users through gaze-contingent and adaptive focus displays”, PNAS 2017 (link)
- R. Konrad et al. “Accommodation-invariant Computational Near-eye Displays”, ACM SIGGRAPH 2017 (link)
- R. Konrad et al. “Novel Optical Configurations for Virtual Reality: Evaluating User Preference and Performance with Focus-tunable and Monovision Near-eye Displays”, ACM SIGCHI 2016 (link)
- F.C. Huang et al. “The Light Field Stereoscope: Immersive Computer Graphics via Factored Near-Eye Light Field Display with Focus Cues”, ACM SIGGRAPH 2015 (link)
Acknowledgements
We would like to thank Julien Martel for help with the camera calibration. Suyeon Choi was supported by a Kwanjeong Scholarship and a Korea Government Scholarship. This project was further supported by Ford, NSF (awards 1553333 and 1839974), a Sloan Fellowship, an Okawa Research Grant, and a PECASE from the ARO.
Disclaimer on DPAC
The double-phase amplitude coding (DPAC) results we show are inspired by, but not representative of, Maimone et al.’s SIGGRAPH 2017 paper. We use our own implementation of their algorithm and drive our SLM with phase values in the range [0, 2π], whereas they used a range of [0, 3π]; their SLM was also likely better calibrated than ours. As a result, the image quality we report for their method is worse than in their original work. Nevertheless, all CGH methods, including DPAC and neural holography, were captured under the same experimental settings and are therefore directly comparable. You can also find extensive simulations, which are not affected by the phase range or SLM calibration, in our paper and supplement.
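For context, double-phase encoding represents each complex-valued pixel by two phase-only values whose complex average reproduces the target amplitude and phase, interleaved in a checkerboard pattern on the SLM. The sketch below is a minimal NumPy illustration of that general idea (the function name, arguments, and interleaving choice are ours for illustration, not the exact code used to produce our results):

```python
import numpy as np

def double_phase_encode(field, phase_range=2 * np.pi):
    """Sketch of double-phase amplitude coding: map a complex field to a
    phase-only SLM pattern. `field` is a 2D complex array at the SLM plane."""
    amp = np.abs(field)
    amp = amp / (amp.max() + 1e-12)      # normalize amplitude to [0, 1]
    phase = np.angle(field)

    # Two phase values per pixel whose average equals the target:
    # (exp(i*p1) + exp(i*p2)) / 2 = amp * exp(i*phase)
    offset = np.arccos(amp)              # amplitude encoded as a phase offset
    p1 = phase + offset
    p2 = phase - offset

    # Checkerboard interleaving of the two phase maps
    checker = np.indices(field.shape).sum(axis=0) % 2 == 0
    out = np.where(checker, p1, p2)

    # Wrap to the SLM's addressable phase range, e.g. [0, 2*pi)
    return np.mod(out, phase_range)
```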