Predicting the light spectrum of virtual reality scenarios for Non-Image-Forming visual evaluation

Yitong Sun*, Hanchun Wang, Pinar Satilmis, Narges Pourshahrokh, Carlo Harvey, Ali Asadipour

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    Virtual reality (VR) headsets, while providing realistic simulated environments, can also over-stimulate the human eye, particularly the Non-Image-Forming (NIF) visual system. It is therefore crucial to predict the spectrum emitted by the VR headset and to evaluate its light stimulation during the virtual-environment construction phase. We propose a framework that predicts the spectrum of VR scenes using only a pre-acquired optical profile of the VR headset. The predicted spectrum is successively converted into “Five Photoreceptors Radiation Efficacy” (FPRE) maps and a “Melanopic Equivalent Daylight Illuminance” (M-EDI) value, giving a detailed visual prediction of how virtual scenes stimulate the human eye.
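
    The abstract does not include an implementation, but the last step of the pipeline (converting a predicted spectral power distribution into an M-EDI value) follows the CIE S 026 α-opic framework. The Python sketch below is a minimal illustration under that assumption: the function name, the 1 nm sampling grid, and the Gaussian stand-in spectra are all hypothetical, and a real pipeline would use the per-pixel spectrum reconstructed from the headset's optical profile together with the tabulated CIE melanopic action spectrum.

        import numpy as np

        # 1 nm sampling grid over the visible range, assumed for illustration.
        wavelengths = np.arange(380, 781)  # nm

        def melanopic_edi(spd, s_mel):
            """Melanopic Equivalent Daylight Illuminance (M-EDI, lux) per CIE S 026.

            spd   : spectral irradiance in W/(m^2 nm), sampled on `wavelengths`
            s_mel : melanopic action spectrum s_mel(lambda), dimensionless
            """
            # Melanopic irradiance in W/m^2 (trapezoidal integration over wavelength).
            e_mel = np.trapz(spd * s_mel, wavelengths)
            # CIE S 026: the melanopic efficacy of D65 is ~1.3262 mW/lm, so
            # M-EDI [lx] = melanopic irradiance [mW/m^2] / 1.3262.
            return (e_mel * 1e3) / 1.3262

        # Illustrative stand-ins only: a broadband Gaussian SPD and a Gaussian
        # placeholder for the melanopic action spectrum (real values should come
        # from the headset profile and the CIE S 026 tables, respectively).
        spd = 1e-3 * np.exp(-0.5 * ((wavelengths - 520) / 60.0) ** 2)
        s_mel = np.exp(-0.5 * ((wavelengths - 490) / 40.0) ** 2)
        print(f"M-EDI ≈ {melanopic_edi(spd, s_mel):.1f} lx")

    The same integral, evaluated with each of the five α-opic action spectra rather than the melanopic one alone, yields the per-photoreceptor quantities that an FPRE-style map would visualise per pixel.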
    Original language: English
    Title of host publication: IEEE Conference on Virtual Reality and 3D User Interfaces
    Publisher: Institute of Electrical and Electronics Engineers Inc.
    Pages: 791-792
    Number of pages: 2
    DOIs
    Publication status: Published (VoR) - 25 Mar 2023
