Depth Perception of Virtual and Physical Objects for VST- and OST-HMD
This study investigates depth perception with virtual and real objects using video see-through (VST) and optical see-through (OST) head-mounted displays (HMDs). By comparing devices like Meta Quest 3, Pico 4, and HoloLens 2, the research explores how humans perceive spatial depth in mixed reality (MR) scenarios. Through Unity-based application development and user studies, the work evaluates depth perception differences and provides insights for advancing MR technology.
Your work will start with a literature review of the latest research on depth perception. Next, you will become acquainted with the MR development platform Unity. You will then formulate a set of hypotheses and design an MR scene with virtual objects, which you will build and deploy on the different hardware platforms. You will conduct a user study to test your hypotheses, comparing depth perception of the virtual objects with the perception of real physical objects. Finally, you will summarize your findings in a written report and present them in an intermediate and a final presentation.
The goal of this thesis is to conduct a comparative human factors study using several VST-HMDs, such as the Meta Quest 3 (optionally the Apple Vision Pro) and the Pico 4 (monoscopic VST), as well as an OST-HMD, namely the HoloLens 2. The study will involve developing several straightforward application builds in Unity to evaluate and compare the depth perception capabilities of these devices.
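To give a concrete idea of what such an application build might involve, the following is a minimal Unity (C#) sketch of a single depth-judgment trial: a virtual target is placed at a randomized distance in front of the HMD camera and the participant's reported estimate is logged alongside the true distance. All class, field, and file names here are illustrative assumptions; the actual study design and measurement protocol are defined as part of the thesis.

```csharp
using System.IO;
using UnityEngine;

// Illustrative sketch (not part of the thesis specification): one depth-judgment
// trial in which a virtual stimulus is spawned at a random egocentric distance
// and the participant's estimate is appended to a CSV log for later analysis.
public class DepthTrialController : MonoBehaviour
{
    public Transform hmdCamera;       // main camera rig of the HMD
    public GameObject targetPrefab;   // stimulus object, e.g. a cube or sphere
    public float minDistance = 0.5f;  // metres
    public float maxDistance = 5.0f;  // metres

    private GameObject currentTarget;
    private float currentDistance;
    private string logPath;

    void Start()
    {
        logPath = Path.Combine(Application.persistentDataPath, "depth_trials.csv");
        StartTrial();
    }

    // Spawn the stimulus at a random depth along the current viewing direction.
    public void StartTrial()
    {
        currentDistance = Random.Range(minDistance, maxDistance);
        Vector3 position = hmdCamera.position + hmdCamera.forward * currentDistance;
        currentTarget = Instantiate(targetPrefab, position, Quaternion.identity);
    }

    // Called (e.g. from an experimenter UI) once the participant reports an estimate.
    public void RecordEstimate(float estimatedDistance)
    {
        File.AppendAllText(logPath, $"{currentDistance:F2},{estimatedDistance:F2}\n");
        Destroy(currentTarget);
        StartTrial();
    }
}
```

The same Unity scene could be built for each target device, so that only the hardware (VST vs. OST, stereoscopic vs. monoscopic) varies between conditions while the stimulus and logging logic stay identical.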