Hand-eye Coordination for Public Display Interaction
This project will first implement a simple eye gaze estimation method, then combine it with hand gesture detection, and finally use the combination to operate UI elements on a public display from a long distance.
Keywords: Appearance-based gaze estimation, Hand pose estimation, UI element control
Human eye gaze and hand gestures are both strong cues for human attention and intention, and can serve as input signals for interactive systems. Unfortunately, current gaze estimation methods cannot produce gaze information accurate enough for fine-grained selection, and hand gestures alone are not informative enough for interaction. Previous gaze estimation work also relies heavily on commercial eye trackers, which have a limited operating distance.
This project will first implement a simple eye gaze estimation method, then combine it with hand gesture detection, and finally use the combination to operate UI elements on a public display from a long distance. The final interactive system should robustly detect which element the user has selected and perform the desired operation on that element. We will use the latest Azure Kinect as the single sensor, since it provides a long operating distance.
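Since the listing describes the pipeline only at a high level, the following is a minimal sketch of what the gaze-plus-gesture selection step could look like once gaze and hand-pose estimates are available. Everything here is an assumption, not the project's actual design or any Azure Kinect API: the coordinate frame (display plane at z = 0, units in metres), the UIElement class, and the pinch trigger are hypothetical placeholders.

```python
"""Illustrative sketch of gaze + gesture UI selection (not the project's code).

Assumes a gaze estimator and hand-pose detector already exist and report
results in a shared display-centred frame: metres, display plane at z = 0.
All names (UIElement, pinch_detected, ...) are hypothetical placeholders.
"""
from dataclasses import dataclass

import numpy as np


@dataclass
class UIElement:
    name: str
    x: float  # left edge on the display plane (metres)
    y: float  # bottom edge (metres)
    w: float  # width (metres)
    h: float  # height (metres)

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


def gaze_hit_point(origin: np.ndarray, direction: np.ndarray):
    """Intersect the gaze ray origin + t * direction with the plane z = 0."""
    if abs(direction[2]) < 1e-6:  # ray runs parallel to the display
        return None
    t = -origin[2] / direction[2]
    if t <= 0:  # the display is behind the user
        return None
    return origin + t * direction


def select_element(origin, direction, elements):
    """Return the UI element the gaze ray lands on, or None."""
    hit = gaze_hit_point(np.asarray(origin, float), np.asarray(direction, float))
    if hit is None:
        return None
    for el in elements:
        if el.contains(hit[0], hit[1]):
            return el
    return None


if __name__ == "__main__":
    buttons = [UIElement("play", 0.1, 0.2, 0.3, 0.2),
               UIElement("stop", 0.6, 0.2, 0.3, 0.2)]
    # Fake per-frame inputs: eyes 2 m from the display, gaze slightly
    # down-left, and a pinch reported by the (hypothetical) hand-pose module.
    eye_origin = [0.5, 0.5, 2.0]
    gaze_dir = [-0.12, -0.09, -1.0]
    pinch_detected = True

    target = select_element(eye_origin, gaze_dir, buttons)
    if target is not None and pinch_detected:
        print(f"activate '{target.name}'")  # -> activate 'play'
```

In this split of roles, the (coarse) gaze estimate picks which element the user is attending to, while the hand gesture acts as the confirmation trigger, which is one plausible way to work around the accuracy limits of gaze alone described above.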
For this project, we aim to submit a paper to a top-tier conference. Please apply only if you are interested in publication. Thank you.
Implement an interactive system driven by both human eye gaze and hand pose, and submit a paper to a top-tier conference.