Multi-Person Interaction Capture in the Interactive Design Lab
In this master thesis project, the student will build a system to capture interactions between people and their environment. On the hardware side, the project will exploit the multimedia system of the Interactive Design Lab, which is equipped with RGB cameras, depth sensors, microphones, and other devices. On the algorithm side, the student will design a multi-person motion capture method that makes the most of this hardware system.
Keywords: multimedia capture system, human interaction capture.
In our daily lives, people communicate with each other and interact with their environment; a person's behavior is therefore influenced by other people and by the surroundings. To understand this influence, it is essential to reliably capture human behavior in a multi-modal manner, e.g. as synchronized RGB videos, depth maps, point clouds, and audio. The captured data can serve diverse tasks, e.g. creating intelligent virtual avatars in the metaverse, optimizing architectural design for interpersonal communication, and novel view synthesis in volumetric videos.
This master thesis aims to build the foundation for such capture. To this end, we will use the Interactive Design Lab (IDL) as the environment; it features state-of-the-art architectural design and multimedia capture systems. In addition, we will provide a lightweight parametric body model specifically designed for human motion modeling. We expect the student to complete the following tasks:
1. Develop a synchronized multimedia capture system based on the existing hardware in the IDL. The sensors include RGB cameras, depth sensors, audio recorders, and others.
2. Develop a multi-person interaction capture algorithm based on our existing lightweight parametric body model. In particular, the motion is captured from the multiview RGB cameras.
3. (Bonus) Learn a generative model that synthesizes interacting people, conditioned on the environment and the audio.
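To give a flavor of the geometry behind task 2: recovering motion from multiview RGB cameras typically rests on triangulating 2D observations (e.g. body joints) into 3D, which is then combined with a body-model fit. Below is a minimal, self-contained sketch of the classical direct linear transform (DLT) triangulation step in Python with NumPy. The function name and the use of NumPy are illustrative assumptions for this posting, not part of the lab's actual pipeline.

```python
import numpy as np

def triangulate_point(projections, points_2d):
    """Triangulate one 3D point from N >= 2 calibrated views via DLT.

    projections: list of 3x4 camera projection matrices P_i = K_i [R_i | t_i]
    points_2d:   list of (u, v) pixel observations, one per view
    """
    rows = []
    for P, (u, v) in zip(projections, points_2d):
        # Each view contributes two linear constraints on the homogeneous
        # point X: u * (P[2] @ X) = P[0] @ X and v * (P[2] @ X) = P[1] @ X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.stack(rows)
    # The least-squares solution is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Hypothetical usage with two normalized cameras (K = I):
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])          # reference camera
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0], [0]])])  # shifted along x
X_true = np.array([0.2, -0.1, 5.0])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return (x[0] / x[2], x[1] / x[2])

X_est = triangulate_point([P1, P2], [project(P1, X_true), project(P2, X_true)])
```

In a full multi-person setting, per-view 2D detections must additionally be matched across cameras and associated to identities before triangulation; the parametric body model then regularizes the per-joint estimates into plausible motion.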
Feel free to check Design++ (https://designplusplus.ethz.ch/) and the IDL (https://idl.ethz.ch/) for more information.
The start date is as early as possible.
Dr. Yan Zhang, yan.zhang@inf.ethz.ch
Dr. Michael Kraus, kraus@ibk.baug.ethz.ch