Awareness of driver emotions facilitates intelligent automotive applications that improve driver comfort, well-being, and safety. Automotive manufacturers such as Mercedes have developed tools to improve a driver's emotional state via built-in comfort systems such as massage, fragrance, music, and cabin air ionisation. Ridesharing companies (such as Uber and DiDi) are improving quality of service by monitoring the mental state of their drivers. Recognition of driver emotion is gaining momentum in both academia and industry.
Keywords: Machine Learning, CAN data, Intelligent Vehicle, Affective Computing, Human Computer Interaction
The European New Car Assessment Programme (Euro NCAP) requires a driver monitoring camera as a mandatory part of next-generation vehicles. Monitoring driver state is essential for (semi-)autonomous driving because the system must know whether the driver is able to take over control of the vehicle in complex situations. Another important function of the driver monitoring camera is recognising the driver's emotional state, since facial expressions are closely related to emotion. However, the vehicle cabin is a special environment: cognitive load is high, and drivers seldom elicit emotions. To verify the reliability of facial-expression-based emotion recognition in the vehicle, we collected driver facial videos, vehicle front-view videos, driving data (CAN data), and drivers' emotional states in the wild. To the best of our knowledge, our dataset is the most comprehensive in-the-wild dataset containing driver emotion.
The goal of this thesis can be framed as the following research questions:
- RQ1: Are the overall facial expressions over a trip (detected by a state-of-the-art facial expression recognition algorithm) identical to the driver's true emotion for that trip?
[Detailed explanation: if a facial expression recognition algorithm consistently detects, for example, smiling / anger, does that necessarily imply the driver is happy / angry? Could a driver show an unhappy face while actually being happy?]
- RQ2: In what kinds of situations or manoeuvres are drivers most likely to elicit facial expressions? In which situations are the elicited facial expressions most in line with the driver's true emotion?
[Detailed explanation: a driver may consistently show a stressed facial expression during sharp turns or hard braking while being relaxed overall. To predict a driver's overall emotion, should we therefore exclude such moments, when facial expressions are not in line with the driver's true emotion?]
- RQ3: To what extent can emotion recognition be improved by fusing drivers' facial expressions with additional in-vehicle sensor modalities (CAN data, front-view video)?
[Detailed explanation: driver emotions are also closely related to the traffic context and driving behaviour. For example, hard braking or encounters with aggressive traffic participants often lead to negative emotions such as anger or hostility. By fusing driving data (i.e. CAN data) and traffic context (from the vehicle's front-view video), we therefore expect to improve emotion recognition methods that rely solely on facial expression.]
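One simple way to combine the modalities mentioned in RQ3 is late fusion: each modality (face, CAN data, front-view context) produces its own emotion class probabilities, which are then merged by a weighted average. The sketch below is a minimal, hypothetical illustration of this idea; the class set, weights, and probability values are assumptions for demonstration only, not part of the thesis dataset or method.

```python
import numpy as np

# Hypothetical late-fusion sketch: combine emotion class probabilities
# from three independent models (facial expression, CAN driving data,
# front-view traffic context) via a weighted average.
# Assumed class order: [happy, angry, neutral].

def late_fuse(p_face, p_can, p_context, weights=(0.5, 0.3, 0.2)):
    """Weighted average of per-modality class-probability vectors, renormalised."""
    stacked = np.stack([p_face, p_can, p_context])  # shape (3, n_classes)
    w = np.asarray(weights)[:, None]                # broadcast weights over classes
    fused = (w * stacked).sum(axis=0)
    return fused / fused.sum()

# Illustrative values: the face model is ambiguous, but the driving data
# and traffic context both point towards the "angry" class.
p_face    = np.array([0.40, 0.35, 0.25])
p_can     = np.array([0.10, 0.70, 0.20])
p_context = np.array([0.15, 0.65, 0.20])
fused = late_fuse(p_face, p_can, p_context)
print(fused.argmax())  # -> 1, i.e. the "angry" class wins after fusion
```

Such a scheme directly addresses RQ2 as well: frames where expressions are known to diverge from true emotion (e.g. during hard braking) could simply be down-weighted before fusion.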
Shu Liu (liush@ethz.ch)
Please contact us with your CV, a short statement of motivation, and your current transcripts of records (bachelor & master).