Automatic emotion capture when viewing web-based media on a smartphone

Matthew Rhys Jones


Supervised by Dave Marshall; Moderated by Yukun Lai

Every day, we view different types of web-based media such as tweets, online videos, and articles. Each of these types of media can produce an emotional response such as anger, sadness, or surprise. Automatic classification of these responses has received a lot of attention in the image and video analysis communities, as well as in social computing.

Following on from a CUROP project that produced a prototype face tracker and emotion classifier, this project aims to develop more robust versions of both and to apply them in real-world scenarios. The resulting data will then be analysed to better understand how people respond to different types of media.

Initial Plan (28/01/2016) [Zip Archive]

Final Report (04/05/2016) [Zip Archive]
