MCSc Thesis Defence - USING TOUCH GESTURES TO RECORD AND RECOGNIZE EMOTIONS

Who:    Nabil Bin Hannan

Title:    USING TOUCH GESTURES TO RECORD AND RECOGNIZE EMOTIONS

Examining Committee:

Dr. Derek Reilly - Faculty of Computer Science (Supervisor)
Dr. Kirstie Hawkey - Faculty of Computer Science (Reader)
Dr. Srinivas Sampalli - Faculty of Computer Science (Reader)

Chair:    Dr. Raza Abidi - Faculty of Computer Science

Abstract:

In this thesis, we explore how people use touchscreens to express emotional intensity, and whether these intensities can be understood by oneself at a later date or by others. We conducted a four-week participatory design activity to improve the design of JogChalker, a system that allows recreational runners to record their emotional state while running using touchscreen gestures. The recreational runners who participated in the design activity also used the JogChalker prototype. Results indicated a desire for more expressiveness when recording gestures. We then conducted a controlled lab study in which 26 participants were asked to express a set of emotions, mapped to predefined gestures, at a range of intensities. Participants could choose how they expressed emotional intensity, using one or more of touch pressure, position, width, and speed. One week later, participants were asked to identify the emotional intensity visualized in animations of the gestures made by themselves and by other participants. Results indicate that gestural attributes can be used to encode emotional intensity in a manner that can later be recognized by oneself and by others. Our participants expressed emotional intensity primarily through gesture length, pressure, and speed. Gesture position was generally not used, and gesture width was not used independently of touch pressure. The choice of factors depended on the specific emotion, and the range and rate of increase of these factors varied by individual and by emotion. Recognition accuracy was higher at the extremes of the intensity range, and higher for one's own gestures than for those made by others. Participants who were consistent in how they expressed emotional intensity were more likely to correctly recognize the emotional intensity of both their own gestures and those made by others. Size and pressure (mapped to colour in the animation) were the most readily interpreted factors across participants, while speed was more difficult to differentiate. We discuss implications for developers of annotation systems and other touchscreen interfaces that wish to capture affect.
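
For developers of touchscreen interfaces that wish to capture affect, the sketch below illustrates one way (not taken from the thesis) to sample the gestural attributes the abstract discusses -- length, speed, pressure, and width -- using the standard Web Touch Events API in TypeScript. The element id "gesture-area" and the width proxy are illustrative assumptions, and force/radius reporting varies by browser and device.

interface GestureSample {
  x: number;
  y: number;
  t: number;      // timestamp, ms
  force: number;  // normalized touch pressure, 0..1 where the device reports it
  width: number;  // rough contact-size proxy, px
}

interface GestureSummary {
  length: number;     // total path length, px
  meanSpeed: number;  // px per ms
  peakForce: number;
  meanWidth: number;
}

const samples: GestureSample[] = [];

function record(e: TouchEvent): void {
  const t = e.changedTouches[0];
  samples.push({
    x: t.clientX,
    y: t.clientY,
    t: e.timeStamp,
    force: t.force,
    width: t.radiusX + t.radiusY, // crude stand-in for contact width
  });
}

function summarize(s: GestureSample[]): GestureSummary {
  let length = 0;
  for (let i = 1; i < s.length; i++) {
    length += Math.hypot(s[i].x - s[i - 1].x, s[i].y - s[i - 1].y);
  }
  const duration = s[s.length - 1].t - s[0].t;
  return {
    length,
    meanSpeed: duration > 0 ? length / duration : 0,
    peakForce: Math.max(...s.map(p => p.force)),
    meanWidth: s.reduce((a, p) => a + p.width, 0) / s.length,
  };
}

// Hypothetical gesture canvas; any touch-enabled element would do.
const area = document.getElementById("gesture-area")!;
area.addEventListener("touchstart", record);
area.addEventListener("touchmove", record);
area.addEventListener("touchend", e => {
  record(e);
  console.log(summarize(samples)); // attributes that could encode emotional intensity
  samples.length = 0;              // reset for the next gesture
});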

Time:

Location:    Room 211, Goldberg Computer Science Building