
Doctoral Thesis

Detecting Salient Emotional States During Real-World Social Media Use Using Smartphone Sensors

Emotions are integral to the Social Networking Service (SNS) user experience: we express our feelings, react to posted content, and communicate with emojis. It has been reported that SNS users can gain many benefits (e.g., attenuated loneliness) from consuming content on SNS in a variety of media formats, including text status updates, messages, images, and videos. However, the emotional qualities of SNS can also evoke negative emotions, which may lead to further negative outcomes and behaviors such as cyberbullying and trolling. Detecting the emotions of SNS users may enable responding to and mitigating these problems. Prior work suggests that such detection may be achievable on smartphones: emotions can be detected via built-in sensors during prolonged input tasks.

In this dissertation, I extend these ideas to the SNS context, which features sparse input interleaved with more passive browsing and media consumption activities. I show that using smartphone sensors to detect SNS users’ emotional states has several advantages:

1. It does not rely on private user messages and posts.

2. It does not require large data sets.

3. It is applicable to live analysis of passive activities.

In this dissertation, I claim that smartphone sensor data can be used to detect salient emotional states during real-world social media use.


I provide evidence to support this claim by presenting four studies. In the first, I elicit participants’ emotions using validated image and video stimuli and capture sensor data from a mobile device, including data from a novel passive sensor: its built-in eye tracker. Using this data, I construct machine learning models that predict self-reported binary affect, achieving 93.20% peak accuracy. The second study extends these results to a more ecologically valid scenario in which participants browse their own social media feeds. This study yields high accuracies for both self-reported binary valence (94.16%) and arousal (92.28%).
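As a rough illustration of this kind of pipeline, the sketch below trains and cross-validates a binary affect classifier on per-session sensor features. The feature names, the file layout, and the random-forest model are assumptions for demonstration only; the abstract does not specify the actual features or model families used.

```python
# Illustrative sketch only: feature set, CSV layout, and model choice are
# assumptions, not the dissertation's actual pipeline.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-session features derived from smartphone sensors
# (e.g., touch dynamics, accelerometer statistics, gaze fixation counts
# from the built-in eye tracker) plus a self-reported binary affect label.
data = pd.read_csv("session_features.csv")
X = data.drop(columns=["binary_affect"])   # sensor-derived features
y = data["binary_affect"]                  # 0 = negative, 1 = positive affect

# Train and evaluate a binary affect classifier with 5-fold cross-validation.
model = RandomForestClassifier(n_estimators=200, random_state=42)
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.2%}")
```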

In the third study, I move beyond the settings of the prior studies to real-world social media use. The study aims to increase understanding of the emotional states that are prevalent across SNS in order to inform a system that detects them. Using the Experience Sampling Method (ESM), I conduct a 14-day study that quantifies the emotional experiences of SNS users by probing them multiple times a day after they use the service. The study reveals a set of salient emotional states that includes appearance comparison and envy (e.g., when participants view content about others traveling). Based on these results, a second ESM study examines the feasibility and viability of detecting these salient emotional states during real-world social media use with smartphones. Using the sensor data, I construct machine learning models that predict self-reported binary appearance comparison and envy, achieving 90.88% and 91.28% peak accuracy, respectively.
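For context, the sketch below shows one common way such ESM prompts could be scheduled over a 14-day study. The prompt count per day, the waking window, and the purely time-based triggering are assumptions; the abstract does not describe the actual protocol, which may instead trigger prompts after detected social media sessions.

```python
# Illustrative sketch only: a simple signal-contingent ESM schedule that
# draws a fixed number of random prompt times per day within waking hours.
import random
from datetime import date, datetime, timedelta

STUDY_DAYS = 14
PROMPTS_PER_DAY = 6                # assumed; "multiple times a day"
WAKING_START, WAKING_END = 9, 21   # assumed waking window (09:00-21:00)

def daily_prompt_times(day: date, n: int) -> list[datetime]:
    """Draw n random prompt times within the waking window of a given day."""
    window_start = datetime(day.year, day.month, day.day, WAKING_START)
    window_minutes = (WAKING_END - WAKING_START) * 60
    offsets = sorted(random.sample(range(window_minutes), n))
    return [window_start + timedelta(minutes=m) for m in offsets]

start = date.today()
schedule = {
    start + timedelta(days=d): daily_prompt_times(start + timedelta(days=d),
                                                  PROMPTS_PER_DAY)
    for d in range(STUDY_DAYS)
}
```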

I argue that the results of these studies indicate that smartphone sensor data can be used to detect salient emotional states during real-world social media use. I conclude with design considerations for researchers, designers, and developers building emotion detection systems for SNS with smartphone sensors.
