Examining the Emotional Content of Tweets


A Q&A with a UCIPT Postdoctoral Researcher

What is the single most relaxing tweet, from a sentiment-analysis point of view?

There are many different methods we can use to conduct sentiment analysis. The basic idea is to categorize a tweet as positive, neutral, or negative. A tweet with positive sentiment would say something such as, “I am having an awesome day”; a neutral tweet would say something such as, “I ate a sandwich”; and a negative tweet would say something such as, “I hate everything that is going on right now.” Relaxing tweets are most likely to contain positive sentiment.

One way to calculate the degree, or magnitude, of positive sentiment in a tweet is to analyze each of its words. Researchers have compiled a list of about 10,000 words (Language Assessment by Mechanical Turk [labMT]) that are most commonly used on Twitter. Each word is rated from 1 (sad/negative) to 9 (happy/positive). For example, laugh is one of the happiest words on the list, with a score of 8.22, while death is one of the saddest, with a score of 1.25. Neutral words like the tend to score near the middle of the scale. Based on these ratings, we can calculate the magnitude of positive sentiment and assign a value to each tweet. So, using these definitions, the most relaxing tweet might contain several words that are defined as positive.
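To make the word-level approach concrete, here is a minimal Python sketch of this kind of scoring: average the ratings of whichever words in a tweet appear in the word list. The small dictionary below is an illustrative stand-in for the full labMT list; apart from the scores for “laugh” and “death” quoted above, the values are placeholders rather than published labMT ratings.

```python
# Minimal sketch of word-level sentiment scoring in the style of labMT.
# The scores below are illustrative placeholders (except "laugh" and "death",
# which are the values quoted above); the real list has ~10,000 rated words.

ILLUSTRATIVE_SCORES = {
    "laugh": 8.22,
    "awesome": 7.9,
    "sandwich": 5.5,
    "the": 4.98,
    "hate": 2.2,
    "death": 1.25,
}

def tweet_happiness(tweet, scores=ILLUSTRATIVE_SCORES):
    """Average the happiness ratings of the tweet's words that appear in the list."""
    words = tweet.lower().split()
    rated = [scores[w] for w in words if w in scores]
    if not rated:
        return None  # no rated words, so no sentiment estimate
    return sum(rated) / len(rated)

print(tweet_happiness("I am having an awesome day"))            # leans positive
print(tweet_happiness("I hate everything going on right now"))  # leans negative
```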

What is the single most stressful tweet, from a sentiment-analysis point of view?

The process is similar for “stressful” tweets. That is, a stressful tweet may contain several words that are defined as negative in the labMT dictionary. However, it is important to recognize that this method has limitations. For example, “I am extremely stressed out” carries greater negative sentiment than “I am stressed out.” The method described above calculates sentiment from each individual word in a tweet and does not account for adverbs (“extremely”) or the meaning of compound constructions (“extremely stressed out”). Furthermore, this method does not take sarcasm into consideration. We are currently working on deep learning models that can potentially overcome these challenges.
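A short sketch of why word-by-word averaging misses intensifiers: if the adverb “extremely” has a roughly neutral rating (or none at all), adding it can actually pull the average *toward* neutral rather than making the tweet score more negative. The scores here are illustrative placeholders, not published labMT values.

```python
# Sketch of the intensifier limitation: word-by-word averaging treats
# "extremely stressed out" almost the same as "stressed out".
# Scores are illustrative placeholders, not published labMT values.

scores = {"stressed": 2.1, "extremely": 5.0, "out": 5.0}

def average_score(text):
    rated = [scores[w] for w in text.lower().split() if w in scores]
    return sum(rated) / len(rated) if rated else None

print(average_score("I am stressed out"))            # ~3.55
print(average_score("I am extremely stressed out"))  # ~4.03 (counterintuitively *less* negative)
```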

Does the stress response to a tweet change based on inclusion of a graphic or hyperlink?

I think the inclusion of graphics and hyperlinks may change a person’s stress response to a tweet. However, analyzing the meaning of graphics and the content behind hyperlinks can be challenging, as researchers would need to conduct image analysis and review the contents of the linked web pages. This will be an important area of research, as graphics and hyperlinks can provide additional information about a tweet’s meaning and a person’s stress response.

Is there a specific UCLA student handle that you notice garnering a specific amount of emotional response to campus events?

No, we have not identified a UCLA student Twitter handle that garners a particular amount of emotional response to campus events. When we analyze Twitter data, we use data from many students so that we can get an overall representation of what students are saying. That said, examining individual Twitter users could help us determine whether our sampling strategy is biased and whether some of our results fail to generalize beyond the students we sample.

With the data you have collected so far, would it be possible for a company to tailor their messages and sell more effectively to students?

It is possible to design targeted messages using the data we have collected, although we have not focused our research on tailoring messages to sell more effectively to students; Twitter already offers advertising services that help target ads. In our content-analysis study, we focused on how often students tweeted about academic studies, personal health, and social activities throughout the semester. We found that the proportion of academic-related tweets increased, while tweets related to social life decreased, during exam periods compared with non-exam periods. Ethnicity also influenced tweet content: during exam periods, African-American students tweeted less about academic studies than Caucasian students did. We could apply the methods from our content analysis to determine students’ interests, so that messages could be better tailored to specific populations.
