Assessing Sentiment of the Expressed Stance on Social Media

08/08/2019
by Abeer Aldayel, et al.

Stance detection is the task of inferring the viewpoint towards a given topic or entity as either supportive or opposing. One may express a viewpoint towards a topic using positive or negative language. This paper examines how stance is expressed on social media with respect to sentiment polarity. There has been a noticeable misconception about the similarity between stance and sentiment when it comes to viewpoint discovery, where negative sentiment is assumed to mean an against stance and positive sentiment an in-favour stance. To analyze the relation between stance and sentiment, we construct a new dataset with four topics and examine how people express their viewpoints with regard to these topics. We validate our results by carrying out a further analysis of the popular SemEval stance benchmark dataset. Our analyses reveal that sentiment and stance are not highly aligned, and hence simple sentiment polarity cannot be used on its own to denote a stance toward a given topic.


1 Introduction

Stance can be defined as the expression of an individual's standpoint toward a proposition [biber1988adverbial]. Detecting the stance towards an event is a sophisticated process in which various factors, including personal and social aspects, play a role in discovering the viewpoint. Most studies in this area have focused on using textual elements of users' posts, such as the sentiment of the text, to infer the stance [somasundaran2010recognizing, elfardy2016cu, ebrahimi_joint_2016]. While the goal of stance detection is to determine the favorability towards a given entity or topic [mohammad_stance_2017], sentiment analysis aims to determine whether the emotional state of a given text is positive, negative, or neutral [liu2010sentiment]. There is a rich body of research in which sentiment has been used on its own to discover the viewpoints towards an event [lee2018using, overbey2017linkin, unankard2014predicting, tsolmon2012extracting]. These studies expected sentiment polarity to indicate the stance. However, another line of research develops stance-specific models to infer viewpoints, in which sentiment is neglected [kareem_2019, darwish_improved_2017, trabelsi2018unsupervised]. Dependence on sentiment as the sole factor for stance prediction has been found to be suboptimal, which might indicate a weak relation between sentiment and stance [mohammad_stance_2017, elfardy2016cu].

Accordingly, it becomes important to examine the relation between sentiment and stance for viewpoint discovery toward an event. This leads us to pose the following research questions:

  • RQ1: Can sentiment polarity be used to capture the stance towards an event?

  • RQ2: How does sentiment align with stance? When does positive/negative sentiment indicate support/against stance?

These questions aim to identify whether sentiment can substitute for stance by studying the polarity of the expressed stance. In other words, this study examines whether supporting/opposing stances can be identified by positive/negative sentiment. To answer these questions, we used the SemEval stance dataset [mohammad_semeval-2016_2016], the popular stance dataset that contains both sentiment and stance labels. To further validate the results, we constructed a new stance detection dataset that contains about 6,000 tweets on four topics, annotated with gold labels for sentiment and stance. This dataset contains the parent tweets along with the reply tweets, which provides contextualized information for the annotators and helps in judging the sentiment and stance of the reply tweets. We then analyze the datasets to determine the degree of correlation between sentiment polarity and the gold-label stance.

2 Related work

In the literature, sentiment has been widely used either to infer public opinion directly or as a factor to help in detecting the stance towards an event. The next sections illustrate these cases, focusing on studies of stance towards an event in which simple sentiment has been used, either through a sentiment lexicon or the textual polarity of the text.

2.1 Sentiment as stance

Sentiment has been used interchangeably with stance for viewpoint detection [park2011politics, hu2013listening, smith2017analyzing, lee2018using, unankard2014predicting, tsolmon2012extracting, agarwal2018geospatial]. In these studies, sentiment polarity has been used as the only factor to detect the viewpoint towards various events on social media. For instance, the work of [smith2017analyzing] used sentiment to investigate the opinion towards the terrorist attack in Paris in November 2015. They used annotators from Crowdflower to label the sentiment (negative, positive or neutral) expressed in each tweet and used these labels to analyse the public reaction toward the attack. In a study by [park2011politics], sentiment was used to discover the political leaning of commenters on news articles. In their study, a sentiment profile was constructed for each commenter to help track their polarity toward a political party; for instance, a liberal commenter posts negative comments on conservative articles and positive comments on liberal articles.

A more recent study by [lee2018using] used sentiment to examine the opinions following the release of James Comey's letter to Congress before the 2016 US presidential election day. That study categorized the 25 most common hashtags by sentiment polarity towards Hillary Clinton and Trump. Furthermore, the work of [unankard2014predicting] used sentiment to analyze the political preferences of users for the 2013 Australian federal election. For the sentiment, they recruited three annotators to label each tweet with a polarity score (positive, negative or neutral). They used aspect-level sentiment to predict users' political preferences and overlooked cases where the sentiment is negative but the stance expresses a supporting viewpoint.

Another study [tsolmon2012extracting] developed an opinion-score equation based on a sentiment lexicon and term frequency to infer users' opinions towards events extracted from their timelines. In addition, the work of [hu2013listening] designed a topic-sentiment matrix to infer the crowd's opinion. Another recent study by [agarwal2018geospatial] used the AFINN-111 dictionary for sentiment analysis and took sentiment polarity as an indication of the opinion towards Brexit. All of the above studies treated sentiment as the indicator of the stance toward the event under analysis.

2.2 Sentiment as proxy for stance

Another line of research used sentiment as a feature to predict the stance [somasundaran2010recognizing, elfardy2016cu, ebrahimi_joint_2016, mohammad_stance_2017]. In the popular SemEval stance dataset [mohammad_semeval-2016_2016], the tweets are labeled with both sentiment and stance to provide a public benchmark for evaluating stance detection systems. In their work, the authors showed that sentiment features are useful for stance classification when combined with other features, but not when used alone. The work of [ebrahimi_joint_2016] used an undirected graphical model that leverages interactions between sentiment and the target of the stance to predict the stance. Also, the work of [somasundaran2010recognizing] developed a stance classifier that used sentiment and arguing expressions, combining a sentiment lexicon with an arguing lexicon, and outperformed a unigram-feature system. In [igarashi_tohoku_2016], SentiWordNet was used to produce a sentiment score for each word, and that score was used along with other features to predict the stance on the SemEval stance dataset, in comparison with a CNN stance model; the feature-based model was found to perform better at detecting stance. The work of [krejzl_uwb_2016] used surface-level, sentiment and domain-specific features to predict the stance on the SemEval stance dataset. Overall, sentiment used in conjunction with other features helps in predicting the stance, but it is not dependable as the only feature.

The work of [mohammad_stance_2017, sobhani2016detecting] studied the extent to which sentiment is correlated with stance in terms of enhancing a stance classifier. The main focus of that work was to investigate the best features for a stance classification model. They concluded that sentiment can be beneficial for stance classification, but only when combined with other factors.

This study investigates another dimension of the sentiment-stance relation, focusing on gauging the alignment between sentiment and stance by analysing in depth how the stance is expressed in conjunction with the sentiment.

SemEval stance                     #        CD stance            #
Atheism (A)                         733     Antisemitic (AS)     1050
Climate Change is Concern (CC)      564     Gender (G)           1050
Feminist Movement (FM)              949     Immigration (I)      3174
Hillary Clinton (HC)                934     LGBTQ (L)            1050
Legalization of Abortion (LA)       883
Total                              4063     Total                6324

Table 1: Number of tweets for each topic.

3 Data collection

We study the sentiment polarity of the expressed stance. To accomplish this, we used the SemEval stance dataset, which contains about 4,000 tweets on five topics: Atheism (A), Climate Change (CC), the Feminist Movement (FM), Hillary Clinton (HC) and the Legalisation of Abortion (LA). Furthermore, we designed a context-dependent (CD) stance dataset that contains 6,324 reply tweets covering four controversial topics: Antisemitic (AS), Gender (G), Immigration (I) and LGBTQ (L). Table 1 shows the distribution of tweets with respect to each topic. In this dataset, each tweet was annotated by five annotators using the Figure Eight platform (https://figure-eight.com/), and the label with the majority vote was assigned. We used the same annotation guidelines as the SemEval stance dataset [mohammad_semeval-2016_2016]. Since the CD dataset consists entirely of reply tweets, the parent tweet was provided to the annotators along with the reply tweet, so that they could understand the context of the conversation and better judge the sentiment and stance.
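As an illustration of the aggregation step, the sketch below shows one way to derive a majority-vote gold label from five annotator judgements per tweet. The column names and the pandas-based layout are assumptions for the example, not the authors' actual annotation pipeline.

```python
import pandas as pd

# Hypothetical annotation table: one row per (tweet, annotator) judgement.
annotations = pd.DataFrame({
    "tweet_id": [1, 1, 1, 1, 1, 2, 2, 2, 2, 2],
    "stance":   ["FAVOR", "FAVOR", "AGAINST", "FAVOR", "NONE",
                 "AGAINST", "AGAINST", "NONE", "AGAINST", "AGAINST"],
})

def majority_vote(labels: pd.Series) -> str:
    """Return the most frequent label among the annotators.

    Ties would need an explicit tie-breaking rule (e.g. discarding the tweet);
    value_counts() simply returns the first of any tied labels here.
    """
    return labels.value_counts().idxmax()

gold_stance = annotations.groupby("tweet_id")["stance"].apply(majority_vote)
print(gold_stance)   # tweet 1 -> FAVOR, tweet 2 -> AGAINST
```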

4 Methodology

4.1 Analysis of the correlation patterns

To gain insight into how the stance is expressed, we first analyze the distribution of stance and sentiment at the topic level. Figures 1a and 1b illustrate the stance and sentiment distributions in the SemEval stance dataset and the CD stance dataset, respectively. Overall, negative sentiment constitutes the major polarity for most topics. This reveals a tendency to use negative sentiment to express a viewpoint on a controversial topic. It can be observed that for climate change the supporting stance constitutes about 59% of tweets, yet tweets with negative sentiment constitute 50%. Furthermore, 30% of the LGBTQ tweets show negative sentiment, while only 7% of the tweets express an opposing stance. From these numbers, it is clear that sentiment does not simply represent stance.

Figure 1: The distribution of sentiment and stance with respect to each topic. (a) SemEval stance; (b) CD stance.

Figure 2 illustrates the sentiment distribution over the stance labels in the two datasets. The graphs show that negative sentiment constitutes the major polarity over both the Favor and Against stances: negative sentiment represents over 56% and 54% of the supporting stance in the SemEval and CD stance datasets, respectively. These results reveal the tendency to use negative sentiment to express a viewpoint towards a controversial topic.

Figure 2: Distribution of sentiment for a given stance. (a) SemEval stance; (b) CD stance.
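The row-normalised distributions shown in Figure 2 can be reproduced from the gold labels with a simple cross-tabulation. The sketch below assumes a dataframe with `stance` and `sentiment` columns; it is illustrative only, not the authors' analysis code.

```python
import pandas as pd

# Hypothetical gold-label table; in practice this would be loaded from the
# SemEval or CD stance dataset files (the column names are assumed).
df = pd.DataFrame({
    "stance":    ["FAVOR", "FAVOR", "AGAINST", "AGAINST", "NONE", "FAVOR"],
    "sentiment": ["neg",   "pos",   "neg",     "neg",     "neu",  "neg"],
})

# Row-normalised cross-tabulation, i.e. the share of each sentiment label
# within each stance class (the quantity plotted per stance in Figure 2).
sentiment_given_stance = pd.crosstab(df["stance"], df["sentiment"],
                                     normalize="index")
print(sentiment_given_stance.round(2))
```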

Table 2 shows some examples where the sentiment does not reflect the stance. Examples 1 and 2 show tweets with an opposing viewpoint towards their targets while using positive sentiment. Examples 3 and 4 show the opposite situation, where the expressed stance is supporting while the sentiment is negative.

These results show that sentiment fails to capture the real stance toward a topic. There is a clear mismatch between negative/positive sentiment and supporting/against stance. Even with the dominance of negative sentiment in most of the topics, the overall stance shows a mixture of supporting viewpoints.

#   Tweet                                                                                    Target   Sent.   Stance
1   Life is our first and most basic human right.                                            LA       +       Against
2   @realDonaldTrump Thank you for protecting our border                                     I        +       Against
3   The biggest terror threat in the World is climate change #drought #floods                CC       -       Favor
4   In the big picture, religion is bad for society because it blunts reason. #freethinker   A        -       Favor

Table 2: Differences between sentiment and stance. Targets: Legalization of Abortion (LA), Immigration (I), Atheism (A), Climate Change (CC).

5 Discussion

Our first research question, concerning whether sentiment captures the real stance, can be answered in the negative. The preceding analysis shows that sentiment cannot substitute for stance in general. A word-choice gap exists between the in-favor stance and positive sentiment (Appendix A). We also observed that sentiment failed to discover the public opinion towards most of the topics in the two datasets. Hence, using sentiment polarity as the only factor to predict public opinion potentially leads to misleading results. The mismatch between the in-favor stance and positive sentiment was sizeable: positive sentiment failed to distinguish the supporting viewpoints.

As for the overall alignment between sentiment and stance, there is a noticeable disparity between the two for a given topic. In general, the sentiment of an expressed stance tends to be negative, used to rebut or defend a viewpoint whether the stance is supporting or opposing. Negative sentiment could help in discovering some of the against stances, but it will be mixed with a proportion of the supporting viewpoints.

In summary, our analysis illustrates the sophisticated nature of stance detection and shows that stance cannot simply be captured using sentiment polarity. This finding is crucial, especially when assessing the credibility of results in studies that used sentiment to measure public support for a given topic on social media.

6 Conclusion

In this paper, we study the relation between sentiment and stance. To gauge the extent of this relation, we constructed a new stance dataset with gold sentiment and stance labels. We then conducted a textual and quantitative analysis of the expressed stance with respect to the sentiment polarity. Our study provides evidence that sentiment cannot substitute for stance. As a final consideration, researchers should be more cautious when identifying viewpoints toward an event and should take into account the clear difference between sentiment and stance, as using sentiment alone overshadows the real stance and leads to misleading results.

References

Appendix A Analysis of the textual patterns

To gauge the similarity between the vocabulary used to express sentiment and that used to express stance, we analyzed the tweets in the two datasets using Jaccard similarity. We used the Jaccard coefficient, a widely adopted measure of the overlap between two sets [an2019political, achananuparp2008evaluation, gomaa2013survey]. In this analysis, for each sentiment and stance gold label we combine all tweets and use Term Frequency-Inverse Document Frequency (TF-IDF) to find the important words for each type of sentiment and stance. In order to compute the TF-IDF at the tweet level, we consider each tweet as a document. Using TF-IDF helps in filtering out less significant words. The Jaccard similarity between the sets of sentiment and stance words is defined as follows:

J(W_{\mathrm{sent}}, W_{\mathrm{stance}}) = \frac{|W_{\mathrm{sent}} \cap W_{\mathrm{stance}}|}{|W_{\mathrm{sent}} \cup W_{\mathrm{stance}}|}    (1)

where W_{\mathrm{sent}} and W_{\mathrm{stance}} denote the sets of the top N words by TF-IDF value for the tweets with a specific sentiment label and stance label, respectively.
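A minimal sketch of this computation is shown below, assuming the tweets are already grouped by gold label and using scikit-learn's TfidfVectorizer as one possible implementation; the helper names and example tweets are illustrative, not the authors' code.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

def top_n_words(tweets, n=100):
    """Top-N words by summed TF-IDF score, treating each tweet as a document."""
    vectorizer = TfidfVectorizer()
    tfidf = vectorizer.fit_transform(tweets)      # (n_tweets, n_terms) sparse matrix
    scores = tfidf.sum(axis=0).A1                 # aggregate TF-IDF score per term
    vocab = vectorizer.get_feature_names_out()
    ranked = sorted(zip(vocab, scores), key=lambda pair: pair[1], reverse=True)
    return {word for word, _ in ranked[:n]}

def jaccard(a, b):
    """Jaccard similarity |A ∩ B| / |A ∪ B| between two word sets."""
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Hypothetical inputs: all tweets sharing a given gold stance / sentiment label.
favor_tweets = ["climate change is the biggest threat we face",
                "equal rights are human rights"]
positive_tweets = ["thank you for protecting our border",
                   "what a wonderful day for our country"]

similarity = jaccard(top_n_words(positive_tweets), top_n_words(favor_tweets))
print(f"Jaccard(positive, favor) = {similarity:.2f}")
```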

Figure 3: Jaccard similarity of the top N most frequent words between sentiment and stance. (a) SemEval stance; (b) CD stance.

Figure 3 shows that the words used to express a Favor stance have less than 20% similarity with the words of tweets carrying positive sentiment. That means users tend to express a Favor stance without using positive sentiment words. In contrast, the common words of the Against stance have the highest similarity with negative sentiment words. The Jaccard similarity becomes stable as N grows. Figure 4 shows that the overall agreement between sentiment and stance is small in general: tweets with against-negative labels constitute less than 33% of the data, and similarly less than 8% of the data has positive sentiment and a Favor stance. This shows that, in general, negative words tend to resemble against words, while fully matching cases remain scarce. The matching cases where a tweet expresses a Favor stance with positive sentiment constitute about 8.9% and 4% of the overall data of the SemEval stance and CD stance datasets, respectively.

Figure 4: Tweets with matching and mixed stance and sentiment. (a) SemEval stance; (b) CD stance.
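The matching fractions summarised in Figure 4 can be recovered from the gold labels with a check like the following sketch (again assuming a dataframe with `stance` and `sentiment` columns; not the authors' code):

```python
import pandas as pd

def matching_fractions(df: pd.DataFrame) -> pd.Series:
    """Share of tweets whose sentiment 'matches' their stance:
    Favor with positive sentiment, Against with negative sentiment."""
    favor_pos = ((df["stance"] == "FAVOR") & (df["sentiment"] == "pos")).mean()
    against_neg = ((df["stance"] == "AGAINST") & (df["sentiment"] == "neg")).mean()
    return pd.Series({"favor & positive": favor_pos,
                      "against & negative": against_neg,
                      "mixed / other": 1.0 - favor_pos - against_neg})

# Reusing the hypothetical gold-label dataframe from the earlier sketch:
df = pd.DataFrame({
    "stance":    ["FAVOR", "FAVOR", "AGAINST", "AGAINST", "NONE", "FAVOR"],
    "sentiment": ["neg",   "pos",   "neg",     "neg",     "neu",  "neg"],
})
print(matching_fractions(df))
```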