Elsevier

Computers in Human Behavior

Volume 51, Part B, October 2015, Pages 1198-1204

Collective attention in the age of (mis)information

https://doi.org/10.1016/j.chb.2015.01.024

Highlights

  • How 2.3 million Facebook users consumed different information.

  • Qualitatively different information is consumed in a similar way.

  • Users more prone to interact with false claims are usually exposed to conspiracy rumors.

Abstract

In this work we study, on a sample of 2.3 million individuals, how Facebook users consumed different information at the edge of political discussion and news during the last Italian electoral competition. Pages are categorized, according to their topics and the communities of interest they pertain to, as (a) alternative information sources (diffusing topics that are neglected by science and mainstream media); (b) online political activism; and (c) mainstream media. We show that attention patterns are similar despite the different qualitative nature of the information, meaning that unsubstantiated claims (mainly conspiracy theories) reverberate for as long as other information. Finally, we classify users according to their interaction patterns across the different topics and measure how they responded to the injection of 2788 items of intentionally false information. Our analysis reveals that users who interact primarily with conspiracist information sources are more prone to interact with intentionally false claims.

Introduction

The quantitative understanding of social dynamics allowed by the unprecedented availability of digital traces is far from trivial (Conte et al., 2012, Lazer et al., 2009). The growth of knowledge fostered by an interconnected world, together with the unprecedented acceleration of scientific progress, has exposed society to an increasing level of complexity in explaining reality and its phenomena. Meanwhile, a paradigm shift in the production and consumption of content has occurred, changing the quality of information.

Indeed, on the Web everyone can access and produce a variety of content, actively participating in the creation, diffusion, and reinforcement of worldviews. Furthermore, this wide availability of user-provided content has fostered the massive recruitment of people around common interests, worldviews, and narratives, thereby affecting the evolution of public opinion.

Conspiracy theories, in particular, find on the Internet a natural medium for their diffusion and, not rarely, trigger collective counter-conspiratorial actions (Atran and Ginges, 2012, Lewandowsky et al., 2013). Narratives grounded in conspiracy theories tend to reduce the complexity of reality and are able to contain the uncertainty they generate (Byford, 2011, Hogg and Blaylock, 2011, Fine et al., 2005). They can create a climate of mistrust or lead to disengagement from mainstream society or from officially recommended practices (Bauer, 1997).

We do not claim that conspiracy theories are all false; however, by their nature as containers of uncertainty, they are based on partial evidence and intuition, and are often the result of association rather than deduction.

In this respect, conspiracists tend to explain significant social or political aspects as plots conceived by powerful individuals or organizations (Sunstein & Vermeule, 2009). As these kinds of arguments can sometimes involve the rejection of science, alternative explanations are invoked to replace the scientific evidence. For instance, people who reject the link between HIV and AIDS generally believe that AIDS was created by the U.S. Government to control the African American population (Bogart and Thorburn, 2005, Kalichman, 2009). Since unsubstantiated claims are proliferating over the Internet, what could happen if they were used as the basis for policy making? What about their potential effect on public opinion?

The role of the socio-technical system in enforcing informed debates, and its effects on public opinion, still remains unclear. However, the World Economic Forum, in its 2013 report (Howell, 2013), listed “massive digital misinformation” as one of the main risks for modern society. People's perceptions, knowledge, beliefs, and opinions about the world and its evolution get (in)formed and modulated through the information they can access, most of which comes from newspapers, television (Mccombs & Shaw, 1972), and, more recently, the Internet. The World Wide Web, more specifically social networks and micro-blogging platforms, has changed the way we can pursue intellectual growth or shape ideas. In particular, large social networks, with their user-provided content, have been facilitating the study of how the economy of attention leads to specific patterns for the emergence, production, and consumption of information (Dow et al., 2013, Lanham, 2007, Qazvinian et al., 2011).

Despite the enthusiastic rhetoric about the ways in which new technologies have spurred interest in debating politically or socially relevant issues (Bekkers et al., 2011, Crespi, 1997, Garcia et al., 2012, Gonzalez-Bailon et al., 2011, Guillory et al., 2011, Lippmann, 1946), the effect of the socio-technical system on informed debate remains unclear. Indeed, the emergence of knowledge from this process has been dubbed collective intelligence (Shum et al., 2012, Levy, 1999, Malone and Klein, 2007, Shadbolt et al., 1987), although we have become increasingly aware of the presence of unsubstantiated or untruthful rumors. Often driven mainly by the pursuit of audience, false information is particularly pervasive on social media, sometimes fostering a sort of collective credulity.

A multitude of mechanisms animate the flow and acceptance of false rumors (Kuklinski, Quirk, Jerit, Schwieder, & Rich, 2000), which in turn create false beliefs that are rarely corrected once adopted by an individual (Ayers and Reder, 1998, Garrett and Weeks, 2013, Koriat et al., 2000, Meade and Roediger, 2002). The process of acceptance of a claim (whether documented or not) may be altered by normative social influence or by coherence with the individual's system of beliefs (Frenda et al., 2011, Zhu et al., 2010). On the other hand, basic questions remain about how the quality of (mis)information affects the economy of attention. A large body of literature addresses the study of social dynamics on socio-technical systems (Adamic and Glance, 2005, Friggeri et al., 2014, Hannak et al., 2014, Kleinberg, 2013, Lewis et al., 2012, Mocanu et al., 2013, Onnela and Reed-Tsochas, 2010, Ugander et al., 2012).

We observed that information-based communities aggregate around shared narratives, and that the debate among them contributes to the proliferation of political pages and alternative information sources that aim to exploit the peculiarities of the Internet to organize and convey public discontent (with respect to the crisis and the decisions of the national government).

Furthermore, we noticed the emergence of a very distinct group, namely trolls, building Facebook pages as parodistic imitations of both alternative information sources and online political activism. Their activities range from controversial comments and the posting of satirical content mimicking alternative news sources, to the fabrication of purely fictitious statements, heavily unrealistic and sarcastic. Not rarely, these memes became massively diffused and were used as evidence in online debates by political activists (Ambrosetti, 2013).

Inspired by these lively and controversial social dynamics at the edge between virality and credulity, we addressed the quantitative analysis of the interplay between information sources (conspiracist news and mainstream news) and political discussion on the web. In particular, we want to understand the selection criteria of users most exposed to unsubstantiated claims.

We will first introduce our methodology for categorizing the Facebook pages, taking into account their self-description as well as the type of content they promote. We concentrate on alternative news sources, online political activism, and all the national mainstream news journals that we could find to have an active page on Facebook. In the following sections, through thorough quantitative analysis, we show that attention patterns when faced with various contents are similar despite the different qualitative nature of the information, meaning that unsubstantiated claims reverberate as long as other, more verified, information. Finally, we measure how the social ecosystem responded to the perturbation of false information injected by trolls. We find that the dominant fraction of users interacting with the troll memes consists of users who interact predominantly with alternative information sources – and are thus more exposed to unsubstantiated claims. Consumers of alternative news, who are the users trying to avoid mainstream media ‘mass-manipulation’, are the most responsive to the injection of false claims.
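The classification of users by their dominant interaction pattern, as described above, can be sketched in a few lines. This is a minimal illustration, not the authors' actual pipeline: the user ids, category labels, and toy interaction records below are all hypothetical, and the rule assumed here is simply "label each user with the page category receiving most of their interactions".

```python
from collections import Counter

# Hypothetical interaction records: (user_id, page_category).
# The categories mirror the paper's taxonomy: alternative news,
# online political activism, and mainstream media.
interactions = [
    ("u1", "alternative"), ("u1", "alternative"), ("u1", "mainstream"),
    ("u2", "political"), ("u2", "political"),
    ("u3", "mainstream"),
]

def dominant_category(records):
    """Label each user with the category receiving most of their
    interactions (comments, likes, or likes to comments)."""
    per_user = {}
    for user, category in records:
        per_user.setdefault(user, Counter())[category] += 1
    return {user: counts.most_common(1)[0][0]
            for user, counts in per_user.items()}

labels = dominant_category(interactions)
# u1 is labeled "alternative": 2 of their 3 interactions target
# alternative news pages.
```

Users labeled this way can then be cross-referenced against the set of accounts that interacted with the injected troll memes, which is the comparison the paper's final analysis rests on.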

Section snippets

Ethics statement

The data are publicly available as they come from a public online social site (Facebook). However, all information has been analyzed anonymously and in aggregated form. The entire data collection process has been performed exclusively with the Facebook Graph API (Facebook, 2013), which is publicly available, and for the analysis (according to the specification settings of the API) we used only publicly available data (users with privacy restrictions are not included in the dataset). The pages from

Data collection

The debate around relevant social issues spreads and persists over the web, leading to the emergence of unprecedented social phenomena such as the massive recruitment of people around common interests, ideas or political visions. Disentangling the many factors behind the influence of information sources on social perception is far from trivial. Specific knowledge about the cultural and social context (even if online) in which they manifest is fundamental. Hence, inspired by the success of

Attention patterns

Our analysis starts by providing an outline of users’ attention patterns with respect to different information coming from distinct sources – i.e., alternative news, mainstream media, and political activism. As a first measure, we count the number of interactions (comments, likes, or likes to comments) by users and plot the cumulative distribution function (CDF) of the users’ activity on the various page categories in Fig. 1. The CDF shows that user interactions with posts on all different types of
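The first measure above – the empirical CDF of per-user activity – can be sketched directly from an interaction log. This is a toy illustration under stated assumptions, not the paper's code: the activity log below is hypothetical, and `empirical_cdf` implements the standard definition F(x) = fraction of users with at most x interactions.

```python
from collections import Counter

# Hypothetical activity log for one page category: one entry per
# interaction (comment, like, or like to a comment), keyed by user id.
activity = ["u1"] * 5 + ["u2"] * 2 + ["u3"] * 2 + ["u4"]

def empirical_cdf(counts):
    """Return (x, F(x)) pairs, where F(x) is the fraction of users
    whose total activity is <= x."""
    xs = sorted(set(counts))
    n = len(counts)
    return [(x, sum(1 for c in counts if c <= x) / n) for x in xs]

per_user = list(Counter(activity).values())  # interactions per user
cdf = empirical_cdf(per_user)
# Here F(2) = 0.75: three of the four users made at most two interactions.
```

Computing one such CDF per page category and overlaying the curves is what allows the attention patterns of the three categories to be compared on equal footing.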

Conclusions

Conspiracists generally tend to explain significant social or political aspects as secret plots by powerful individuals or organizations (Sunstein & Vermeule, June 2009), and their activity is proliferating over the web. This study provides a genuine outline of online social dynamics and, in particular, of the effect of Facebook in amplifying the diffusion of false beliefs when truthful and untruthful rumors coexist. Several cultures coexist, each one competing for the attention of users.

Additional information

The authors declare no competing financial interests.

Acknowledgments

Funding for this work was provided by the authors’ institutions (IMT Lucca Institute for Advanced Studies, Northeastern University), EU FET project MULTIPLEX No. 317532. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

We want to thank Alessandro Vespignani, Rosaria Conte, Mario Paolucci, Santo Fortunato, Brian Keegan, Piotr Sapiezynski and Gianni Riotta for useful discussions. Special thanks go to Dino Ballerini, Elio

References (45)

  • B. Zhu et al.

    Individual differences in false memory from misinformation: Personality characteristics and their interactions with cognitive abilities

    Personality and Individual Differences

    (2010)
  • Adamic, L. & Glance, N. (2005). The political blogosphere and the 2004 U.S. election: Divided they blog. In LinkKDD 05:...
  • Ambrosetti, G. (2013). I forconi: il senato ha approvato una legge per i parlamentari in crisi. chi non verrà rieletto,...
  • S. Atran et al.

    Religious and sacred imperatives in human conflict

    Science

    (2012)
  • M. Ayers et al.

    A theoretical review of the misinformation effect: Predictions from an activation-based memory model

    Psychonomic Bulletin & Review

    (1998)
  • M. Bauer

    Resistance to new technology: Nuclear power, information technology and biotechnology

    (1997)
  • V. Bekkers et al.

    New media, micromobilization, and political agenda setting: Crossover effects in political mobilization and media usage

    The Information Society

    (2011)
  • L.M. Bogart et al.

    Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans?

    JAIDS Journal of Acquired Immune Deficiency Syndromes

    (2005)
  • J. Byford

    Conspiracy theories: A critical introduction

    (2011)
  • R. Conte et al.

    Manifesto of computational social science

    European Physical Journal Special Topics EPJST

    (2012)
  • I. Crespi

    The public opinion process. How the people speak

    (1997)
  • P.A. Dow et al.
    (2013)
  • Facebook (2013). Using the graph api. Website, 8 2013. Last checked...
  • Fine, G. A., Campion-Vincent, V., & Heath, C. (2005). Rumor mills: The social impact of rumor and legend. Social...
  • B. Franks et al.

    Conspiracy theories as quasi-religious mentality: An integrated account from cognitive science, social representations theory, and frame theory

    Frontiers in Psychology

    (2013)
  • S.J. Frenda et al.

    Current issues and advances in misinformation research

    Current Directions in Psychological Science

    (2011)
  • Friggeri, A., Adamic, L., Eckles, D., & Cheng, J. (2014). Rumor cascades. In Proceedings of the 8th international AAAI...
  • Garcia, D., Mendez, F., Serdült, U., & Schweitzer, F. (2012). Political polarization and popularity in online...
  • Garrett, R. K. & Weeks, B. E. (2013). The promise and peril of real-time corrections to political misperceptions. In...
  • Gonzalez-Bailon, S., Borge-Holthoefer, J., Rivero, A., & Moreno, Y. (2011). The dynamics of protest recruitment through...
  • Guillory, J., Spiegel, J., Drislane, M., Weiss, B., Donner, W., & Hancock, J. (2011). Upset now? Emotion contagion in...
  • Hannak, A., Margolin, D., Keegan, B., & Weber, I. (2014). Get back! You don’t know me like that: The social mediation...
