Misinformation and the desire to go viral – Essay


In December of 2023, Statistics Canada released a report stating that, compared to three years ago, 43% of Canadians are finding it harder to distinguish between true and false information on the internet. The result is an increased level of distrust in the information that individuals see in their chosen news sources. With “Misinformation/Trust in media” being one of the indicators in the Canadian government’s quality of life framework, it may be safe to conclude that the spread of misinformation and disinformation is a threat to our society – yet the issue becomes more challenging to combat as technology grows more sophisticated.

Social media: evaluating platform engagement and its role in the spread of misinformation

While I believe that the spread of misinformation is a byproduct of many factors – such as a perceived increase in the difficulty of fact-checking information, or more people interacting with algorithms that distort their perceptions of the real world – my interest lies in the habits formed around the way social media platforms currently facilitate user engagement, and the relationship between those habits and the spread of misinformation.

Those who are involved with the creation, dissemination, and curation of digital content should not only be aware of the consequences of the content they create, but should also be mindful of its medium and presentation. Once content is released into the world, its creators no longer control how it is interacted with; however, by understanding the prevalent ways in which content spreads across the internet, digital content creators can avoid becoming complicit in social media systems where accuracy of information is treated as a lesser priority than the engagement potential of a post.

The general social media usage cycle

Across platforms such as Facebook, X (formerly Twitter), and TikTok – cited by CTV as the three websites where sharing of misinformation is most prevalent – the high-level engagement cycle follows similar steps:

  1. A user uploads or shares content.
  2. Other users view this content and may interact with it in forms including, but not limited to, likes, shares, or comments.
  3. The uploader or sharer can measure how many people their posts have reached by receiving data on the number of likes, shares, and other forms of engagement.

People just like the reward of likes, comments, and shares

In January of 2023, the University of Southern California released a research report stating that patterned engagement structures such as the one above make people susceptible to sharing information with the sole objective of getting engagement from others. By measuring the performance of posts through ‘likes’ and ‘shares’, the user is rewarded with a tangible measurement of how many people their posts have reached and resonated with.

With regard to misinformation, it can be speculated that emotionally provocative content spreads faster because it draws more frequent attention from the online public. Furthermore, because people use social media to find information that is new to them or that challenges their perspective, the content that goes viral is often misinformation or disinformation. For those who share content, the prize for sharing eye-catching information is the attention that ensues.

Rewarding people for accuracy, not engagement

Facebook, X, and TikTok are all designed with shorter, more frequent feedback loops than other websites, such as news platforms – all intended to hold your attention on the app for as long as possible. This reveals a misalignment with the goal of preventing the spread of misinformation: people are obtaining and sharing news and other informational content on platforms that were never designed to prioritize content accuracy. The flaw is that citizens are seeking accurate information on mediums designed for engagement and short-form feedback.

One suggestion to combat this system, though it may not align with company goals, is to slow the speed at which people can share content after viewing it. If the rapid spread of misinformation can be attributed to habit-building, then perhaps the reverse can be established – using social media to build habits of assessing the accuracy of content before sharing it.

Wikipedia may be observed as an example of a website where a community is rewarded for information accuracy. In keeping with its identity as an “online encyclopaedia”, and as stated in its five principles, Wikipedia’s priority – contrary to that of social media websites – is to facilitate the characterization of information and issues, rather than to drive user interaction through debate and discussion. Yet its modes of community engagement resemble those of social media platforms in that “Wikipedians” are able to send each other “thumbs-up”s, badges, and editing awards. Through this concrete prioritization of its values and an engagement model that encourages careful contribution, Wikipedia has been successful in facilitating a community of users that see value in incentivizing information accuracy over information virality.


In understanding the factors that drive misinformation, it is important to acknowledge that part of the problem lies not in the information itself, but in the medium through which the information is shared. By observing the current state of user engagement habits in the spread of misinformation, and contrasting them with possible reward structures that could be implemented in their place, we may come to understand how a social media platform that incentivizes information accuracy could be constructed, as a measure to combat the spread of misinformation.