How do social media algorithms affect users’ interactions with one another on these sites?

by Carson Weekley

 

Throughout the twenty-first century, the world has seen a technological revolution with the rise of the internet and the global connectivity that came with it. Along with these technologies came a new demand for people to interact with one another, and multiple new companies stepped in to fill it: Facebook, Twitter, Instagram, TikTok, Reddit, and so on—what we now know as social media. These companies created platforms for people around the world to interact with one another, all using similar formats wherein a user creates a post that can be seen, shared, liked, and commented on by millions of other people who also use the platform. Describing social media sites to someone who has never seen them before might create the impression that these platforms are breeding grounds for constructive discussions about the important issues facing society today. However, this could not be further from the truth. These platforms are run by algorithms that control the experience of their users, and those algorithms end up creating echo chambers and polarization that stifle productive dialogue.

The fact that these are “social” platforms allows a number of problems to arise, starting with the higher-stakes experience for the users. As Nicholas Carr points out, “Because we’re often using our computers in a social context… our social standing is, in one way or another, always in play, always at risk. The resulting self-consciousness—even, at times, fear—magnifies the intensity of our involvement with the medium.” However, it’s not just that users are on high alert because of self-consciousness. The entire business model of these social media companies is to increase the engagement of their users. “This isn’t a unique observation, but it’s a crucial one: If you’re not paying for the product, the product is you” (McFarlane). In their early days, social media platforms such as Facebook, Instagram, and Twitter would show a user only the posts from the people that user followed, and the posts would arrive on what’s known as the “feed” in order of recency, with the most recent posts showing up first and older posts showing up later. However, as these companies developed their business models, the structure of the platforms was altered. If the corporations can find ways to persuade users to spend more time on the platform, then the users see and interact with more advertisements, and advertisers will pay the companies more money for this attention. With this incentive, the social media companies have developed algorithms that sort through all the data about how users interact with content, and they have optimized these algorithms to increase user engagement. This means that the old model of showing posts in the order they were posted had to go out the window, as it was not the most effective way to keep people engaged. These algorithms found that a user is likely to spend more time on the platform if he or she is shown posts similar to ones that user previously liked, commented on, or otherwise engaged with. And because users are more likely to engage with posts that appeal to the beliefs they already have, the algorithms feed each user content that aligns with those existing beliefs, even if it comes from an account the user doesn’t follow. This effect compounds over and over as users spend more time on the platform, until eventually users are logging on to their social media accounts and finding only information that confirms their prior beliefs, with no facts or arguments to the contrary.
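To make this mechanism concrete, here is a minimal sketch in Python that contrasts the old chronological feed with an engagement-optimized one. Everything in it is made up for illustration—the Post and User structures, the engaged_topics counter, the function names—and it is not an implementation of any real platform’s system; it only shows the ranking criterion changing from recency to past engagement.

    from dataclasses import dataclass, field

    @dataclass
    class Post:
        author: str
        topic: str
        timestamp: int  # larger value = more recent

    @dataclass
    class User:
        follows: set  # accounts the user follows
        engaged_topics: dict = field(default_factory=dict)  # topic -> past likes/comments

    def chronological_feed(user, posts):
        # The original model: only posts from followed accounts, newest first.
        visible = [p for p in posts if p.author in user.follows]
        return sorted(visible, key=lambda p: p.timestamp, reverse=True)

    def engagement_feed(user, posts):
        # The engagement-optimized model: every post is eligible, and posts on
        # topics the user has engaged with before are ranked to the top.
        return sorted(posts, key=lambda p: user.engaged_topics.get(p.topic, 0), reverse=True)

Real systems are far more elaborate, using machine-learned predictions of how likely a user is to click, watch, or reply, but the consequence is the same as in this toy version: whatever a user has engaged with before is what rises to the top of the feed, whether or not it comes from an account the user follows.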

This effect creates something known as an echo chamber. An echo chamber can be defined as a group setting in which a person is surrounded exclusively by beliefs or opinions that overlap with his or her own, so that existing views are reinforced and alternative ideas are not considered. This person, having his or her beliefs amplified, can become vocal in the group setting and then take part in reinforcing others’ beliefs as well. On social media platforms, what happens is that users log on already holding beliefs about which viewpoints are correct and which are incorrect. Then they get pushed by the platform’s algorithms into an echo chamber where they are exposed only to information that reinforces those beliefs. As a result, the beliefs become much more strongly held, and opposing viewpoints are never encountered in their real form; they are seen only through the biased lens of the echo chamber. Anyone who holds the opposing viewpoints can be freely ridiculed within the chamber, as no genuine rebuttal has the opportunity to be put forth. And this isn’t just some hypothetical scenario made up by people who are afraid of big tech companies; there is actual data showing that this is what happens on social media. One study that sought to determine whether echo chambers are created by social media algorithms found “support for the hypothesis that platforms implementing news feed algorithms like Facebook may elicit the emergence of echo chambers” (Cinelli et al.). However, there are some who think these effects are overblown. One frequently cited study compared the search results of a group of Republicans and a group of Democrats: “What this found essentially was that the results that people got when they searched for political topics were more or less the same” (Fletcher). Studies such as this one are often used to argue that the effects of social media on echo chambers and polarization are exaggerated. But what this argument misses is that such a study doesn’t actually examine the effects of the algorithm. If a user searches for a topic on social media, the results will not be biased by that user’s past engagement. Where the algorithm really takes effect is in a user’s feed, not in search results, and the vast majority of time spent on social media is spent scrolling through the feed, not searching for specific topics. The echo chambers are created in the feed, so a study of search results cannot capture them. And because nothing put forth within these echo chambers meets a rebuttal, more radical ideas get a more positive reaction from those inside the chamber than moderate ideas do, and those ideas are therefore boosted by the algorithms to a larger audience. This extra engagement gives content creators an incentive to post more radical ideas, creating a positive feedback loop wherein those inside the echo chamber are continuously shown more and more radical information and move further and further from moderation. So, to recap, social media algorithms result in users seeing more information that reaffirms their beliefs, spending more time discussing these beliefs with like-minded people, spending less time engaging with people who disagree, and consuming more radical perspectives.
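The feedback loop described above can be illustrated with a deliberately simple toy simulation. Every detail here is an assumption made for illustration—the 0-to-1 “position” scale, the engagement rule, the size of the nudge each exposure causes—and it does not model any real platform. It only shows how ranking content by within-chamber engagement can pull an audience steadily toward more extreme content.

    import random

    def simulate_echo_chamber(steps=50, seed=0):
        # Toy model: users in one chamber hold positions on a 0-to-1 scale
        # (0.5 = moderate, 1.0 = extreme). Each round, creators post at various
        # positions; the post that earns the most engagement inside the chamber
        # gets "boosted," and the users who see it drift slightly toward it.
        rng = random.Random(seed)
        users = [0.55 + 0.05 * rng.random() for _ in range(100)]  # start near moderate
        for _ in range(steps):
            avg = sum(users) / len(users)
            candidates = [min(1.0, avg + rng.uniform(-0.1, 0.2)) for _ in range(20)]
            def engagement(pos):
                # Posts slightly more extreme than the chamber's average do best;
                # posts that push back against the chamber get almost none.
                return -abs(pos - (avg + 0.05)) if pos >= avg else -1.0
            boosted = max(candidates, key=engagement)
            users = [u + 0.1 * (boosted - u) for u in users]  # exposure nudges each user
        return sum(users) / len(users)

    print(f"average position after the simulation: {simulate_echo_chamber():.2f}")

In this toy run the chamber’s average position climbs steadily from roughly 0.58 toward the extreme end of the scale, even though no individual ever chooses to become more radical; the drift comes entirely from which posts get surfaced.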

Knowing all this about the effects of social media algorithms gives us much more context for analyzing the current state of the world’s discourse. Perhaps, prior to the rise of the internet and social media, a dialogue about important topics such as politics or religion would have occurred cordially at dinner, where everyone is face-to-face and can see each other’s emotions and reactions to the discussion. Perhaps there would be a variety of people with different perspectives to balance one another out and help moderate a discussion that unfolded over the course of a few hours. However, these discussions can now take place in the comments section of a Twitter post, where statements and rebuttals are limited to 280 characters, the participants are hidden behind private, faceless profiles, and there are no consequences for what is said, whether it’s a radical idea or a personal insult. Most impactful of all, these discussions take place on the same platforms where the users have been radicalized into subscribing to a more extreme version of their beliefs. With many users having been dragged into echo chambers that diametrically oppose one another, the platforms are set up for interactions that showcase the result of an unstoppable force meeting an immovable object—complete chaos. Polls and studies quantify what most social media observers know to be true. The discussions that people previously found intellectually stimulating are no longer so: 59% of people find political discussions on social media stressful and frustrating, and 64% of more politically engaged users find that these discussions involve less respect (Duggan and Smith). The data reveals things more worrying still: “approximately one in four Americans reported that they had been harassed online, and three out of four said that they had witnessed others receiving such treatment,” and “the leading reason people gave… was their political views” (Bail). Any number of studies, polls, and analyses could be cited at this point to continue showing how people feel about their political interactions on social media, but anyone who has logged onto one of these platforms has seen it first-hand. The interactions among those who belong to opposing echo chambers are far from cordial. Insults are slung more often than genuine propositions are offered, facts are ignored, studies—on the off chance that they are even cited—are brushed aside, and everyone competes for the highest number of “likes” or “reposts” to validate their opinion. These are the effects of social media platforms and their algorithms. What began with the intention of increasing the time and interaction that users spend on these platforms has slowly evolved and iterated into something entirely different from its origin—something likely unrecognizable to the founders of these platforms back when they created them.

So, the algorithms of social media platforms have brought the world to a place far removed from civil discourse, when ideas could be debated in an open forum, people could disagree calmly and rationally, and society could move forward together. How can society get back to that open forum? Julia Galef offers some good advice in her book The Scout Mindset: Why Some People See Things Clearly and Others Don’t. The most useful piece of it is that people should learn to change their minds frequently, but in small increments. “If you see the world in binary black-and-white terms, then what happens when you encounter evidence against one of your beliefs? The stakes are high: you have to find a way to dismiss the evidence, because if you can’t, your entire belief is in jeopardy” (Galef). Instead, people should be more accepting of being slightly wrong and view it as an opportunity to increase the accuracy of their knowledge. If this were the case, the social media algorithms would have less ability to create echo chambers: someone would introduce new evidence to the chamber, and rather than being tossed aside, the evidence would be examined and, if necessary, users’ beliefs would be updated accordingly. Getting people to adopt this mindset, however, is a challenge of its own, and will likely require a much more in-depth discussion.

Resources

Bail, Christopher A. Breaking the Social Media Prism: How to Make Our Platforms Less Polarizing. Princeton University Press, 2021.

Carr, Nicholas G. The Shallows: What the Internet Is Doing to Our Brains. W.W. Norton, 2010.

Cinelli, Matteo, et al. “Echo Chambers on Social Media: A Comparative Analysis.” arXiv, April 2020, https://arxiv.org/pdf/2004.09603.pdf.

Duggan, Maeve, and Aaron Smith. “The Political Environment on Social Media.” Pew Research Center, 25 October 2016.

Fletcher, Richard. “The truth behind filter bubbles: Bursting some myths.” Reuters Institute, 24 January 2020, https://reutersinstitute.politics.ox.ac.uk/news/truth-behind-filter-bubbles-bursting-some-myths. 

Galef, Julia. The Scout Mindset: Why Some People See Things Clearly and Others Don’t. Little, Brown Book Group Limited, 2021.

McFarlane, Greg. “How Facebook (Meta), Twitter, Social Media Make Money From You.” Investopedia, 4 November 2021.

 

 
