In the post-truth world, where Donald Trump is still one of the most powerful men on the planet, a current debate on people’s minds is about the true influence of social media in our decision-making.
Before attempting to answer this question, it is important to clarify how content delivery mechanisms work on social media platforms. On Facebook, for example, an average of more than 300 posts are displayed daily on each user’s timeline. However, these posts do not represent the total content produced by friends and liked pages in a day.
This displayed set of posts is selected by an algorithm: software that builds a profile of each user and, from that profile, decides which content will be shown and which will not. These profiles take into account factors such as friendships, likes, time spent scrolling over content, comments, and user interests, among others.
As preferences and interactions change, the algorithm learns and adapts, always displaying the content it considers most appropriate for the user’s profile.
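The ranking logic described above can be illustrated with a minimal sketch. This is a toy model, not Facebook’s actual system: the profile weights, the friendship bonus, and the scoring formula are all illustrative assumptions, standing in for the far more complex signals (likes, comments, dwell time) the article mentions.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    text: str

# Hypothetical interest weights inferred from a user's past behaviour.
# Names and numbers are illustrative only.
user_profile = {"politics": 0.9, "sports": 0.2, "cooking": 0.5}
close_friends = {"alice", "bob"}

def score(post: Post) -> float:
    """Toy relevance score: topic affinity plus a flat friendship bonus."""
    affinity = user_profile.get(post.topic, 0.1)
    friend_bonus = 0.5 if post.author in close_friends else 0.0
    return affinity + friend_bonus

def build_timeline(candidates: list[Post], limit: int = 300) -> list[Post]:
    """Rank all candidate posts and keep only the top `limit` for display."""
    return sorted(candidates, key=score, reverse=True)[:limit]

posts = [
    Post("alice", "politics", "Election update"),
    Post("carol", "sports", "Match results"),
    Post("bob", "cooking", "New recipe"),
]
timeline = build_timeline(posts, limit=2)
print([p.text for p in timeline])  # → ['Election update', 'New recipe']
```

The key point is the `limit` step: whatever does not score highly simply never reaches the timeline, which is why the roughly 300 posts a user sees daily are only a fraction of everything their network produced.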
This means we tend to end up inside a filter bubble—commonly called an echo chamber—since the algorithm tends to send us content aligned with our existing preferences, interests, and ideas. Eli Pariser, author of The Filter Bubble, describes this phenomenon as ‘filter bubbles’, in which algorithms personalize not only our social network feeds but also our search engine results.
Facebook has more than 1.8 billion active users worldwide, according to Statista. The Pew Research Center’s Social Media Update report shows that six in ten Americans get news from social media, with 66% of the survey sample using Facebook as their main platform.
Across the 26 countries surveyed for the Reuters Institute’s Digital News Report, 51% of respondents reported accessing news through online social networks, with Facebook the preferred service (44%). In Brazil, more than 72% of respondents use social networks as a news platform.
According to a study published by Facebook researchers in the scientific journal Science, users who share their ideological affiliation on Facebook have, on average, only 23% of friends with opposing affiliations. As for what appears on the timeline, only 28% of news items, on average, contrast with the user’s beliefs. Moreover, people tend to click on news that reflects ideologies similar to their own: only 25% of the items users click on present a different ideology.
Although Facebook has stated more than once that it is not a media company, user behaviour contradicts this, as the surveys cited above show: people are turning to social networks for informative content, journalistic or otherwise.
Professor Benjamin Bratton of UC San Diego argues that such platforms should be seen not just as media but as regulatory systems. In this view, filter bubbles are just one point in the great whirlwind of behavioural changes being driven by online social networks.
Vicente Serrano Marín, author of Fraudebook, describes social networking platforms as biopolitical devices, adapting Foucault’s notion of the term to the present day. He states that, on social networks, life must be understood in terms of emotional life, and power as the diversity of ideological discourses.
Marín explains that, for Foucault, life is projected through the way individuals represent themselves in biographical discourse. In other words, state-organized control systems are insufficient for biopolitical control so long as they do not encompass the devices with which people live and represent their own lives.
Here we can see the passage from a disciplinary society to a society of control, in which the power exercised by devices directly organizes brains and bodies toward a state of autonomous, self-vigilant alienation. Biopower is a form of power that manifests itself within social life, regulating, interpreting, and reformulating it.
In practice, biopolitical devices exert their control by producing and regulating habits and productive practices, pushing users to produce and share every level of their lives through normative discourses. Social relationships come to be mediated by these devices, which fold every element of social life into the platform.
Beyond the content filtered by the algorithm, the intentional manipulation of circulated content also shapes biopower. As if human interactions within the platform were not enough, a study published in First Monday examined the use of social media bots. According to the research, bots can effectively influence political discussion on social media, pushing it toward negative and polarizing positions rather than broadening the debate.
The study analysed 20 million tweets posted between September 16 and October 21, 2016, and found that around 400,000 bots were engaged in online conversations, accounting for 3.8 million tweets—about one-fifth of the total.
According to the researchers, the presence of bots in political discussion can raise three main issues: (a) influence can be redistributed to suspicious accounts that may operate with concealed purposes; (b) the conversation becomes more polarized; and (c) the spread of false and unverified information may increase.
Another study, published by University of Oxford researchers, concludes that in the U.S. elections a “pro-Trump robot army” overcame Clinton’s digital onslaught. According to the researchers, the main tactic was to confuse people and derail discussions, chiefly through fake news but also through manufactured memes and images. The study claims that automated pro-Trump posts outnumbered pro-Hillary ones fivefold. Of 19 million tweets evaluated in the last week of the campaign, 55% were pro-Trump and 19% pro-Hillary. The same Oxford researchers found that political bots also played a major role in the UK’s Brexit referendum. This type of activity is known as “computational propaganda”.
In other words, today’s biopolitical systems act in different and more complex ways. Given recent events, there is a clear need both for further study of these systems and for regulation by the companies that control these platforms. Only then will a more transparent and ethical use of the new digital democracy be possible.
By Stefanie C. da Silveira and João Gabriel D. Morisso
Language Quality Assurance Reviewer
Albina Retyunskikh