The way social media divides us…

Photo by Pixabay

More than ever, we connect through social media; our information comes through social media; our purchases are made because of recommendations on social media. Increasingly, research is showing us how social media is shaping the very nature of our societies.

If our hope is for a move towards kinder and more egalitarian ways of organising ourselves, we have to grapple with this reality and try to understand how the internet influences our collective and individual consciousnesses.

One case study, very salient at the moment, is that of Ashli Babbitt, the woman shot and killed during the recent invasion of the US Capitol building by protestors radicalised in part by the ravings of outgoing President Trump. How did this woman journey from being an Obama voter in 2012 to the point where she was prepared to break down congressional doors to express her rage at perceived corruption? How did a seemingly sane and responsible person become convinced that the reality that made the most sense was the one peddled by right wing extremists and, perhaps most puzzlingly of all, by adherents of the bizarre QAnon conspiracy theories?

We all know people like Ms Babbitt: friends and members of our families whose views have become increasingly extreme. From the outside it often seems as though they have been indoctrinated into a cult, right there in their own living rooms. No reasonable argument can reach them. No contrary evidence can challenge the vehemence of their belief systems. The impossibility of challenge seems to be exacerbated online, as if any contrary view were a kind of threat to their very being.

Photo by Lina Kivaka

The way that this works may be a mystery to us, but we now know with a high degree of certainty that our vulnerability to manipulation on social media has already been exploited to undermine the very fabric of how we understand democracy. The barely-noticed UK Parliamentary report into Russian interference by the Intelligence and Security Committee, for example, made the startling statement that Russian interference in UK elections is the ‘new normal’, including attempts to influence the Scottish independence debate. The report was not able to comment on the degree to which Russia interfered with the Brexit referendum (as widely reported elsewhere) because our intelligence services had not even sought to investigate.

Then there was the shady work perpetrated by Cambridge Analytica. We now know exactly how a vast data mining operation (of our social media data, that is) was weaponised by both the Trump campaign and the Brexiteers. The method was both simple and complex: algorithms identified individuals whose leanings could be influenced, and their social media feeds were fed a diet of material that pushed their views in one particular direction. Remember that most elections in our democracies are decided by small percentage shifts of opinion, filtered through the narrow bottleneck of a yes/no or choice-between-three ballot. Mass participation in social media platforms is an incredibly powerful tool for achieving mass influence, from which none of us who participate are immune.
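The mechanism is easiest to see in miniature. The sketch below is a deliberately crude illustration of the targeting logic: score people by leaning, select the persuadable middle, apply repeated small nudges. All names, numbers and thresholds are invented for the example; nothing here is drawn from how Cambridge Analytica actually operated.

```python
# Toy illustration of micro-targeting. Every value here (names,
# leanings, thresholds, the 0.05 "nudge") is an invented assumption.

def pick_persuadables(users, lo=0.4, hi=0.6):
    """Select users whose leaning sits near the middle, where a small
    push can change a vote; the already-committed are skipped."""
    return [u for u in users if lo <= u["leaning"] <= hi]

def nudge(user, direction=1, strength=0.05):
    """One exposure to slanted content shifts the leaning slightly."""
    user["leaning"] = min(1.0, max(0.0, user["leaning"] + direction * strength))

users = [
    {"name": "A", "leaning": 0.15},  # committed, left alone
    {"name": "B", "leaning": 0.48},  # persuadable, targeted
    {"name": "C", "leaning": 0.55},  # persuadable, targeted
    {"name": "D", "leaning": 0.90},  # committed, left alone
]

targets = pick_persuadables(users)
for u in targets:
    for _ in range(5):  # five exposures to one-sided material
        nudge(u)

print([u["name"] for u in targets])   # ['B', 'C']
print(round(users[1]["leaning"], 2))  # 0.73
```

The point of the toy is the asymmetry: the committed are ignored, and a handful of small, cumulative nudges to the middle is enough to move the narrow margin that decides a yes/no ballot.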

The secret, it seems, is to identify our prejudices, then fan them into flames.

Don’t take my word for it: check out this interview with one of those who did the job.

This should worry us all.

We have to look for ways to protect our democratic systems from such power, concentrated as it is in the hands of tech companies such as Facebook and Twitter. There is a strong argument for breaking up these companies, which are, after all, now operating with the sort of impunity only possible for monopolies. We have to regulate their access to our personal data and the way it is used for both corporate profit and political influence.

But we also have to acknowledge our own culpability in this, not just because we are the addicted, who are unlikely to let go of the social media drug easily, but also because, as mentioned already, all of us are vulnerable. It is easy to point a finger at some distant Facebook friend not seen in person since school days, or at Ms Babbitt, but the more interesting question is the degree to which our own views have been shaped by our engagement with social media. The best people I know are those who take hold of their beliefs with passion and hold on to them with integrity. They believe strongly in the rightness of their causes and the correctness of their politics. The value judgements they made to arrive at these positions are precious to them, foundational to their sense of identity. I suppose I am describing myself, or at least the version of myself that I choose to believe in.

The problem is that we ALL believe this about ourselves, to a greater or lesser degree. The trouble starts when we come up against others, particularly on social media, who hold different views. It can get ugly very quickly. We know that engagement in this kind of battle is futile, and that changing the minds of our opponents is vanishingly unlikely, but we do it anyway. It feeds the worst instincts of some, who seem to derive great pleasure from inflicting pain on people they will never meet.

Surely it should bother us that, over the last couple of decades, more and more people have been moving from the middle ground towards extreme views on a wide range of subjects? That is fine if the ‘extreme views’ are ones we share, but the reality is that wider society is becoming more polarised and divided. The things we agree on are increasingly defined against the things we hate.

Perhaps, then, there is something about the way social media works that, rather than serving its stated aim of connecting people and facilitating discourse, actually has the very opposite effect? It is perhaps no surprise that a growing body of research is finding exactly that.

This, from The Conversation:

When people express themselves through social media, they communicate collectively. Rachel Ashman, Tony Patterson and I studied sharing of images of food in an intensive three-year ethnographic and netnographic study of a variety of online and physical sites. We collected and analyzed thousands of pictures, conducted 17 personal interviews and set up a dedicated research webpage where dozens of people shared their “food porn” stories.

Our results indicate that people share images of food for a number of reasons, including the desire to nurture others with photos of home-cooked food, to express belonging to certain interest groups like vegans or paleos, or to compete about, for example, who could make the most decadent dessert. But this sharing can become competitive, pushing participants to one-up each other, sharing images of food that look less and less like what regular people eat every day.

Here is how it works. Many people start by sharing food images only with people they know well. But once they broaden out to a wider group on social media, several unexpected and startling things begin to happen. First, they find sites where they can feel comfortable expressing their opinions to a like-minded “audience.”

This audience creates a community-type feeling, expressing respect and belonging for certain kinds of messages and outrage or contempt for others. Communications innovators in social media communities often also create new language forms, such as the frustrated guys in men’s-rights-oriented social media forums on Reddit bringing new life to the 19th-century word “hypergamy,” or young people creating sophisticated emoji codes in their relationship texting.

Through language and example, community members educate one another. They reinforce each other’s thinking and communication. Members of social media communities direct raw emotions into particular interests. For example, a general fear about job security might become channeled through the feedback loops on Facebook into an interest in immigrant jobs and immigration policy.

Those feedback loops have even more sensational effects. People use social media to communicate their need for things like money, attention, security and prestige. But once those people become a part of a social media platform, our research reveals how they start to look for wider audiences. Those audiences show their interest and approval by liking, sharing and commenting. And those mechanisms drive future social media behavior.

A monstrous example of ‘food porn.’ Priyan Shailesh Parab

In our study of food image sharing, we wondered why the most popular food porn images depicted massive hamburgers that were impossible to eat, dripping with bacon grease, gummy worms and sparklers. Or super pizza that contained tacos, macaroni and cheese and fried chicken. The answer was that the algorithms that drive participation and attention-getting in social media, the addictive “gamification” aspects such as likes and shares, invariably favored the odd and unusual. When someone wanted to broaden out beyond his or her immediate social networks, one of the most effective ways to achieve mass appeal turned out to be by turning to the extreme.

Taking an existing norm in the community (massive burgers, say) and expanding upon it almost guaranteed a poster a few hundred likes, a dozen supportive comments and 15 minutes of social media glory. As each user tried to top the outrageous image of the user coming before, the extremes of food porn ratcheted toward ever more sensational towering burgers and cakes. Desire for what was once the extremes began to seem normal. And the ends separated farther from the few who remained in the middle.
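The ratchet the researchers describe can be sketched as a toy simulation: the most extreme post in each round wins the likes, imitators try to one-up it, and the community norm drifts steadily upward. Every number and the update rule below are invented assumptions for illustration, not taken from the study.

```python
import random

def simulate_ratchet(posters=50, rounds=20, seed=1):
    """Toy model of the engagement ratchet: each round, the most
    extreme post wins the likes, and everyone drifts part of the
    way towards one-upping it."""
    rng = random.Random(seed)
    # Everyone starts mild: extremity between 0.0 and 0.2.
    extremity = [rng.uniform(0.0, 0.2) for _ in range(posters)]
    norms = []
    for _ in range(rounds):
        norms.append(sum(extremity) / posters)  # current community norm
        winner = max(extremity)                 # most extreme post wins
        # Imitators move 30% of the way towards topping the winner by 10%.
        extremity = [min(1.0, e + 0.3 * (1.1 * winner - e)) for e in extremity]
    return norms

norms = simulate_ratchet()
print(f"community norm drifted from {norms[0]:.2f} to {norms[-1]:.2f}")
```

Because the reward always sits just beyond the current winner, the norm rises every single round; yesterday’s extreme becomes today’s baseline, exactly the "Desire for what was once the extremes began to seem normal" effect described above.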

What better way to connect with your potential supporters than to say something outrageous, no matter how untrue, then double down on the same lie? How many retweets is THAT going to get you?

The point here is clear. Social media rewards us for moving towards extremes, and then these extremes become normalised. Then we get Brexit, and Trump, and extreme politics, extreme religion.

We get greater division too, because social media will seize upon any division and magnify it. Our political opponents – take the independence debate in Scotland, already mentioned as a target of Russian interference – are never just people with different views; they are idiots, losers, imperialists, English numpties and clowns. I am no more immune to this than the next person. Whether you are is a question only you can answer.

That is not to say that ideas – even seemingly extreme ideas – are not important. Perhaps we need good ideas more than ever. The issue at hand is two-fold: how do we form our ideas, and how do we seek to employ them?

MLK was certainly viewed as an extremist, but the application of his extreme ideas was aimed at reconciliation and healing. He wanted to change the system towards peace and in favour of the poor and oppressed. This has been called communism. I call it justice, and there is no algorithm for that other than the one we had already.

5 thoughts on “The way social media divides us…”

  1. Have you seen the film The Social Dilemma? Made about, and by, former social media pioneers who explain what is really going on, why they left, and why they would never let their children use any of it for as long as they can keep them away!

  2. Thanks Chris – a useful & thoughtful post. An image which keeps coming to my mind when thinking about this phenomenon is the famous wobbly bridge effect – when people have an innate propensity to move in unison, even when it makes something quite uncomfortable or even dangerous happen, completely blind to their involvement and their ability to change what’s happening. I think perhaps a similar effect happens in the “collective conscious” too.
    I suspect the feedback algorithms in social media (i.e. which posts you see first) are very much causing this effect (the argument of The Social Dilemma mentioned previously), and perhaps, in the same way engineers can modify the structure of bridges, software engineers also have the capacity to contain & modify this feedback effect. However, the problem is that it is not directly in their interests to do so, as they would effectively cause customer disengagement in their advertising platform by reducing this addictive feedback.

    • Thanks Paul. I like your bridge analogy, which in turn reminds me of an old social psychology experiment, in which it was demonstrated that groups tend to take riskier decisions than individuals, a phenomenon known as the ‘risky shift’. You may be right that there is a technical/programmable way to mediate this problem, based around a change in the algorithm, but this may well be a constantly moving target, as vast amounts of money can be made subverting this process. I think the only answer has to be a solution that is human scale, that everyone can understand, and this has to be about regulation and the breaking up of monopolies.
