Have you had the experience of looking at a product online and then seeing ads for it all over your social media feed? Far from being a coincidence, these oddly accurate ads offer a glimpse into the hidden mechanisms fed by everything you search for on Google, “like” on social media, or encounter while browsing.
These mechanisms are increasingly being used for more nefarious purposes than aggressive advertising. The threat lies in how this targeted advertising interacts with today’s extremely divided political landscape. As a social media researcher, I see how people who seek to radicalize others use targeted advertising to easily lead people to extreme views.
Advertising is clearly powerful. The right advertising campaign can help shape or create demand for a new product or rehabilitate the image of an older product or even an entire company or brand. Political campaigns use similar strategies to push candidates and ideas, and historically countries have used them to wage propaganda wars.
Mass media advertising is powerful, but mass media has a built-in moderating force. When trying to move many people in one direction, mass media can only move them as fast as the broad middle will tolerate. If the message moves too far or too fast, people in the middle are alienated.
The detailed profiles that social media companies build for each of their users make advertising even more powerful by allowing advertisers to tailor their messages to individuals. These profiles often include the size and value of your home, the year you bought your car, whether you are expecting a child, and whether you buy a lot of beer.
Therefore, social media has a greater ability to expose people to ideas as quickly as they will individually accept them. The same mechanisms that can recommend a niche consumer product to the right person, or suggest an addictive substance just when someone is most vulnerable, can also suggest an extreme conspiracy theory just when someone is most ready to consider it.
It is increasingly common for friends and family to find themselves on opposite sides in highly polarized debates on important issues. Many people recognize that social media is part of the problem, but how do these powerful personalized advertising techniques contribute to the divisive political landscape?
A significant part of the answer is that people associated with foreign governments, without disclosing who they are, take extreme positions in social media posts with the deliberate aim of stoking division and conflict. These extreme posts take advantage of social media algorithms, which are designed to maximize engagement, meaning they reward content that elicits a response.
Another important part of the answer is that people who seek to radicalize others lay breadcrumb trails leading to increasingly extreme positions.
These social media radicalization pipelines work in much the same way whether recruiting jihadists or January 6 insurgents.
You may feel like you’re “doing your own research,” moving from source to source, but you’re actually following a deliberate radicalization pipeline designed to push you toward increasingly extreme content at whatever pace you will tolerate. For example, after analyzing over 72 million user comments on more than 330,000 videos posted on 349 YouTube channels, researchers found that users steadily migrated from milder content to more extreme content.
The result of these radicalization pipelines is plain to see. Rather than most people holding moderate views with fewer people at the extremes, fewer and fewer people fall in the middle at all.
To protect yourself
What can you do? First, I recommend a large dose of skepticism about social media recommendations. Most people have gone on social media looking for something in particular, then found themselves looking up from their phone an hour or more later with little sense of how or why they ended up reading or watching what they just did. It is designed to be addictive.
I try to carve a more deliberate path to the information I want and to actively avoid just clicking on whatever is recommended to me. If I do read or watch what is suggested, I ask myself, “How might this information be in someone else’s best interest, not mine?”
Second, consider supporting efforts to require social media platforms to offer users a choice of algorithms for recommendations and stream curation, including those based on simple-to-explain rules.
Third, and most importantly, I recommend investing more time in interacting with friends and family outside of social media. If I feel the need to pass along a link to make a point, I treat that as a red flag that I don’t understand the issue well enough myself. If so, perhaps I have found myself following a trail built toward extreme content rather than consuming material that actually helps me understand the world better.
Jeanna Matthews is a professor of computer science at Clarkson University.
This article first appeared on The Conversation.