What if, contrary to popular belief, our experiences in life actually narrowed our perspective rather than broadening it?
And what if, in this digital era, the range of information available to us were shrinking rather than expanding – thereby aggravating the disturbing trends toward political, social and economic polarization worldwide?
Indeed, today we may well be witnessing a new wave in history characterized by snowballing momentum toward greater, deeper divisions. Not only can the lack of consensus lead to US government shutdowns; conversations around issues such as gun control and immigration also reach bitter, protracted stalemates.
The media, with its notorious penchant for leaning left or right and its license to interpret events, has often been cited as a contributor to this polarization. Naturally, audiences gravitate toward news outlets presenting views closest to their own. It’s a fact of life, and there is little if anything we can do to control it.
Less well known is the extent to which this plays out on social media. It’s much harder to make categorical choices about the content we’ll stumble upon on Twitter, YouTube or Facebook. Yet whether we’re aware of it or not, social media is exacerbating the chasms pitting left against right, pro-choice against pro-life, and NRA supporters against those who campaign for more gun control.
In a world where millions upon millions interact, share views and learn about the world through these platforms, we assume that the variety of content we receive is virtually unlimited. On the contrary: the way we use social networks, and the way these networks in turn present us with content we’re likely to consume, produces what author Eli Pariser calls “filter bubbles.”
The concept applies, to varying degrees, to all online social networks. These networks quickly learn our preferences when we engage with our friends and connections, who are likely to share our views. When we “like” a post, the network we’re using will make sure that we receive similar content in the future, essentially creating an information bubble around us. As the Wall Street Journal has noted, scholars worry that this can create “echo chambers” in which users see posts only from like-minded friends and media sources.
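To see how quickly this narrowing can happen, here is a deliberately simplified sketch in Python of a like-driven ranker. It is not any platform’s actual algorithm; the Post class, its stance field and the score_feed function are illustrative assumptions of ours.

```python
from collections import Counter

class Post:
    """A toy post with a topic and a stance (e.g. 'leave' vs. 'remain')."""
    def __init__(self, post_id, topic, stance):
        self.post_id = post_id
        self.topic = topic
        self.stance = stance

def score_feed(candidate_posts, liked_posts):
    """Rank candidates by how closely they match what the user has already liked."""
    liked_stances = Counter(p.stance for p in liked_posts)
    total_likes = sum(liked_stances.values()) or 1  # avoid dividing by zero

    def similarity(post):
        # Share of past likes with the same stance: the more one side was liked,
        # the higher that side ranks, and the bubble tightens with every click.
        return liked_stances[post.stance] / total_likes

    return sorted(candidate_posts, key=similarity, reverse=True)

# Example: a user who has liked only "leave" posts will see "leave" posts first,
# and anything they then like reinforces the same ordering.
```

Real ranking systems weigh many more signals, but even this crude loop shows the feedback at work: every click narrows the next round of recommendations.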
To be affected by social media filters and bubbles, users don’t even have to engage with the content. Matt Honan, a writer at Wired, discovered that by liking everything he came across on Facebook for two days, not only did he see his feed change dramatically, but his behavior ended up reshaping his connections’ feeds as well, even though they themselves had done nothing. Thank you, intertwined algorithms.
YouTube is no different. We recently decided to watch a couple of videos promoting Brexit. After a short time, we observed that there were no more Remain-themed videos among the recommendations listed on the page where we’d landed. The more videos we watched from the recommended list, the more extreme the filtering became: even the home page was soon filled with Brexit-leaning content. Content about Remain was reduced to practically zero. Crucially, we did not even have to “like” anything to get to this point.
Unfortunately, the problem goes even deeper. When we do encounter views different from our own in our social media feeds, they tend to be extreme and unrepresentative. For example, the Remain-leaning content shared extensively by Brexit supporters (or vice versa) during the campaign tended to be hard-core and excessive, with over-the-top messages expressed in exaggerated ways. This further alienates people who don’t share the content’s views, deepening the divides. Then, in a vicious cycle, social networks’ algorithms detect the resulting polarization and feed into it, leaving little chance for extreme content to be challenged. Positions harden, ideologies calcify.
A Problem of Learning from Experience
This is dangerous, mainly because of the process by which our brains take in and interpret experience. We are hardwired to learn from experience, seamlessly and automatically. Our senses collect the information offered by the environment around us and feed our intuition, which then fuels our judgments, decisions and behavior. Unfortunately, we typically don’t take the time to identify possible biases in what we observe or to think critically about our experience. Nobel laureate Daniel Kahneman calls this syndrome WYSIATI (What You See Is All There Is). Hence, when we are exposed to censored or filtered information, our perception and intuition are profoundly affected. We are led astray by our own experience.
If our intuition is shaped by constant exposure to views that are either in line with our own or extremely out of sync with them, and if our opportunity to receive more moderate, credible content that balances out our personal predilections keeps shrinking, how can we hope to reverse direction and move closer to mutual understanding or even consensus? Especially given that most of us don’t actively seek to learn about or understand views that oppose our own?
The Quest for Solutions
If online social networks are, unexpectedly, one of the drivers of polarization, might they also have the potential to offer us solutions? While the problem of filter bubbles and the resulting polarization have been widely discussed since the early 2000s, to date only a few solutions have been explored.
One attempt comes from Facebook, which recently announced that it will build algorithms to reduce clickbait: news stories designed primarily to attract attention. From the point of view of experience, this move could reduce the effect of extreme, unrepresentative and controversial content on users’ intuitions.
In another attempt, computer scientists Eduardo Graells-Garrido, Mounia Lalmas and Daniel Quercia have envisioned a reconciliatory recommendation system that helps people identify various common interests they share with those who hold opposite views about sensitive and polarized topics.
A recent study by information scientists Sean Munson, Stephanie Lee and Paul Resnick analyzed the effectiveness of a browser extension they named “Balancer,” which provides users with feedback about the political lean of their past reading behavior. They report that their tool has indeed encouraged some users to widen their perspective by reading more about opposing views.
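As a rough illustration of that kind of feedback (not Balancer’s actual implementation, whose details we have not reproduced), the sketch below averages hypothetical per-outlet lean scores, running from -1.0 for left-leaning to +1.0 for right-leaning, over a user’s reading history.

```python
# Hypothetical per-outlet lean scores; a real tool would draw on curated ratings.
SOURCE_LEAN = {
    "outlet_a": -0.6,   # leans left
    "outlet_b": 0.7,    # leans right
    "outlet_c": 0.1,    # roughly centrist
}

def reading_lean(visited_sources):
    """Average lean of the outlets a user has read, ignoring unknown ones."""
    scores = [SOURCE_LEAN[s] for s in visited_sources if s in SOURCE_LEAN]
    return sum(scores) / len(scores) if scores else 0.0

# reading_lean(["outlet_a", "outlet_a", "outlet_c"]) -> about -0.37,
# i.e. a reading history that tilts noticeably to the left.
```

Surfacing a single number like this gives readers a nudge to balance their diet of sources, which is the kind of shift Munson, Lee and Resnick report seeing in some users.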
Finally, there’s the online news aggregation tool NewsCube, which allows readers to compare multiple sides of any given issue through a single, easy-to-consume interface.
Encouraging as these efforts are, if you are hearing about these proposed solutions for the first time here, it means that we need more of them.
The Yin Yang Button: An Algorithm For Common Ground
We propose another idea, one that aims to unbias experience. Consider a new social media button, similar to the “like” button, designed as a yin yang symbol. The ancient Chinese philosophy of yin and yang is renowned for describing how opposite or contrary forces are actually complementary, interconnected and interdependent. Likewise, the yin yang button could provide balance and completion to one’s experience. Clicking it on any Facebook, YouTube or Twitter post would signal that you find the post’s content provides a relatively balanced representation of multiple views and seeks primarily to find common ground among them. For instance, Vox’s recent article, which argues that one can simultaneously oppose violence both from and toward the police, would presumably receive a high number of “yin yang” clicks.
Then social media’s algorithms would go to work to ensure that content receiving a high number of “yin yang” clicks would be recommended to everybody following related news stories, regardless of what they saw or “liked” previously on the issue. Of course, there may be users whose own extreme views skew their interpretation of what makes for “balanced” content or what constitutes “common ground,” so a certain critical mass of “yin yang” clicks would be needed before this process kicks in. It might also be helpful if users could sort their searches by content’s yin yang ratings.
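To make the mechanics concrete, here is a minimal Python sketch of how such a feed might be assembled. Everything in it is an assumption of ours rather than a detail of any platform: the Post fields, the personalized_score stand-in for a network’s existing like-based ranking, and the illustrative YIN_YANG_THRESHOLD critical mass.

```python
from dataclasses import dataclass

YIN_YANG_THRESHOLD = 500  # illustrative critical mass; the article names no figure

@dataclass
class Post:
    post_id: int
    topic: str
    yin_yang_clicks: int        # how many users flagged the post as balanced
    personalized_score: float   # stand-in for the platform's existing like-based ranking

def build_feed(candidate_posts, topic, feed_size=20):
    """Rank a topic's posts, pinning community-flagged 'balanced' content ahead of
    the personalized results so that every follower of the topic sees it."""
    on_topic = [p for p in candidate_posts if p.topic == topic]

    # Posts that enough users have marked as balanced, common-ground content.
    balanced = sorted(
        (p for p in on_topic if p.yin_yang_clicks >= YIN_YANG_THRESHOLD),
        key=lambda p: p.yin_yang_clicks,
        reverse=True,
    )

    # Everything else still flows through the usual personalized ranking.
    rest = sorted(
        (p for p in on_topic if p.yin_yang_clicks < YIN_YANG_THRESHOLD),
        key=lambda p: p.personalized_score,
        reverse=True,
    )

    return (balanced + rest)[:feed_size]
```

Pinning threshold-clearing posts ahead of the personalized results is one simple way to ensure that everyone following a topic encounters the community-flagged common-ground content, whatever they liked before; a production system would presumably blend the two signals more gradually.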
The result would be to burst the social media filter bubble by rounding out our experience of news and current issues, introducing more variety into the information flow that feeds our intuition, judgment, decisions and behavior. We’d thus become better equipped to compromise and seek common ground.
By mitigating the influx of extreme content we receive and allowing us to identify and discard it, this mechanism would also help avoid misperceptions about views and ideologies we don’t share.
Philosopher Marshall McLuhan is often credited with the observation that “we shape our tools and thereafter our tools shape us.” Hence over time, perhaps, the yin yang button could itself become a driver of more empathetic, common-ground-seeking content as users, motivated to collect yin yang clicks in the same way so many of us are motivated to collect “likes” or “thumbs up,” seek to create posts more likely to receive them.
Perhaps, too, users who consistently produce yin yang-friendly content might emerge as a new generation of thought leaders, akin to LinkedIn Influencers or Quora’s Most Viewed Writers, whose voices in the conversation might ultimately help bridge the profoundly troubling divides we’re facing.
A Yin Yang Social Media Button Could Help Stem the Tide of Polarization (October 7, 2016)