The Anti-Algorithm: The KIND Foundation Wants to Pop Your Facebook Filter Bubble
by Elle Lanning

Let’s do a quick experiment…

Head over to Facebook and scroll through your News Feed. What do you notice? You may see posts that you agree with, from friends sharing their stances on the issue of the day. You may also see articles that validate your beliefs. What’s missing? Perhaps views that challenge your own, or content that informs rather than affirms.

Here at The KIND Foundation, we started exploring this phenomenon a few months ago, as the conversation around “echo chambers” picked up. In partnership with Morning Consult, we surveyed 2,400 Americans to better understand why people are not seeing differing points of view online. What we found reinforced the severity of our echo chambers—only 5% of us see social media posts that differ greatly from our world view. While we’ve always assumed social media has the power to connect us, it’s clear that it can also insulate us.

This Phenomenon Isn’t New

Six years ago, Eli Pariser published “The Filter Bubble,” illuminating how personalization was insidiously overtaking the Internet experience. The crux of his argument was that tech companies rely on algorithms to serve you content you’ll literally “like,” or generally enjoy and agree with. He warned that these algorithms could have negative consequences over time. He was, of course, on to something. In the aftermath of the U.S. election, the term “filter bubble” surged in both analysis and search, as the issue was brought into clearer view by a constant deluge of like-minded viewpoints in our feeds.

The mood nationwide demonstrated another troubling trend: rising polarization. This is something Robb Willer, a Stanford professor, took on in a recent TEDx Talk. Willer suggests that it’s human nature to reject opposing points of view and surround ourselves with people we agree with. He asserts that one way we can chip away at polarization is by hearing out “the other side” and approaching them with empathy and respect.

Designing the Anti-algorithm

When we established The KIND Foundation, we set our sights on fostering kinder and more empathetic communities. We’ve largely carried out this mission offline, in communities across the country. But knowing that digital connections are here to stay and that people are spending more time online, we dipped our toe in the social media waters and unveiled a digital tool that helps Americans integrate diverse perspectives into their Facebook feeds. It’s called “Pop Your Bubble.”

In developing this tool, we felt it was important to flip the notion of a traditional algorithm on its head since that’s partly how we got into these bubbles in the first place. While most of the algorithms we encounter match us with things they think we’ll enjoy, the Pop Your Bubble algorithm does the opposite.

After reviewing a user’s Facebook profile and activity in detail, the tool suggests people whose profile and activity are least like theirs. Specifically, it matches people based on geographical and generational differences, as well as differences in opinion (based on past likes and shares). Users are then encouraged to follow (not friend) at least 10 new people, whose content will gradually start to populate their feed. Those who want to go a step further can add their profile and allow future users to follow them.
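The matching logic described above can be sketched in a few lines of Python. This is purely illustrative: the actual Pop Your Bubble implementation is not public, and the `Profile` fields, the weighting, and the `dissimilarity` and `suggest` functions here are assumptions chosen to mirror the three differences the article names (geography, generation, and opinion based on past likes and shares).

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    # Hypothetical, simplified stand-in for the profile data the tool
    # reviews; the real feature set and weights are not public.
    name: str
    region: str                 # geographic bucket
    birth_decade: int           # generational bucket, e.g. 1990
    liked_topics: set = field(default_factory=set)  # proxy for likes/shares

def dissimilarity(a: Profile, b: Profile) -> float:
    """Score how *unlike* two users are; higher = better match for this tool."""
    score = 0.0
    if a.region != b.region:
        score += 1.0                                          # geographic gap
    score += min(abs(a.birth_decade - b.birth_decade) / 40, 1.0)  # generational gap
    overlap = len(a.liked_topics & b.liked_topics)
    union = len(a.liked_topics | b.liked_topics) or 1
    score += 1.0 - overlap / union          # Jaccard distance on opinions
    return score

def suggest(user: Profile, pool: list, n: int = 10) -> list:
    """Return the n people in the pool *least* like the user.

    A conventional recommender would sort ascending (most similar first);
    flipping the sort order is the whole 'anti-algorithm' idea.
    """
    return sorted(pool, key=lambda p: dissimilarity(user, p), reverse=True)[:n]
```

Note that the only change from a conventional similarity-based recommender is `reverse=True`: the same distance function that would normally surface lookalikes instead surfaces the ten most different people.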

Our goal is to help create space for listening and, hopefully, understanding. Along the way, we may even realize that our differences aren’t as insurmountable as they seem.

So, are you ready to pop your bubble? Give the tool a try here: www.popyourbubble.com.

About the Author

Elle Lanning
Elle Lanning is KIND’s Chief of Staff and Advisor to The KIND Foundation, a nonprofit established by the healthy snacks company.
  • Leon Galindo Stenutz

    In the wake of current events in the US and other emergent tendencies toward radicalization and separation in Europe, North America, and other regions, I was thinking about the deep causes, and potentially deep solutions, to these apparently complex, yet ultimately simple, social and political phenomena.

    Was right about to start writing a short essay on “Anti-Anti Algorithms” — on designing algorithms to counter extremism, radicalism, hatred, and separation of any sort… but stopped to check first what is already out there. Happy to see there are several essays, a TED Talk from a Stanford professor, and even an app already out there based on similar thinking. All good and necessary ideas come to fruition sooner or later :)

    Congrats to all the good people out there thinking along these lines.

