Dear Facebook, I’d Like a Do-Over Please

Hey, Zuckerberg, can I get a do-over on the whole Facebook algorithm thing? [Notice how confident I am that Mark Zuckerberg is an avid reader of my work]

If you said to me, “Look, Tony. There are far too many posts flooding Facebook for us to show you everything, so instead we want to figure out what you like and what you prefer so we can prioritize those posts. Please answer this questionnaire to help us identify your preferences,” THAT would be awesome. Even if you said, “We’re going to start using an algorithm to prioritize your posts based on what you Like, share, and comment on. That preference gathering period starts…NOW!” that would have been cool too.

But, you didn’t. Facebook is filtering my content based on a personalized algorithm that was created when I didn’t know the test had started. I’d like a reset so I can make better choices and clean that algorithm up a bit.

Help. I have apparently created my own personal echo chamber of sorts on Facebook. I have unwittingly painted myself into a corner where I live inside a bubble. Escape is possible but tedious. Facebook needs to provide some “reset button” that lets me wipe the slate clean and start over with a fresh algorithm.

Facebook made the decision a while ago to filter the posts you see by default based on an algorithm. You have control over that algorithm to an extent—Facebook determines what to show you, and what not to, based on your history of the types of posts you've Liked, shared, or commented on. There is an overwhelming amount of information flowing through Facebook, and I understand the value of something like the algorithm to sift through the noise and prioritize the posts you're most likely to prefer anyway.

Here’s the problem, though—and the reason I am not a fan of filtering by an algorithm. I didn’t know I was making decisions that were going to be used against me later. I didn’t know that the posts I Liked, shared, or commented on were part of some social experiment that would eventually dictate—more or less—what kinds of posts I get to Like, share, or comment on.

Yes, it makes sense to filter the countless posts on Facebook proactively to try to show me the content I am most likely to be interested in. It also creates a self-perpetuating information bubble. Someone who frequently Likes, shares, or comments on conservative or right-wing posts is going to see a higher percentage of those posts in their feed. That has two consequences: 1) They end up with a potentially false sense of confidence in a particular point of view. It seems popular—even if it's considered ludicrous by most of the world—because everyone in their social network bubble is talking about it. And, 2) They're even less likely to be exposed to opposing or alternative viewpoints because the algorithm focuses on showing them content they already agree with.
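To see why this loop is self-perpetuating, here's a toy sketch of engagement-based ranking. To be clear, this is not Facebook's actual algorithm—everything here (the per-topic counters, the scoring, the names) is an invented simplification to illustrate the feedback effect described above.

```python
# Illustrative toy model of engagement-based feed ranking.
# NOT Facebook's real algorithm; all names and weights are invented.
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    topic: str

@dataclass
class UserProfile:
    # Running count of engagements (Likes, shares, comments) per topic.
    topic_engagement: dict = field(default_factory=dict)

    def record_engagement(self, post: Post) -> None:
        # Every Like/share/comment quietly trains the filter.
        self.topic_engagement[post.topic] = (
            self.topic_engagement.get(post.topic, 0) + 1
        )

    def score(self, post: Post) -> int:
        # Topics you've engaged with before rank higher...
        return self.topic_engagement.get(post.topic, 0)

def rank_feed(posts: list, profile: UserProfile) -> list:
    # ...so the feed keeps surfacing more of what you already engage
    # with, which earns more engagement, which raises the score again.
    return sorted(posts, key=profile.score, reverse=True)
```

In this toy model, a few engagements on one topic push that topic to the top of every future feed, while topics with no engagement history sink—the bubble builds itself, and there's no "reset" for `topic_engagement`.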

See the full story on Forbes: Facebook Needs A Reset Button To Wipe The Algorithm Clean.
