photo via Kit O’Connell

I’ve been seeing a lot of post-election posts on Facebook about how we’re all in our own echo chambers and how too much fake news is being shared.

So many people are arguing that we all need to research articles before liking or sharing, instead of just passing along an article or meme because it lines up with our preconceived notions and beliefs.

Something about those arguments has felt off to me, but I couldn’t quite name it.

Then, Facebook memories showed me an article I shared last year when that fake Meryl Streep quote was going around. The article argues that what we share on-line has more to do with how we want to believe the world works than with the way the world actually works.

And I suddenly realized why I think the arguments I’m seeing are missing something important.

Yes, it’s true that Facebook’s algorithm filters out posts it doesn’t think we want to see. And yes, that’s a problem.

But not in the ways everyone seems to believe. Instead, it’s actually a symptom of an even larger problem with the narratives we create around social media itself and the assumptions we make about why people use it.

Google+ never really caught mainstream attention, but one thing that platform had going for it that Facebook lacks is the idea of circles: you could add a friend to a particular circle based on how you knew them, and what you posted to that circle would be seen only by its members. And the members of that circle wouldn’t know everyone else you were friends with, or which circle anyone was in.

Facebook lets you filter posts, but setting up various filters, maintaining them, and making sure you’re posting the right thing to the right one is a lot of work. More work than I have the time and headspace to be sure I’m getting right. And I think the same is true for a lot of people.

The reason Facebook’s algorithm is fucked up isn’t that it exists. It’s that Facebook forces you to have one large circle of friends instead of many small ones. We all have different identities and different roles that come out in different social groups. But Facebook forces you to choose one on-line persona.

And what that means is that every post you make on Facebook is a performance of that particular on-line identity. So for a lot of people, sharing a political article isn’t based on an estimation of whether the information in that article is factually accurate (unless research and factual accuracy are part of your Facebook persona’s values). It’s about communicating to the larger Facebook community what is important to that on-line persona. It’s about taking a stance and communicating that stance.

The reason people are sharing fake news on Facebook isn’t that we’re too stupid to research articles we read on the Internet. It’s that what a person is trying to communicate in that moment has no relation to the truthfulness (or falsity) of the information. Rather, sharing the article is a way to maintain a cohesive persona across multiple social identities.

The way to avoid a Facebook echo chamber isn’t to petition Zuckerberg to prevent fake news websites from posting on Facebook. It isn’t even just bringing back the real-time timeline, which shows the most recent posts in your feed rather than relying on an algorithm (though that might help).

The way to stop the Facebook echo chamber is to allow ourselves to be the complex and multi-faceted individuals we are. It’s to create circles of friends. That way, I’m not performing my identity as an activist by sharing a political thinkpiece defending my choices; instead, I can share important information about local protests with my other activist friends who already take that identity for granted. And I can really think about what I want to communicate to my racist family members, and post to that filter accordingly. I can keep my chosen family separate from my coworkers, and both of those separate from the acquaintances who might only want to see memes from me or pictures of my cat.

If I could curate each post to its intended audience, I would likely choose a different political article to share with my liberal friends than the one I share with my conservative family, and the information I communicate to each group would be better suited to its needs as a result. Instead, I’m left trying to find one article that can speak for all of me, or getting into arguments with Facebook friends who weren’t even the intended audience of that particular message. When communication has to be broad instead of specific, the positions we take automatically become less nuanced and more divisive. If I am communicating who I am with what I share, I’m more likely to be on the defensive.

The election didn’t create this problem. It existed last year when people wanted to believe Meryl Streep had been turned down for a role for being ugly. It existed before then, too. The problem isn’t Trump or fake news websites, though both might be symptoms. The problem is that we all perform multiple identities every day, and Facebook makes us choose only one. So the bulk of our communication on-line becomes a performance of identity, and the attempted creation of a cohesive self.

If we want to fix Facebook, we should all take a page from Walt Whitman: “Do I contradict myself?/Very well then I contradict myself,/(I am large, I contain multitudes.)”

It’s okay if you share an article on Facebook without fact-checking it. Maybe facts aren’t even what you’re trying to communicate on-line. Understand what you’re using social media for, be honest about that, and accept that other people are using it differently. We don’t all have to fit into one box. It’s actually foolish to try.
