At one time in the past, maybe before Nov. 8, Facebook felt pretty familiar and comfortable. It wasn’t always perfect: I’ve complained over the years about its quirks, its tendency toward creating an echo chamber where you only see what you want to see, and its massive privacy failings.
But for all its faults, Facebook is where many of the people I love and respect spend a lot of their time. It’s where we communicate, make jokes, share news, post photos of our families and adventures, sell stuff, wish each other happy birthday and seek help and advice when we need it. It’s an overly cozy home stuffed with nearly 1.8 billion people. Any expectation that it would eventually fade away like MySpace or other too-early-to-the-game social networks is gone. Facebook’s not going anywhere and neither are we.
Since the presidential election, however, those of us who spend lots of time there have been forced to confront a pretty disturbing truth: Facebook is not the place we thought it was. We had an illusion that this online homestead was a safe space, a comfy digital zone away from the ugliness of the real world. That’s gone now.
It’s not just the politics, though the week of the election brought out an ugliness on Facebook I think might be unprecedented on the social network. What had been simmering all year suddenly boiled over as the presidential election cycle made online friends hostile and prompted many to mute or unfriend those whose political rantings were creating stress. Even for those who tried to put on a calm face and encourage everyone to be polite and respectful, things got painful. Within my own posts, friends I’ve known since high school bickered with one another and name-called total strangers. Some people I know demanded that anyone on their friends list who voted against their candidate break that friend link.
Because I’m Latino, I wondered if I was being particularly oversensitive to some posts that touched on immigration and the nation’s other racial issues. The term “illegals” is a trigger for me, as is any claim that immigrants contribute nothing to the U.S. economy. I saw a lot of hateful anti-immigrant memes in November, enough that it made it easy to remove some longtime Facebook friends who were passing them along.
But I knew I wasn’t overreacting when I was outraged by racist posts, apparently since deleted by Facebook, suggesting there might be an organized white-supremacy event in New Braunfels, the smallish central Texas town where my family lives.
I reached out as a journalist to Facebook, asking for more details about how these posts were handled, and was met with a frustrating lack of information and transparency. The company confirmed that it does have a team that handles complaints about hate-speech posts, but as NPR discovered by flagging a number of posts and seeing what Facebook did with them, the social network’s outsourced efforts are sometimes woefully inadequate at balancing free speech with keeping hateful, abusive posts off everyone’s feeds.
It’s disturbing when posts like this are coming from people you’ve chosen to label as friends or neighbors you’ve always assumed shared your values. For many of us, it was eye-opening to see the kinds of things real-life acquaintances, community leaders, online-only friends and relatives were posting in the days before and after the election. I have no interest in only having online friends who share the same opinions and parrot the same ideas within an echo chamber. But I won’t abide people who live their lives rudely, lashing out at others and punching down at groups of people. I found that happens across all political stripes; every party has its poopers, outliers and crackpots.
And then there’s the fake news. For a long time, like anyone who spends a lot of time on Facebook, I’ve been disturbed by the number of click-baity, attention-seeking pseudo-news stories that are placed like landmines around stories from more legitimate news sources. I ignore them, but it’s all too easy to imagine a person clicking on anything that appears to be framed like a news story on Facebook and assuming it’s been vetted somehow. The reality is that many people get their news only through the filter of social media rather than directly from news organizations themselves. When the news is skewed at the source, how will they know?
Facebook’s chief executive Mark Zuckerberg has spent time since the election fielding questions about whether the social network should be held responsible for not curbing fake news sooner. Zuckerberg has said he believes the idea that Facebook may have influenced the presidential election in some way is ridiculous, but the sheer number of users on Facebook and the amount of time people spend there, absorbing information, would suggest otherwise. The CEO was in the weird position of laughing off the idea, while also touting the power of advertising on Facebook. It wasn’t a good look.
My prediction: Facebook will take its cue from an internal task force that has already been formed to tackle fake news and make some significant changes over the next six months to fend off these criticisms. It’s not just for the users: many advertisers will balk at being associated with bad, potentially dangerous information. I think Facebook will also have to decide whether it wants to devote significant resources to combating harassment and bullying on its network.
Many people have already decided Facebook’s current state is too much. Some have decided to take a complete break from the network, or at least sharply curtail the amount of time they spend reading and posting. Others did a mass purging of their Facebook circles in order to regain some control of their feeds. But the overall mood I’ve seen on Facebook is a growing, uneasy distrust of the platform itself.
This is a place that, without a conscious effort, made it easy to pit large groups of people against each other and has been slow to address complaints of abuse. It’s a place that has allowed misinformation to spread and poison the community’s hive mind. And in the way that we’ve ceded control to Facebook’s algorithm to show us the things we want to see for so many years now, we’re complicit. We helped Facebook get to this point.
The next six months will be a test for Facebook. Is it going to lead by example, cleaning up its act and making itself a more hospitable place for its users? Or will it continue on its path, denying its role as a media company?
When there’s mold in your house you can choose to stay and clean it up. But if it’s so bad that the only solution is to walk away and rebuild, what then? What if everyone you care about is still stuck inside?