
Charlie Warzel: What Facebook fed the baby boomers


In mid-October I asked two people I’d never met to give me their Facebook account passwords for three weeks leading up to and after Election Day. I wanted to immerse myself in the feeds of a type of person who has become a trope of sorts in our national discussion about politics and disinformation: baby boomers with an attachment to polarizing social media.

I went looking for older Americans — not full-blown conspiracy theorists, trolls or partisan activists — whose news consumption on Facebook has increased sharply in the last few years. Neither of the two people I settled on described themselves as partisans. Both once identified as conservatives and had been drifting slowly leftward, until Donald Trump’s takeover of the Republican Party gave them a final push. Both voted for Joe Biden this year in part because of his promise to reach across the aisle. Both bemoaned the toxicity of our current politics.

The feed goes on like this — an infinite scroll of content without context. Touching family moments are interspersed with Bible quotes that look like Hallmark cards, hyperpartisan fearmongering and conspiratorial misinformation. Mr. Young’s news feed is, in a word, a nightmare. I know because I spent the last three weeks living inside it.

Despite Facebook’s reputation as a leading source of conspiracy theories and misinformation, what goes on in most average Americans’ news feeds is nearly impossible for outsiders to observe. Tools like CrowdTangle, which track “engagements” with social media posts, are the best available means of understanding what is popular on the platform, though Facebook (which owns CrowdTangle) argues that CrowdTangle data is not a reliable indicator of how many people saw a post.

After years of reading about the ways that Facebook is radicalizing and polarizing people, I wanted to see it for myself — not in the aggregate, but up close and over time. What I observed is a platform that gathered our past and present friends, colleagues, acquaintances and hobbies and slowly turned them into primary news sources. And made us miserable in the process.

In February, Mr. Young sold his small business and moved from Missouri to just outside Phoenix. The plan was to ease his way into semiretirement. A week later, the coronavirus pandemic ground his life to a halt. That’s when his time on Facebook began to skyrocket.

“It got to the point where I literally had to start leaving my phone in the other room,” he told me recently. “The problem is mostly that I don’t have anything to do.”

Mr. Young joined Facebook in 2008 as a way to reconnect with his high school classmates from Illinois. He reunited quickly with old friends and neighbors. It was exciting to see how people had changed. Occasionally, he’d get tagged in an old photo or his news feed would surface an old memory from his classic rock cover band, Archive.

It was a little voyeuristic, nostalgic and harmless fun. Before 2016, Mr. Young told me, he’d see the occasional heated disagreement. It wasn’t until the last few years that his feed really started to turn divisive.

He first noticed it in the comments, where discussions that would usually end in some version of “agree to disagree” exploded into drawn-out, conspiratorial comment threads. Political disagreements started to read like dispatches from an alternate reality. He didn’t enjoy fact-checking his friends or picking fights, but when a post appeared obviously untrue he had to say something.

His time on the site ticked upward.

“It’s like going by a car wreck. You don’t want to look, but you have to,” he said. He believes his feed is a perfect storm for conflict in part because he’s lived in both liberal and conservative areas of the country, and throughout his life he has lived among, worked with and befriended all manner of liberals and conservatives. “I started referring to Facebook as my Hatfield and McCoy feed and laughed it off,” he told me, referencing the infamous family feud.

But then he noticed some of his friends starting to post more political memes, often with no link or citation. When he’d try to verify one, he’d realize the post was fake or had been debunked by a news site. “Most times there’s no real debate. Just anger. They’re so closed-minded. Sometimes, it scares me.”

Scrolling through Mr. Young’s feed after Election Day, I found a number of these posts.

Mr. Young’s feed stood in stark contrast to the other Facebook account I spent time in. That feed belongs to Karen Pierce, a 55-year-old schoolteacher from Virginia. Ms. Pierce described herself to me as a “middle-child peacekeeper who is uncomfortable with politics.”

Unlike Mr. Young, she is not politically active on Facebook and never intervenes, even when she sees things she thinks might be conspiratorial or fake. As a result, her feed surfaced less politically charged content. The day after the election, the first post I noticed from a friend in her feed was a simple, apolitical exclamation: “It’s official! I make a damn good pot of stew!”

The political posts that appeared in Ms. Pierce’s feed were mostly anodyne statements of support for the Biden-Harris campaign peppered in between comments from fellow teachers frustrated by remote learning and an avalanche of cute dog photos and memes. Occasionally, a meme popped up mentioning Hunter Biden’s laptop, but most lacked the vitriol or the contentious commenter debates of Mr. Young’s feed.

Yet, in my conversations with Ms. Pierce over the last month, she expressed just as much frustration with her experience on Facebook as Mr. Young did. “It’s so extreme,” she told me in mid-October. “I’ve watched people go from debating the issue to coming up with the craziest thing they can say to get attention. Take the whole anti-abortion debate. People started talking, then started saying ‘if you vote for Biden you’re a murderer.’ Now there’s people posting graphic pictures of fetuses.”

When I told her I hadn’t seen anything that extreme on her page, she suggested it was because of a three-month break she took from the platform this summer. “It got to be too much with the pandemic and the politics,” she said. The final straw was seeing people in her feed post QAnon-adjacent memes and content. “There was a lot of calling Biden a pedophile. Or Trump voters posting pictures with assault rifles. It made me very uncomfortable.”

Like millions of Americans, Ms. Pierce logs onto Facebook to feel more connected. “I use it to see how people are doing,” she said. “I believe in prayer and sometimes I check to see who is struggling and to see who to pray for. And then, of course, you see some news and read some articles.”

It was when she was using the platform for news that she started seeing disturbing, conspiratorial posts from people in her network. “It was so disappointing to realize the hate that’s out there,” she said. “I think that, whatever you believe is your right. I hate the negativity and meanness. Even worse is seeing somebody you think you know say something they wouldn’t normally say.”

She’s worried about the long-term effects of such a toxic environment. “I think it’s affecting the mood of everybody.”

______

Living inside the Facebook accounts of strangers — even with their permission — feels invasive, like poking around in their medicine cabinets. But it offered me a unique perspective. Two things stood out. The first is the problem of comments, where strangers, even under the most mundane of articles, launched into intense, acrimonious infighting. In most cases, commenters bypassed argumentation for convenient name-calling or escalated a civil discussion by posting contextless claims with no links or sources. In many cases, it appeared that a post from one user would get shared by a friend into his or her network, where it would be brigaded by strangers.

The more I scrolled through them, the more comments felt like a central and intractable issue. Unlike links to outside articles, comments aren’t subject to third-party fact checks or outside moderation. They are largely invisible to those who study or attempt to police the platform.

Yet in my experience they were a primary source of debunked claims, harassment and divisive rhetoric. I showed one comment thread to a colleague who doesn’t use Facebook, and they found it shocking. “Facebook created a town hall for fighting,” they said. “It’s almost like if you were building a machine to make a country divisive and extreme — if you were to sit down and plan what that would look like — it would be this.”

But Facebook wasn’t originally built this way; it evolved this way intentionally. And that evolution, from a friendly social networking site into the world’s largest information platform, is the source of its biggest problems.

Sifting through Mr. Young and Ms. Pierce’s feeds and talking to them about what I saw, it became clear that the two found themselves tormented as a result of decisions they made in their early days on the platform. Both explained that they joined to reconnect with old friends.

Like most of us, they gave little thought to the connections they made. Mr. Young added friends he hadn’t spoken to in decades. When Ms. Pierce joined a nonprofit organization, she accepted dozens of friend requests — some from people she’d met only in passing. “I meet people on airplanes all the time and we exchange Facebook handles,” she told me.

But as Facebook evolved, these weak connections became unlikely information nodes. Mr. Young and Ms. Pierce were now getting their commentary from people they hardly knew, whose politics had once been unknown or illegible.

“When Facebook first started it made me feel so good. It feels like I signed up for one thing and it’s become something totally different,” Ms. Pierce said.

It’s a detail that’s often lost in the discussion around baby boomers and Facebook, which can sometimes unfairly malign the generation as clueless dupes and vectors for disinformation.

Joan Donovan, the research director of the Shorenstein Center on Media, Politics and Public Policy at Harvard’s Kennedy School, described this phenomenon as what happens when “social-networking sites transformed into social media,” creating “a digital economy built on engagement.” Dr. Donovan argues that this decision spawned the algorithmic echo chambers we now live in and created a fertile environment for our information crisis.

For Mr. Young, the fallout of these decisions is painful. After weeks of watching his feed, I presented him with some of its most notorious posters. When I read aloud the name of one Facebook friend who constantly shared debunked claims, often with language intended to provoke, he sighed. He described the person as a longtime friend and neighbor who was once so close that they practically lived at each other’s houses. Now, he spends his time debating whether it’s worth the energy to try to stop the friend from sharing conspiracy theories.

“I still love this person,” he said. “We could get together, I hope, and tip back a few drinks. But almost overnight I was like, ‘What happened to them? Who is this person?’ It’s made me so sad for the last year or two. They’ve fallen into almost a trance. I’m so disappointed by it.”

The psychological toll of watching friends lose touch with reality has both Mr. Young and Ms. Pierce re-evaluating their choice to spend so much time on the platform. Mr. Young, for his part, tried to stay off during election week; Ms. Pierce is heartened that her feed has become less toxic after her Facebook sabbatical and is planning another. “My emotional and mental state improves greatly the further away I get from this place,” she told me.

Even if both manage to stay away from Facebook for good, their stories are just two in a sea of billions. No story is the same because no feed is the same. And yet the same dynamics that tortured my two participants — a flood of contextless news and acrimonious comments revealing their neighbors’ worst selves — are on display for millions of Americans every day.

That thought bounced around my head this month as the election returns trickled in, revealing in unmistakable detail the footprint of a divided nation. For those expecting, or at least hoping, to see broad consensus, the results were a gut punch. It is difficult for many to accept that a significant number of Americans see the world through different eyes and, in many cases, are constructing an alternate reality. It confounds pundits, pollsters and our neighbors alike.

But seen through the lens of our online lives, such an outcome makes perfect sense. The effect is uniquely difficult to map or quantify, but it is impossible to ignore: how we behave on platforms like Facebook both shapes and reflects a core expression of our identities.

Ben Collins, an NBC News reporter who covers disinformation and online extremism, argued recently that one takeaway from the election ought to be that it’s “well past the time to start realizing that what people say on Facebook and in comments sections is what they actually mean.” The fights, the jokes and the memes are all expressions of legitimate feelings — all the more authentic because, unlike when we get a call from a pollster or talk to a reporter, we often feel as though nobody’s watching.

Despite spending years studying these toxic dynamics and the better part of a month watching them up close in strangers’ feeds, I was still, like so many, surprised to see it all reflected at the ballot box. We shouldn’t have been surprised; our divisions have been in front of our faces and inside our feeds this whole time.


Charlie Warzel, a Montana-based Opinion writer at large for The New York Times, covers technology, media, politics and online extremism.