Latest Facebook Scandal And The Bigger Picture
Updated: Oct 26, 2021
Facebook and scandal. Two words that go together like peanut butter and jelly, really. The company has survived so many scandals with relatively little effect on its bottom line. At this point, the latest one seems to have gone in one ear and out the other for a lot of people - a quick headshake, a "Facebook, y'all play too much," and we were on to the next thing.
But this scandal might be bigger than you realize, and maybe we need to be paying better attention. I contend that Facebook, through the methods outlined by the whistleblower, not only worked to maximize user engagement for profit - it also bears a significant share of the blame for allowing the QAnon/Trump conspiracy/red pill insanity to infect vulnerable people.
People Sinking Into QAnon Theories
For the last few years, people have bemoaned losing their loved ones to a sea of bizarre and ridiculous conspiracy theories. Good, loving people who were suddenly immersed in QAnon, refusing to accept any evidence to the contrary and cutting off family who contradicted their fantastic web of misinformation.
I've read the pieces, one after another, painting a dismal portrait of an entire population so far down the rabbit hole, it seems unlikely they'll ever come back up for air. But I noticed, even back then, one thing that came up time and time again. These pieces note how the theories spread primarily through social media, but whenever personal accounts are given, it's almost always one specific platform.
Yup, that's right. Facebook.
It's interesting to see that we already (mostly) understand how QAnon takes hold of these people. These are (presumably) ordinary people, who get one bizarre link or "hot take" shared on Facebook. It's someone they know, maybe a bit outside of their circle, but still familiar enough to trust.
It's always something urgent, enough to galvanize any decent person into a call for action. Accusations of misappropriated funds or pedophilia are common themes. This person wants, of course, to know more. So they check one thing, which sends them to another, and another, and another.
Going Deep Into "Anti-Deep State"
Once someone has been exposed to a QAnon "fact," it's almost a given that they'll keep pulling that thread. They begin following specific pages and sites, creating a strange echo chamber where any information that doesn't agree must inherently be false. After all, they're surrounded by plenty of people who do agree - and everyone can't be wrong, right?
We've known for a while now that it's prolonged exposure to these bizarre ideas that creates a mind fertile for indoctrination. The theories slowly limit your worldview, using a spokesperson who tells you to discredit everything you hear except from them. You're exposed to the same falsehoods over and over, eventually hearing nothing else (they told you to stop listening, remember?). After a while, even obvious lies start to sound like the truth. It's a crazy, manipulative gaslighting effect that's obvious to those who have carefully avoided it.
But now it seems that Facebook actually encouraged this prolonged exposure. No wonder it took such a foothold there. I gave up Facebook because I was tired of seeing "normal" people I used to know, people I didn't recognize now as they spouted bizarre QAnon theories and turned red in the face every time they were contradicted.
But if the reports are true, Facebook is responsible for a large share of this. A cancer was spreading that required time and exposure, and Facebook seemingly engineered its platform to give these people exactly the time and exposure necessary. Once they were in deep, they needed Facebook to feed the obsession. Facebook made more money - and according to the whistleblower, it knew it.
What's Different About Facebook?
You may argue, "But other social media can do this too, you can't blame Facebook." And that's true, to a point. But I don't see as many people blaming Twitter or Pinterest, and I think there's a reason.
I evaluated a marketing chart designed to compare social media users across platforms. There's no expectation of bias here - the data is unrelated to any accusations against the company; it exists solely to help marketers find their "target audience." I compared Facebook to other social media platforms and drew some conclusions:
The Average Facebook User's Age
The most noteworthy difference about Facebook? The users' age. Almost every social media site shows a drop-off in use correlated to age. The 18-29 demographic is the biggest user of sites like TikTok or Twitter, and usage decreases as we look at older populations.
Facebook, on the other hand, is used most by people ages 30-49. Even in older groups, it's still quite common: 73% of adults ages 50-64 use it, as do half of people ages 65 and older.
In my opinion, this is the population most likely to fall for misinformation on the internet - to take the bait. The younger generation understands that social media is for entertainment, not education.
Younger people know you can't believe a post just because its author calls themselves "Mr. Professor." It's the older population, the people who didn't grow up with the Internet Beast, who don't know how to defend themselves. These were the people who couldn't tell that SNL's joke commercials weren't for real products; these are the people who can't tell when a wannabe entertainer is manufacturing a video for clicks.
I have one friend in particular who fell for it, hook, line, and sinker. I haven't talked to him in years now, but I did return to my old Facebook just to glance through his profile. It's full of "shares" from people who are only famous because of this weird conspiracy theory notoriety they've gained. They're worshipped on Facebook by people like my friend, who applaud them for "telling it like it is."
Imagine my (lack of) surprise when I looked up these people, these supposedly "reputable sources" who come with no real explanation as to why they're credible - only to find, time after time, people with a net worth in the millions. These pseudo-celebrities, their internet notoriety grown on Facebook, are worth millions. And they're convincing my friend, who lives off government aid, that the rich are stealing his money through some insane web of evil Democrats. It would be hilarious if it weren't so damn sad.
How Facebook Works
There is one more flaw that I think is inherent in Facebook, and less prevalent in many other forms of social media: the ease with which you can curate your list of contacts, and surround yourself with like-minded theorists.
Sites like Twitter make it easy to search for things based on hashtags or news, without necessarily cultivating who you hear from. For example, if a certain musician is trending, you'll find both people who love them and people who hate them. Even without searching, your feed contains "tweets" from people you follow and complete strangers alike.
Twitter allows you to announce your thoughts, mostly to the world at large, and see the response. Yes, I realize there are privacy settings if you choose them. But it was designed for more random, scattered social interaction - shouting into the void to see who shouts back.
Facebook, on the other hand, was always meant to keep you in touch with specific people you chose. You don't really go looking for strangers on Facebook, you find them by name. If you don't try to find them, you'll never interact with people outside of your "chosen" group. Once upon a time, it was a place to make a list of friends from college, or your fellow PTO moms.
But now, this same design makes it easy to build a friends list that is primarily people pushing QAnon. You add one person, Facebook suggests 10 more, then 10 more after that. Suddenly this is your entire worldview, and it seems normal because it's the worldview of all these "friends" you talk to every day.
Facebook Implies It Offers Credibility
For some time now, Facebook has been walking a fine line in censoring posts for accuracy. It makes sense as an attempt to cut back on the spread of misinformation, but I think it backfired. When you post a warning underneath certain posts informing people that "this information is false," there's a certain assumption that any information WITHOUT a warning must be true.
But that's simply not the case. Facebook cannot and does not monitor every post made. And as we've already said, Facebook encourages a certain amount of privacy. As a result, entire groups exist that trade almost entirely in false information - but no one outside the group ever sees it, or reports it. All of those unchecked statements go unmarked as false - even though they often are.
It backfires on Facebook no matter which way it goes. If a post gets marked as false, theorists moan that Facebook wants to control free speech. If falsehoods go unlabeled, the silence seems to suggest they could be true. I'm not sure the system had any real benefit, and it may be better to follow the example of most other social media: allow users to report hate speech or threats, for example, but make no attempt to discredit "shares" - forcing users to figure it out for themselves.
Facebook is also (to my knowledge) the only site that tries to mandate that its users use their real names. Again, this reinforces the closed, private nature that seems to be Facebook's inherently exploited flaw - and it also creates the illusion of authority. No one would lie under what appears to be their birth name, so they must really believe everything they say! On sites like Twitter, there's no imperative to believe anything you're told by some guy who calls himself "Pumpkin Ska." Celebrities can, of course, be verified as authentic, but you have to be a known public figure in the first place.
Mark Zuckerberg has adamantly denied that Facebook ever used deliberate tactics to drive users to engage for longer periods of time. The whistleblower says differently. I suppose we won't know for sure until we see how Congress proceeds with the matter.
But it was obvious that long periods of engagement on social media were a driving force for those who got ensnared in this "deep state conspiracy web." Enough people have spoken out about this very matter. Facebook might not be the monster itself, but it was the breeding ground. It's hard to believe Mark Zuckerberg can argue he didn't know - he was feeding it.
Jamie Dixon is a contributing writer here at The Pyrrhic. She's a content writer by profession, but this is more fun. She's also working on her first novel in her spare time.
Find her on Twitter: @onegirloneblog