Right-wing pages suddenly disappeared from Facebook's top 10. But don't be fooled.
Angelo Carusone of Media Matters on why outrage bait thrives on Facebook, the problem of misinformation in comment threads, and more.
Check out Facebook’s Top 10 list of top-shared links on most days and you’ll understand why the platform has a reputation as a right-wing propaganda machine. You’ll see lots of stuff from right-wing personalities like Dan Bongino and Ben Shapiro and outlets like Newsmax and Fox News — with nary a progressive page to be found.
So when right-wing pages suddenly disappeared from the top 10 ranking on September 28 and were replaced by a variety of hard-news sites, people wondered if it reflected some sort of change to Facebook’s algorithm.
But don’t be fooled — right-wing content still dominates Facebook. As Media Matters pointed out on Twitter, the September 28 links rankings were skewed by breaking news of R. Kelly’s guilty verdict and as such represented “a good instance of links not telling the full story.”
Indeed, by October 5, Facebook’s Top 10 — compiled by New York Times tech columnist Kevin Roose, based on data from CrowdTangle, which is owned by Facebook — was once again dominated by the Bonginos and Shapiros of the world.
Right-wing content does so well on Facebook for a variety of reasons. As Judd Legum has detailed in Popular Information, Facebook’s algorithm has a preference for outrage bait that’s built for engagement. Facebook has also proven extremely sensitive to criticism from right-wing figures. Its vice president for global public policy is Joel Kaplan, a former George W. Bush administration official who is tight with Brett Kavanaugh and reportedly has intervened on behalf of right-wing pages during content disputes.
As Public Notice launches this week, I thought it would be timely to check in with the president and CEO of Media Matters, Angelo Carusone, to get up to speed on all things Facebook. Our conversation took place a few days before Frances Haugen, a former Facebook product manager-turned-whistleblower, testified to Congress about how the company prioritizes profits over the public good.
Public Notice provides independent, reader-supported coverage of US politics and media. Both free and paid subscriptions are available. The best way to support my work is with a paid subscription.
"Facebook's own research about Instagram contains quotes from kids saying, 'I feel bad when I use Instagram, but I also feel like I can't stop,’” Haugen said on Tuesday, in comments that prompted some truly unbelievable spin from the company.
As Carusone explained, Facebook prioritizes engagement above all else, from right-wing outrage to Instagram content that the company knows can be harmful for teens.
“Facebook is responsible for not applying their terms of service, their rules around authoritative content,” he told me. “The bottom line is, no doubt they're getting benefit from the extreme, from the outrage.”
A transcript of my conversation with Carusone, lightly edited for length and clarity, follows.
Aaron Rupar
I want to talk about the Facebook Top 10 issue a little bit. On September 28, the list suddenly didn’t include Bongino or Hannity or other usual suspects such as the Daily Wire. But it’s not like Facebook was intentionally curtailing right-wing content that day. It’s just that there was some breaking news getting a lot of play, and those sites got pushed out. Right?
Angelo Carusone
That's exactly right. And what [a one-day look at the top 10 list] doesn't tell you is whether it's a broader trend — what does the actual information landscape look like over time, and what's happening beyond those top 10?
[Facebook] can point to a moment in time. The analog to this is in cable news, where when there’s breaking news or a major crisis, CNN’s ratings skyrocket. But it’s not because CNN is a cable news juggernaut in terms of their ratings. It’s that in those moments they end up being the gravitational force, because people want the information and some who aren’t typically engaged are tuning in, so they get all the residual benefits. But if you were a media buyer or a cable company, you wouldn’t peg your business investment and decision to those breaking news moments. You would look at the whole picture.
Facebook is asking us not to apply that same rubric when it comes to analyzing what’s happening on their platform as it relates to the types of information that are being privileged or advanced or advantaged. It’s important to Facebook for people not to ask those questions, because when you start to look at the bigger picture over time, the trends are really disturbing, they don't reflect well on Facebook's past, and they actually don't bode well for all of us going forward.
Aaron Rupar
What does that broader view of the Facebook top 10 list over the course of this year show us?
Angelo Carusone
What you’ll find is two things of significance this year so far. One is that right-leaning content has increased its engagement and its traffic. It has gotten a larger share of voice on Facebook, while non-aligned [pages] and news sources have seen a small uptick — a very, very modest uptick. Left-leaning sources dropped in terms of their reach. That’s the first thing we’ve seen.
The second thing is that video, especially right-leaning video, has gotten an enormous amount more traction and reach. Things like Epoch Times are disproportionately outperforming others.
So basically the two takeaways are that right-leaning content on Facebook is expanding its share of voice, and in terms of the delivery mechanism, there has been a lot more engagement around video as opposed to just straight links.
Aaron Rupar
Is right-wing content thriving even more this year on Facebook because of its algorithms, or is it more attributable to a dynamic where right-wing outrage bait is just something people click on and share more?
Angelo Carusone
I think there are a few major ingredients that go into answering that question. The first is the content itself. Maybe they’re just better at engaging on social media because the content they produce is higher-valence and more emotional, which is going to get you a stronger response. By virtue of being more extreme and more responsive to the way social media works, they’re going to get a benefit from that. That’s one piece of it, absolutely, no doubt. But Facebook only has a small bit of culpability there.
Where Facebook does have enormous amounts of culpability is on the authoritative-score side. Facebook is responsible for not applying their terms of service, their rules around authoritative content. You can post all kinds of really extreme stuff, and if it’s not coming from an authoritative source, it should actually be getting a slight bit of depression in the news feed. It shouldn’t be getting highly recommended. But how much are they actually depressing the reach and engagement of non-authoritative content?
And then another thing is pay-to-play. I mean, Ben Shapiro’s content disproportionately outperforms [other sites], and he’s also one of the single biggest advertisers on Facebook. He’s spent millions of dollars on Facebook advertisements.
What ends up happening when you run Facebook ads, especially if you’re a creator, is that you get residual benefits from paying for high levels of engagement. Facebook essentially promotes your next bits of content, even if you aren’t paying for that, because there is a residual [effect]. When you apply that to things like Epoch Times and other deep-pocketed right-wing sources, they put money into the system to promote their content, and they get a residual benefit that is largely invisible to most users.
The bottom line is if Google search results were as unreliable as Facebook's newsfeed when it comes to scoring things, Google would have gone out of business.
On the long-term side, the area where Facebook has done the most to advantage the right is in helping them build new communications infrastructure. There’s no reason why they should have been building the QAnon community, for example, at the speed that they were. QAnon groups had a more than 20 percent growth rate last year. Right-leaning pages only had a 2 percent growth rate, and that was considered to be really, really high.
That growth was not just the result of QAnon people. You’re getting it because Facebook’s recommendation engine is helping connect like-minded individuals. And then what ends up happening is they let them get away with breaking the rules. They don’t undermine some of the content in the way that they should. They don’t put any friction into the system to short-circuit their recommendation engines.
That’s where I think Facebook has the culpability, and the person that makes those decisions is a guy by the name of Joel Kaplan who is basically a Republican operative.
Aaron Rupar
What’s the situation on Facebook in terms of vaccine misinformation these days? The platform received a lot of blowback over the summer for how much this kind of content was thriving.
Angelo Carusone
This, I think, is a good illustration of how, when you look at how Facebook handles these issues, most of what you’re seeing in the policy realm is actually more public relations and performance than actual policy. So, for instance, when they were on the receiving end of all this pressure around the proliferation of vaccine misinformation in the late spring and early summer, they started to talk about, ‘Oh, look, Tucker Carlson is no longer our top piece of vaccine content.’ They also started doing more takedowns [taking down content].
So, at the top level, you looked at some of those stats and said, ‘Okay, yeah, wow, they’re making an improvement. They’re taking this down, they shut down a bunch of these groups. The top or highly promoted content has changed.’
The problem with that is twofold. One, it takes most of the heat off of them, because now they can say, ‘Look, we did all these things,’ but when you go one level deeper what you find is that there's still a disproportionate amount of vaccine misinformation, it’s just being distributed one layer below the post. They're doing it in the comment section.
That matters, because if you have a piece of content that’s right on that line of Facebook’s [official policy] when it comes to vaccine misinformation, and it has 80,000 engagements and is getting circulated to hundreds of thousands or millions of people, and the top comment right under it is actually a total fabrication, a piece of anti-vax misinformation, or a link off the platform to get to anti-vax content — it’s basically being seen by almost the same number of people.
Also, you still see the mass proliferation of anti-vax-adjacent content. Somebody promotes ivermectin, which is still wildly popular on Facebook, and implicit in the idea that you should be taking ivermectin is the idea that the vaccines are not working, that they’re not effective.
So the post itself may not contain vaccine misinformation, but then they just wrap it around something that is even more toxic and destructive and that Facebook has not been as aggressive and vigilant about.
That’s the lesson that we’re seeing now with this top 10 thing. They get the heat, they put a little performance out there, a little public relations, but ultimately the problem morphs into something different, and then you have to go through the same cycle all over again.
Aaron Rupar
Facebook drew lots of criticism when it was revealed it was partnering with a subsidiary of the Daily Caller [and later the Daily Caller itself] to fact-check news articles, and also when it classified Breitbart as a “trusted” news source. How are those decisions playing out?
Angelo Carusone
Those decisions were rooted in a very intentional effort to mollify right-wing critics. There’s absolutely no reason to say Breitbart and the Associated Press are in the same category unless you’re trying to inoculate yourself against criticism from right-leaning audiences that you’re in the tank for the liberal media, and the same thing applies to fact-checking.
Keep in mind that when they [announced the partnership with the Daily Caller subsidiary], the attacks on Facebook for being anti-conservative were at a fever pitch. They had launched the anti-conservative bias audit. They had taken a couple of steps dating all the way back to 2016, including changing the trending topics section, because Glenn Beck and Tucker Carlson and Laura Ingraham said Facebook was censoring conservatives, even though their internal data showed the opposite.
I also think that the current environment we’re talking about — the imbalance of right-wing content versus other kinds, this proliferation of right-leaning misinformation — is the result, to some extent, of all that past mollification of right-wing [sentiment]. They were trying to say to them, ‘Hey, look, we’re not liberal,’ the same way that newsrooms in the late ’90s were pushing climate skepticism because they didn’t want to deal with the idea that they were opposed to the right. The right works the refs.
Aaron Rupar
On a related topic, how likely do you think it is that Trump will be allowed back on Facebook ahead of 2024? An announcement from Facebook’s oversight board in May left the door open to his posting privileges being reinstated as early as late this year.
Angelo Carusone
This is what we've been warning about. The way they approached this revealed their hand a little bit — that they intend to let him back on, absent some incredible pressure or some other compelling reason. And I think everybody missed this at the time or didn't really think about the language and what Facebook's announcement said, because there was too much schadenfreude about Trump not being allowed back on immediately.
But what they actually said wasn’t that Trump had to prove himself to come back on, but rather the opposite — that Trump will automatically be restored unless there’s some compelling public security reason not to. I think this is the most significant part of the announcement, and it got buried. The burden of proof is actually on Facebook or others to demonstrate that there’s some public security threat; otherwise, he will be restored. And I think that specific language they used — the ‘will be restored’ thing — demonstrates to us where this is headed.
Aaron Rupar
Let’s end with a zoom out. Obviously after the 2016 election there were a lot of headlines about Russian interference on Facebook and misinformation on Facebook, and that led to some of the fact-checking that they now try to do. What lessons do you think Facebook learned from that, and what didn’t they learn that they should have?
Angelo Carusone
I think you have to acknowledge when meaningful changes have taken place. There's no doubt that Facebook has done some things that were not happening in the past. One of the biggest is that there are now moments of meaningful intervention, when they have been more engaged.
For example, Plandemic 1 [a viral conspiracy theory video about Covid] was a real problem. Plandemic 2, not so much. That’s not an accident. It’s because Facebook was proactive. The other platforms were proactive, too, but Facebook was proactive. And you can go down the list of meaningful intervention moments. You can make the case that there aren’t enough of them, and I think that’s totally true, but I think the bottom line is that it’s significant.
The other thing they did that matters a lot — and I will give them credit for this the same way I would give some of the other platforms credit — is that we did not talk about deep fakes and manipulated media in 2020. And the reason is not because those things didn’t exist. It’s because the platforms, and Facebook was among them, with a lot of pressure — they didn’t do this voluntarily — put in place a policy around not just deep fakes but manipulated media: preventing it, detecting it, and enforcing against it. Just think about how bad October 2020 would have been if a lot of the deep fakes about, say, Hunter Biden had been widely proliferating. Because they existed. They existed on other platforms, but they couldn’t really get uplift and oxygen and takeoff and saturation because the platforms were actually proactive.
I think those are lessons learned. Some meaningful intervention moments, some broad-scale policy changes that prove that when they work, they work.
On the flip side, they still haven't learned that one of the biggest manipulation points in 2016 wasn't just Russian disinformation. It was the gaming of the platform itself by way of bots. Disinformation is cheating. It's not just misinformation, it's cheating. There's something inauthentic about it. And when you’re able to manipulate an algorithm to boost your content, that's disinformation. That's cheating. Bots were a big problem in 2020, and there's this perception that Facebook has been more aggressive about stamping out bots than they had been in the past. And the truth is, they haven't really been.
Earlier this year, Facebook announced that in 2020 they removed a billion bot accounts from their platform. A billion! That matters. Content gets affected by it, individuals and conversations get affected by it, advertisers waste money advertising to fake individuals. What I don’t think they’ve done well is implement front-end protections to more robustly slow down the utilization of not just bots but sock puppet accounts. They’re very happy to announce when they’ve done the cleanup after the fact, but by then the damage is already done. The more important thing that they haven’t learned yet is how to do that front-end work, which, again, would maybe slow their growth, but it’s going to make the platform a lot less susceptible to these kinds of attacks.