How toxic superusers determine what everyone else sees on Facebook

A small subset of hyperactive Facebook users, many of whom frequently spread misinformation or call for political violence, plays an outsized role in determining what content other users see, according to a new piece in The Atlantic.

The article is based on research conducted by Matthew Hindman, Nathaniel Lubin, and Trevor Davis. Their research has not yet been submitted for peer review.

Their study observed 52 million Facebook users. Within this group, the top three percent of users were responsible for 52 percent of likes, shares, reactions, and comments. Facebook uses an algorithm called MSI (Meaningful Social Interaction), which assigns posts “points” based on how much engagement they get, to curate users’ news feeds.
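A system like MSI can be sketched as a simple weighted sum: each interaction type contributes some number of points, and posts are ranked by their totals. The weights and post structure below are purely illustrative assumptions; Facebook's actual point values are not disclosed in the article.

```python
# Illustrative sketch of engagement-weighted feed ranking in the spirit of MSI.
# The interaction weights are invented for illustration, not Facebook's real values.
WEIGHTS = {"like": 1, "reaction": 1, "comment": 15, "share": 30}

def msi_score(interactions):
    """Sum weighted engagement counts for a single post."""
    return sum(WEIGHTS.get(kind, 0) * count for kind, count in interactions.items())

def rank_feed(posts):
    """Order candidate posts by descending engagement score."""
    return sorted(posts, key=lambda p: msi_score(p["interactions"]), reverse=True)

posts = [
    {"id": "a", "interactions": {"like": 100}},               # 100 points
    {"id": "b", "interactions": {"comment": 5, "share": 2}},  # 75 + 60 = 135 points
]
print([p["id"] for p in rank_feed(posts)])  # → ['b', 'a']
```

Under this kind of scoring, a small group generating most of the likes, shares, and comments mechanically dominates which posts rise to the top of everyone's feed.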

Because superusers account for the majority of interactions, they effectively determine what more casual users see.

The authors also studied a random sample of 30,000 users, which they then narrowed down to the most active one percent. Of these 300, 219 posted at least 25 public comments over a two-month period. Of those 219, 68 percent “spread misinformation, reposted in spammy ways, published comments that were racist or sexist or anti-Semitic or anti-gay, wished violence on their perceived enemies, or, in most cases, several of the above.”

“[S]o long as user engagement remains the most important ingredient in how Facebook recommends content,” the authors concluded, “it will continue to give its worst users the most influence.”

Hindman, Lubin, and Davis also accused Facebook of refusing to ban toxic superusers because doing so would negatively impact overall user engagement.

Facebook provided the authors with a statement disputing these conclusions, arguing that the authors’ research “seem[s] to fundamentally misunderstand how News Feed works. Ranking is optimized for what we predict each person wants to see, not what the most active users do.”

Read more at The Atlantic.
