Earlier this year, Meta Platforms Inc. quietly convened a war room of staffers to address a critical problem: virtually all of Facebook's top-ranked content was spammy, oversexualized or generally what the company classified as regrettable.
Meta, Facebook's parent company, had historically been reluctant to judge what goes viral on its platform, trusting its recommendation systems and users to surface the best content.
But the company's executives and researchers were growing embarrassed that its widely viewed content report, a quarterly survey of the posts with the broadest reach, was consistently dominated by stolen memes, engagement bait and link spam for sketchy online shops, according to documents viewed by The Wall Street Journal and people familiar with the issue.
Amid plans to shift even more of its users' newsfeeds toward Facebook-recommended video content, Meta feared that it risked alienating users and advertisers, according to those documents and people.
Over several months, members of Meta's product, user-experience and integrity teams hammered out better definitions for low-quality content and agreed on ways the company could avoid amplifying it, according to the documents and people.
The work produced measurable results.
Facebook's third-quarter Widely Viewed Content Report, released on Tuesday, shows that only one of the top 20 posts qualified as engagement bait, down from all 20 a year earlier.
For the first time since the company began producing the report, none of the top 20 posts violated platform rules.
The content that did receive top billing on the platform was a mixture of celebrity news, meme pages and Reels videos.
Selections include a video from Thailand of people giving CPR to an elephant, a page devoted to feel-good quotations about surviving domestic violence and a Reel in which a delivery man befriends a skittish dog.
Among the most risqué offerings was a story that originated not on social media but in the New York Post, titled "Woman with world's 'most tattooed privates' hits out at haters."
"We're cautiously optimistic about the progress we've made as we work to improve the quality of content within Facebook," said Anna Stepanov, head of Facebook Integrity, in an announcement of the report.
Ms. Stepanov called Facebook's public release of the information an effort "to hold ourselves accountable."
Facebook's struggles with this issue highlight the complexities of content moderation across social media, including at Elon Musk's Twitter, where he has said eliminating spam is one of his top priorities even as he wants the platform to be a bastion for free speech.
Facebook's progress on low-quality content represents an upbeat turn for the quarterly report, which the company began producing in 2021 to counter the narrative that it was amplifying divisive posts that regularly dominated daily engagement tallies.
What it showed instead was that among the billions of posts made to the platform, the ones getting the most views -- meaning they were actively promoted by the social network's recommendation systems -- were generally trashy. Many tended to be plagiarized, promoting seemingly random links and coming from anonymous pages.
In the third quarter of last year, 70% of the top 20 most-viewed posts met the company's existing definition of "regrettable," according to that quarter's report.
A later analysis determined that the remaining 30% was also low-quality engagement bait. Many of these were posts that gamed the newsfeed algorithm by asking for a response, such as a meme labeled "Post the four words every girl wants whispered in her ear."
Surveys showed that users didn't like their feeds to be dominated by such posts, and the company was concerned that the problem was about to get worse.
The company was preparing to shift users' newsfeeds toward Reels, a short-form video format that relied more heavily on Facebook's recommendations for distribution.
While Reels were less prone to engagement bait, the videos Facebook was amplifying most heavily exhibited other problems: bootlegged media, "aggression" and oversexualization.
"They were pretty low quality in ways that weren't captured by the 'regrettable' definition," an analysis found.
Most of the top videos, which were from sources that Facebook users hadn't chosen to follow, were of people physically or verbally fighting.
In a statement to the Journal, Ms. Stepanov said that the widespread adoption of Reels had required the company to focus on content-quality measures specific to short-form video.
"We remain committed to this work and hope to continue making progress in this area," she said.
As part of its efforts in the new "Content Quality War Room," the company sought to better identify what made users feel a post was trashy.
The effort homed in on finding ways to measure "un-aesthetic attributes, unoriginality, low integrity, and 'low-calorie' content," as one director later wrote.
Facebook also built a separate system specifically for identifying low-quality content in Reels, which included not just rule violations such as hate speech, nudity and sexual solicitation but whether a video has bad lighting, is gross or is a waste of time.
"Reels quality was not great when it launched but now it's not bad," an internal analysis of the most recent content report states.
In a public summary of the report released Tuesday, Facebook said the report reflected better detection of spammy content, the adoption of spacing rules that limit how frequently suspected engagement bait can appear in user newsfeeds and efforts to crack down on entities "who pretend to be in one country while actually being in another one."
How far the quality push goes remains unclear at a time of reduced staffing and leaner budgets at Meta, which recently laid off 13% of its employees.
Observers said the improvement in the quarterly report was notable.
"They should be proud of this," said Jeff Allen, a former data scientist at Facebook who now works at the Integrity Institute, a nonprofit that advises policy makers and companies on social-media platform design and governance.
While the top content on the platform accounts for just a fraction of a percent of what users consume, Mr. Allen noted, it shows what tactics and types of posts are rewarded by its current newsfeed algorithm.
The group has previously criticized the type of content Facebook was most heavily promoting, noting in one report that 10% of the top posts had been removed for violating platform rules.
"When the number one most viewed Page on the platform is removed for violations, that should be a genuine 5-alarm fire," the group noted in a previous report, calling it "a genuine failure to build ranking and distribution systems that are in alignment with the company's mission and values."
Documents reviewed by the Journal suggest that Facebook has more work to do.
An internal analysis of low-quality content this summer found that, while the quality of the top 100 posts had improved significantly, the company had made less headway in the top 500 and top 1,000.
The Widely Viewed Content Report also doesn't include most of the content in Facebook Groups, which have long been a source of trouble for the company's integrity team.
The company's researchers recently identified "Low Quality Content Producers" as being the most prolific posters and commenters in groups, documents show.