Facebook Has Secret Ranking To Determine How Risky You May Be
As part of its overarching effort to stem the proliferation of fake news on the platform, Facebook executives have confirmed the company uses a clandestine algorithm to help assess the reliability of users.
Facebook sources say the new tool is designed to reduce the volume of false reports and misinformation that other users post.
In an interview with The Washington Post, Facebook project manager Tessa Lyons explained the basic concept of the system, which rates users’ reputation on a scale of 0 to 1.
The algorithms used to produce this score are under wraps and have been in development for about a year, she said.
Lyons, who leads the effort to prevent the spread of false information on the social media network, said the company needed an approach other than reports received from other users.
She said it was “not uncommon for people to tell us something is false simply because they disagree with the premise of a story or they’re intentionally trying to target a particular publisher,” adding that the reputation score is meant to provide a broader overview of a user’s perceived trustworthiness.
According to Lyons, this new tool is one of a multitude Facebook uses to create what it believes is an overall assessment of the relative risk posed by individual accounts.
In an interview with the U.K. Sun, she explained that the algorithm tracks users’ behaviors on the site, specifically “how people interact with articles” shared on the platform.
“For example, if someone previously gave us feedback that an article was false and the article was confirmed false by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true,” Lyons said.
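Facebook has not disclosed how this weighting actually works, but as a purely illustrative sketch of the idea Lyons describes — counting a report for more when the reporter's past flags matched fact-checker verdicts — the logic might resemble the following. All names, functions and numbers here are hypothetical and are not drawn from Facebook's system.

```python
# Purely illustrative sketch -- Facebook's actual algorithm is not public.
# Idea: weight a user's "this is false" report by how often that user's
# past flags were confirmed by fact-checkers, so indiscriminate flaggers
# count for less.

from dataclasses import dataclass

@dataclass
class Reporter:
    flags_submitted: int   # articles the user has flagged as false
    flags_confirmed: int   # of those, how many fact-checkers rated false

def report_weight(r: Reporter, prior: float = 0.5, prior_strength: int = 5) -> float:
    """Hypothetical trust score in [0, 1]: a smoothed confirmation rate,
    so new users start near the prior and indiscriminate flaggers sink."""
    return (r.flags_confirmed + prior * prior_strength) / (r.flags_submitted + prior_strength)

def weighted_flag_total(reports: list[Reporter]) -> float:
    """Sum of report weights for one article; a higher total would push the
    article up the fact-checking queue."""
    return sum(report_weight(r) for r in reports)

# Example: a careful reporter vs. one who flags nearly everything.
careful = Reporter(flags_submitted=10, flags_confirmed=9)
indiscriminate = Reporter(flags_submitted=100, flags_confirmed=10)
print(report_weight(careful))         # ~0.77
print(report_weight(indiscriminate))  # ~0.12
```

The point of such a scheme is the one Lyons makes: the score is not a verdict on the person, only a way to decide how much a given flag should influence what fact-checkers review first.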
Rather than providing a precise portrait of individual profiles, Lyons said, the reputation score is meant to supplement other methods already in use or in development to fight fake news.
One Facebook source told the Sun that the Post got its central claim about the program wrong.
“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong and the headline in The Washington Post is misleading,” the spokesperson said. “What we’re actually doing: we developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system.”
Lyons made it clear that the rating “isn’t meant to be an absolute indicator of a person’s credibility.”
While the news opened Facebook up to new criticism, one Harvard researcher said the company is in a catch-22 situation in attempting to assess the risk of its users.
Claire Wardle said the secretive nature of this rating “makes us uncomfortable” but is necessary.
She said, “the irony is that they can’t tell us how they are judging us — because if they do, the algorithms that they built will be gamed.”