
Facebook’s parent company Meta appears more interested in avoiding provoking VIPs than in balancing the thorny issues of free speech and safety, the oversight board said.
Josh Edelson/AFP via Getty Images
Facebook and Instagram’s program that gives special treatment to celebrities, politicians and other high-profile users does less to protect users’ freedom of expression than it does to serve the business interests of parent company Meta, according to Meta’s oversight board.
“Meta’s cross-check program prioritizes influential and powerful users of commercial value to Meta and, as structured, does not meet Meta’s human rights responsibilities and company values, with serious consequences for society,” said Thomas Hughes, director of the Oversight Board.
The board said Meta appears more concerned with avoiding provoking VIPs and dodging accusations of censorship than with balancing the difficult issues of free speech and safety.
Tuesday’s report called for an overhaul of the “flawed” program and included broad recommendations to bring the program in line with international principles and Meta’s own stated values.
In a statement, Meta said it would review the board’s non-binding recommendations and respond within 90 days.
The report is the product of a review of “cross-check” by Meta’s Oversight Board, a group of legal, human rights and journalism experts from around the world convened by the company and funded through an independent trust. The board took up the program after its existence was revealed in The Wall Street Journal, based on documents provided by a whistleblower.
Under the program, Meta maintains a list of users, including politicians, celebrities and business partners such as advertisers, health organizations and news outlets. Posts from these users that are suspected of violating company policies against violence, hate speech, misinformation, nudity and other subjects are routed for additional review. In some cases, their posts are exempted from Meta’s rules entirely.
Documents provided to The Wall Street Journal revealed that former President Donald Trump and his son Donald Trump Jr., Democratic Sen. Elizabeth Warren of Massachusetts, conservative activist Candace Owens, Brazilian soccer star Neymar and even Meta CEO Mark Zuckerberg have been on the VIP list.
The board finds cross-check was meant to protect the vulnerable, but benefits the powerful
Meta says the program is meant to address a thorny problem.
Facebook and Instagram users create billions of posts every day, which means the company relies on a combination of human reviewers and automated systems to find and remove content that violates its rules. At that scale, it is inevitable that some posts will be wrongly flagged as violating the policies, mistakes the company calls “false positives.”
Cross-check aims to reduce those false positives in the cases where the risk of error and the potential harm are greatest.
“The cross-check system was built to prevent potential over-enforcement mistakes and to double-check cases where, for example, a decision could require more understanding or there could be a higher risk of a mistake,” Meta vice president Nick Clegg has said. As examples, he cited activists raising awareness of violence, journalists reporting from conflict zones, and high-profile pages and profiles with many followers.
But the board said in its report that “while Meta characterizes cross-check as a program to protect vulnerable and important voices, it appears more directly structured to satisfy business concerns.”
For example, based on its investigation, the board found that a primary consideration in how Meta reviews posts from cross-checked users is avoiding provoking people who might complain to senior executives or speak out publicly against Meta and stir up controversy.
“Linking cross-check’s top priorities to concerns about managing business relationships suggests that the consequences Meta wants to avoid are primarily business-related, not human rights-related,” the report said.
The company also appears to prioritize avoiding the perception of censorship over its other human rights responsibilities, the board said.
Despite Meta’s insistence that everyone must follow the same rules, the board concluded that the cross-check program in effect treats users unequally.
“Cross-check protects certain users more than others,” the report said.
The program also over-represents users and content from the United States and Canada, even though the majority of Facebook’s nearly 3 billion monthly users live elsewhere.
“Cross-check’s design means users in lucrative markets, where there is a higher risk of bad publicity for Meta, enjoy greater protection for their content and expression than users elsewhere,” the board said.
That inequality is exacerbated by a lack of transparency about who is in the program. The board said Meta would not share which profiles, pages and accounts (what the company calls “entities”) are covered by cross-check, citing legal obligations to protect user privacy.
Because the board does not know exactly how the program is implemented and who benefits from it, the report said, it cannot fully assess how well the company is meeting its human rights responsibilities under the program, or the profile of the entities receiving enhanced review.
The board noted that over the past year, Meta has expanded cross-check to include reviews of content flagged as being at high risk of mistaken removal, even when the person posting it is not on a cross-check list.
But because the company’s capacity to review content is limited, many of those posts do not get the same level of additional review as posts from the program’s high-profile users.
Soccer star Neymar’s rule-breaking post racked up 56 million views before it was taken down
The report also faulted Meta’s practice of leaving potentially rule-breaking posts from high-profile users visible while they await review. On average, a cross-check review takes five or more days. The longest case Meta shared with the board took seven months.
“This means that, because of cross-check, content identified as violating Meta’s rules is left up on Facebook and Instagram when it is most viral and could cause harm,” the board wrote.
One infamous example cited by the board was a video Neymar posted in 2019 that included images of a woman who had accused him of rape. The video racked up 56 million views on Facebook and Instagram before it was taken down, The Wall Street Journal reported.
The board criticized Meta not only for devoting inadequate resources to reviewing posts flagged under cross-check, but also for failing to urgently and appropriately review potential “high severity” violations.
“In the Neymar case, it is difficult to understand why non-consensual intimate imagery posted on an account with more than 100 million followers would not have risen to the front of the queue for rapid, high-level review if any prioritization system had been in place,” the report said. “Given the serious nature of the policy violation and the impact on the victim, this case highlights the need for Meta to take a different approach to content awaiting review and to shorten its review timelines.”
The board also found that Meta did not apply its usual penalties to Neymar after the incident. The company ultimately acknowledged that the only consequence was the removal of the content, even though its standard penalty would have been to disable the account, the report said.
The board wants more transparency
Finally, the board said Meta does not track metrics on whether cross-check’s additional layers of review actually lead to more accurate decisions about whether to leave up or take down posts than the company’s normal enforcement process does. In other words, the report said, Meta has not examined whether its final decisions turn out to be the right ones.
The board issued 32 recommendations for improving the program to address the flaws it identified and to live up to Meta’s stated commitment to protecting human rights.
They include publishing the criteria for inclusion in the program, allowing users to apply for inclusion, and publicly labeling the accounts of certain participants, such as government officials, political candidates and business partners.
The board also said the company should remove accounts that repeatedly break its rules from cross-check and devote more resources to reviewing content.
“Meta has a responsibility to address the challenge of content moderation in a way that benefits all users, not just some,” the report said.
Meta said it has already improved cross-check over the past year, including by developing “standardized” principles and governance criteria, limiting the number of employees who can add users to the program, and creating a process for reviewing and removing users.
Copyright 2022 NPR. For more information, please visit https://www.npr.org.