WASHINGTON (AP) – Facebook owner Meta is quietly easing some security measures designed to thwart voting misinformation or foreign interference in US elections as November midterm voting nears.

It’s a sharp departure from the social media giant’s multibillion-dollar effort to improve the accuracy of posts about US elections and to regain the trust of lawmakers and the public after revelations that the company had exploited people’s data and allowed falsehoods to overrun its site during the 2016 campaign.

The shift is raising alarm about Meta’s priorities and about how some could exploit the world’s most popular social media platforms to spread misleading claims, launch fake accounts and stoke partisan extremism.

“They’re not talking about it,” said former Facebook policy director Katie Harbath, who is now CEO of the tech and policy firm Anchor Change. “Best case scenario: They’re still doing a lot behind the scenes. Worst case scenario: They pull back, and we don’t know how that’s going to manifest itself on the platforms for the midterms.”

Since last year, Meta has shut down an examination into how falsehoods are amplified in political ads on Facebook by indefinitely banning the researchers conducting it from the site.

CrowdTangle, the online tool that the company offered to hundreds of newsrooms and researchers so they could identify trending posts and misinformation across Facebook or Instagram, is now inoperable on some days.

Public communication about the company’s response to election misinformation has gone decidedly quiet. Between 2018 and 2020, the company issued more than 30 statements detailing how it would stifle US election misinformation, block foreign adversaries from running ads or posts around voting, and subdue divisive hate speech.

Top executives hosted question-and-answer sessions with journalists about the new policies. CEO Mark Zuckerberg wrote Facebook posts promising to take down false voting information and authored opinion articles calling for more regulation to tackle foreign interference in US elections via social media.

But this year Meta has released only a one-page document outlining its plans for the election, even though potential threats to the vote are clear. Several Republican candidates are pushing false claims about the US election on social media. In addition, Russia and China continue to wage aggressive social media campaigns aimed at deepening political divisions among American audiences.

Meta says that elections remain a priority and that the policies developed in recent years around election misinformation and foreign interference are now hard-wired into the company’s operations.

“With every election, we incorporate what we’ve learned into new processes and set up channels to share information with the government and our industry partners,” said Meta spokesman Tom Reynolds.

He declined to say how many employees would be working full time to safeguard US elections this year.

During the 2018 election cycle, the company offered tours and photos of its election response war room and provided head counts. But as The New York Times reported, the number of Meta employees working on this year’s election has been cut from 300 to 60, a figure Meta disputes.

Reynolds said Meta will pull in hundreds of employees from 40 other teams across the company to monitor the upcoming vote, alongside an unspecified number of workers on its elections team.

The company is continuing a number of initiatives to limit election misinformation, such as a fact-checking program launched in 2016 that enlists the help of news outlets to verify the veracity of popular falsehoods spreading on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.

This month, Meta also rolled out a new feature for political ads that allows the public to find details about how advertisers target people on Facebook and Instagram based on their interests.

Still, Meta has stifled other efforts to identify election misinformation on its sites.

It has stopped making improvements to CrowdTangle, a website it offered to newsrooms around the world that provides insights about trending social media posts. Journalists, fact-checkers and researchers used the website to analyze Facebook content, including tracing popular misinformation and the accounts responsible for it.
