Meta ramping up efforts to remove posts containing violence, misinformation about attack on Israel

Meta announced plans Friday to step up enforcement of its policies against violence and misinformation as the war in Israel, sparked by Hamas terrorists' largest attack on the country in decades, continues.

The company, which owns Facebook and Instagram, established a "special operations center" staffed with experts, including fluent Hebrew and Arabic speakers, to monitor the social media platforms and more quickly remove content that violates Meta's policies.

In the first three days of the war in Israel, Meta removed or flagged more than 795,000 posts in Hebrew and Arabic for violating its policies on dangerous organizations and individuals, violent and graphic content and hate speech, the company said.

More than 2,800 people have been killed in the war in Israel thus far, including at least 1,300 Israeli civilians and soldiers and 27 Americans. Thousands more were wounded in the violence, and many others have been raped, tortured and murdered or taken hostage by Hamas.

An Israeli woman holding a flag reacts to posters of Israelis kidnapped by Hamas on Oct. 15, 2023, in Tel Aviv, Israel. (Photo by Ilia Yefimovich/picture alliance via Getty Images)

Meta said Hamas is banned from Facebook and Instagram under its dangerous organizations and individuals policy.

"We want to reiterate that our policies are designed to give everyone a voice while keeping people safe on our apps," the company said in a statement. "We apply these policies regardless of who is posting or their personal beliefs, and it is never our intention to suppress a particular community or point of view."

"Given the higher volumes of content being reported to us, we know content that doesn't violate our policies may be removed in error," the statement continued. "To mitigate this, for some violations we are temporarily removing content without strikes, meaning these content removals won’t cause accounts to be disabled. We also continue to provide tools for users to appeal our decisions if they think we made a mistake."

The tech company also said it is working with AFP, Reuters and Fatabyyano to fact-check posts and move content with false claims lower in users' feeds.

The initiative comes after Meta CEO Mark Zuckerberg was asked in a letter from the European Union to be "very vigilant" about removing illegal content and disinformation.

Thierry Breton, the EU's commissioner for the internal market, said Meta has a duty under the bloc's new online regulations, known as the Digital Services Act, to take "timely, diligent and objective action" after the tech giant was informed of illegal content on its platforms.

Breton sent a more strongly worded letter Tuesday to X, warning about the spread of "illegal content" and disinformation on the platform. The EU announced Thursday that it would investigate X over how it has addressed "terrorist and violent content and hate speech" about the war in Israel.

Concerns about posts on X are greater than for other platforms, given X owner Elon Musk's changes to content moderation since he took over the platform formerly known as Twitter for $44 billion last year.

Musk has scaled back the removal of content and users from the platform and reinstated banned accounts as part of his claimed commitment to free speech on the social media site. X has also changed its verification system, which now allows anyone who pays for the platform's subscription service to be verified.

X CEO Linda Yaccarino responded to Breton's letter before the EU announced its investigation. She said the platform has removed hundreds of accounts connected to Hamas and removed or placed Community Notes labels, which are approved fact-checks from X users, on tens of thousands of posts.

Yaccarino also said the platform "redistributed resources and refocused internal teams" and is "proportionately and effectively assessing and addressing identified fake and manipulated content during this constantly evolving and shifting crisis."

"There is no place on X for terrorist organizations or violent extremist groups, and we continue to remove such accounts in real time, including proactive efforts," she said.
