Facebook, Instagram Removed 2.7 Crore Posts in India During July: All Details

Akshay, September 1, 2022 (updated October 26, 2023)

Facebook and Instagram, both owned by Meta, took action on a total of 27 million posts in India in July, according to the company’s monthly compliance report. Operating as an intermediary under the IT Rules, Meta removed 17.3 million spam posts and 2.3 million posts featuring violent or graphic content on Facebook. The action aligns with the government’s regulations for social media platforms, and Meta emphasized its commitment to compliance.

According to the monthly report filed under the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, Meta removed 25 million posts on Facebook and 2 million posts on Instagram in July. The company routinely publishes these figures as part of its content moderation disclosures.

The report states that Meta took down 17.3 million instances of spam on Facebook, with a proactive detection rate of 99.6 percent. It also removed 110,000 posts containing hate speech, 2.3 million posts with violent and graphic content, and 2.7 million posts with nudity and sexual content; proactive detection rates for these categories were 99.9 percent, 99.4 percent, and 96 percent, respectively.

On Instagram, Meta removed over 900,000 posts related to suicide and self-injury, 22,000 instances of hate speech, and 370,000 posts with nudity and sexual content, with proactive detection rates of 99.5 percent, 77.4 percent, and 96 percent, respectively.

Under the IT Rules, Meta’s social media platforms must address user complaints through a grievance redressal mechanism. The company said it received 626 reports from users on Facebook and 1,033 on Instagram, and that it responded to all of them. On Facebook, Meta resolved 603 reports by providing the appropriate tools and took action on 9 of the remaining 23 complaints based on its policies. On Instagram, it resolved 945 cases with the necessary tools and acted on 35 of the remaining 88 reports.

Meta said it will publish subsequent editions of the report with a lag of 30 to 45 days after each reporting period, emphasizing transparency in its ongoing efforts to maintain a safe and compliant online environment.