In a significant move to comply with Australia's new social media regulations, Meta has removed access to Facebook and Instagram for over half a million children under the age of 16. The action comes as part of the government's effort to enhance online safety for young users.
Meta's Account Removals
During the week of December 4 to December 11, Meta reported removing 330,000 Instagram accounts, 173,000 Facebook accounts, and 39,000 Threads accounts belonging to users under the age limit. The company began the process a week before the ban's official enforcement on December 10, taking a proactive approach to complying with the new law.
Government Statistics and Concerns
Prime Minister Anthony Albanese's government is expected to release official statistics this week detailing the number of young users removed from the platforms covered by the legislation. Meta, however, has expressed concerns about the ban's effectiveness, arguing that it will not achieve its intended goals of improving safety and well-being for young people.
Potential Risks for Young Users
The tech giant argues that the removal of these accounts may inadvertently isolate vulnerable teenagers from supportive online communities. Furthermore, Meta cautions that these young users might migrate to less regulated apps, potentially exposing them to greater risks without the safety measures that platforms like Facebook and Instagram provide.
Separately, a report has emerged detailing a significant data breach affecting approximately 175 million Instagram users.