No one has embraced “new year, new me” as boldly as Meta. Within the first weeks of 2025, Meta’s CEO Mark Zuckerberg announced the company would remove fact-checking in the U.S., scrap its diversity, equity, and inclusion (DEI) policies, and change its Hateful Conduct Policy, which now allows posts calling LGBTQ people mentally ill.
Western Sydney University’s (WSU) Associate Professor Tanya Notley – whose work focuses on digital inclusion and media literacy – says these changes are a “strong signal” from Meta.
“It’s clear that Meta doesn’t accept that technology must be designed with different needs and cultures in mind,” says Notley, who leads the Advancing Media Literacy research program at WSU.
Marginalised groups, such as disabled people, people of colour, and First Nations people, are already more likely to experience hate speech and targeted attacks online. The removal of these protections will only exacerbate existing inequities.
When asked to comment on Meta’s changes, Prime Minister Anthony Albanese stated in a TV interview on ABC News Breakfast:
“The social media ban is an Australian policy in the interests of young Australians, Australian families….This is a sensible reform that has passed the parliament and is now Australian law.”
In December 2024, the Albanese government announced a social media ban for users under 16, set to take effect after a trial period of 12 months.
While the ban is framed as a protective measure for young people, Notley warns that such restrictions can lead to young people “taking things underground and preventing people from talking” about what they see online.
As a woman of colour who has struggled with mental health, I’ve experienced the duality of social media firsthand. Platforms like Instagram were lifelines, offering access to communities that validated my experiences and helped me navigate challenges.
At the same time, they could become toxic spaces filled with harmful content that exacerbated my struggles. But what would have made this a more positive experience was regulation of the content, not a ban, which would have taken away my only avenue for support at the time.
I can’t help but think that Meta’s decision to deregulate, and the Albanese government’s decision to ban social media, are not all that different when they both limit the ability of marginalised voices to use social media safely.
Whose voices are lost?
We lose voices like Anjali Sharma, a 20-year-old Indian Australian climate activist who last week wrote an open letter to the government, signed by several prominent Australians, urging it to take climate action seriously.
She says that social media meant “I was able to educate myself and find resources” and was “pivotal” to her journey as a climate activist.
Sharma joined the climate movement through social media at the age of 14. By 16, she was suing the Federal Environment Minister for failing to protect young people from the future harms of climate change.
At 19, she launched the Duty of Care Campaign, and working with Senator David Pocock, she introduced a bill to Parliament, hoping to legislate a duty of care.
Sharma says the campaign’s philosophy is “empowering young people to be part of politics.”
“In 2023, we put a call out to young people for a lobbying day in parliament, and social media was how we got 26 young people who were 15 and 16-year-olds to meet with over 50 members of parliament.
“While it’s all good for politicians to speak [on climate change], we need young people’s perspective as they are on the frontlines.”
Sharma says that minority voices are the most likely to be targeted by Meta’s removal of fact-checking, and the social media ban in Australia.
It is about more than political engagement; social media is a tool of connectivity. Sharma moved to Australia at age eight, and she says she kept in touch with her cousins through her teen years via “Instagram updates”.
“I felt like I couldn’t call my family on my own, as I do now, so Instagram was how I stayed in touch.”
As a second-generation immigrant myself, and someone who moved countries twice and primary schools five times, I couldn’t agree more with Sharma.
Social media allows so many of us who have family overseas or are minorities in our suburbs to stay connected and have community no matter where we go.
The ban could leave young people isolated and, when they eventually do start using social media, confronted with dangerous misinformation and no regulation to keep them safe.
But if the removal of fact-checkers applies only in the U.S., why does it matter to Australia?
Hannah Ferguson, founder of Cheek Media, an independent women-owned media company, and author of Taboo, fears “we will see a more immediate international transformation for Meta that mimics the pipeline Twitter entered as it became X.”
Ferguson highlights the hypocrisy of countries like the U.S., which ignore the ways Meta and X “weaponise extremism for commercial gain” while positioning foreign-owned platforms like TikTok as security threats.
Sharma says her use of X (formerly Twitter) decreased significantly after X removed fact-checking in 2022.
“What was once a space for political debate is now a cesspit of toxicity.”
She also says engagement has dropped: shares on her tweets, once in the hundreds, have now fallen to around 50. Even on Facebook, Sharma says she’s noticed an increase in racist content.
“It’s a report-and-rinse cycle,” Sharma says. “You report racist posts, and they come back with, ‘This doesn’t violate community guidelines’.”
Ferguson notes, “Hate speech is not free speech. We need regulation of extremist content to make social media safer for everyone.”
But how do we make social media safer?
Notley compares social media to “any commercial environment, like a shopping centre or nightclub. It needs rules and security.”
“It is unacceptable to say people can do and say whatever they want.”
Free speech matters to the public, but efforts to silence or deprioritise marginalised voices are themselves a form of censorship.
Ferguson says a safe space means combating hate speech, empowering marginalised voices, and holding platforms accountable for their role in spreading misinformation. But safety isn’t just about platform policies; it’s also about individual behaviour.
In 2024, Ferguson moved to Substack, which she says was partly driven “by the instability of these platforms and the unpredictability of their algorithms when it comes to political content.”
Sharma also said that she now gets her news through sources on different independent platforms that she can trust, and that ultimately align with her values.
But how do we know if a source is trustworthy?
Notley and Ferguson both advise individuals to ask themselves a few key questions.
“Ask yourself: Why am I here? What value does this platform provide? Am I using it for education and connection, or am I caught in cycles of outrage and negativity?” Ferguson says.
Notley says that “lateral reading”, reading across multiple sources rather than staying within one, is essential before we accept something as fact. “Hate speech and misinformation thrive on emotional reactions and division. By interrogating the sources and motivations behind content, we can resist being manipulated.”
In January 2024, it was reported that approximately 3 in 4 Australians use social media. Knowing this, we need to accept that social media is a significant part of our lives. It is not about banning it or removing barriers in the name of “free speech.” It is about regulating content and educating young people on media literacy.
Top photo source: Canva