The free speech Xperiment

In January 2025, the European Union demanded to see internal documentation about X’s “recommender system”, which makes content suggestions to users. The request was part of an investigation into whether Elon Musk’s X had breached the EU’s Digital Services Act (DSA), which governs content moderation and information manipulation.

There’s a general understanding that digital platforms wield immense power over public discourse, and there are growing concerns about whether X’s current content moderation has tipped the scales in favor of certain ideologies online.

Accusations of platform bias, particularly toward specific political groups, keep surfacing and warrant serious examination. But here’s an interesting paradox: X’s policies may also have amplified various progressive movements, such as those advocating for Palestine and Ukraine. Despite calls by many digital rights campaigners to leave X for other platforms such as Bluesky, Musk’s platform continues to play an important role for many – not just the far right and the political extremes. Content moderation is complex, and its selective enforcement raises more questions than it answers.

Since Elon Musk took over X in 2022, the platform has aggressively rebranded itself as a haven for free speech. But free speech, as we’ve seen, often comes with consequences. The decision to scale back content moderation, reinstate previously banned accounts, and reduce reliance on traditional fact-checking mechanisms has led to an explosion of misinformation and extremist voices. And there seems to be plenty of evidence that suggests X has become a breeding ground for far-right narratives to thrive.

But state actors are also guilty of pushing narratives to influence public opinion and behavior. During the peak of the COVID-19 pandemic in the Philippines, a covert campaign disseminating misinformation about Chinese vaccines was carried out on X. This operation, linked to approximately 150 accounts and attributed to the Pentagon, was exposed by a Reuters investigation in June 2024. Musk and his engineers may argue that the information operation predates their tenure. Nevertheless, it demonstrates that the platform can be manipulated or gamed to foreground certain types of content over others.

The role of X in our information ecosystem cannot be analysed in black-and-white terms. After the October 7 attacks in Israel, X removed hundreds of accounts it claimed were linked to Hamas, but it also censored Palestinian journalists and activists. Yet hashtags like #FreePalestine and #CeasefireNOW remain popular and attract audiences at a time when traditional mainstream media seems to be limiting airtime for pro-Palestinian voices. Similarly, Ukrainian digital activism has traction on X, with viral threads and grassroots fundraising efforts reaching large audiences.

Nevertheless, a growing body of evidence suggests that the algorithms that push content on X and other social media platforms can favour one political ideology over another. While X claims to be a space for free speech, it clearly isn’t: its algorithms shape what we see and read. Whether it’s Musk manipulating his platform directly, sharing hateful posts with his audience of more than 200 million, or governments, corporations, and other influencers gaming X to their advantage, there is a clear problem with the ways in which information – dangerous or not – travels through the platform to audiences. So algorithmic transparency matters – especially when the platform claims there’s no manipulation.

The EU’s demand for algorithmic transparency is a step in the right direction. A strong force calling out political bias on platforms is critical – especially when a vast majority of the population gets its news from social media – and may push platforms to be more accountable. But we also need to acknowledge that what we see on platforms is determined by a variety of forces. The flow of information is in the hands of anyone who has the power to manipulate and game the system.

The writing on this page, or linked from this page does not necessarily represent the views of or research by Common Edge.