By Joe Langabeer

Lately, I’ve been enjoying the podcast Tech Won’t Save Us, which is hosted by Paris Marx. Marx has been a critical technology journalist for the past nine years and has also authored the book, Road to Nowhere, which delves into Silicon Valley’s idealistic views about the transportation sector, highlighting how it often disregards the actual material reality of building transportation.

The podcast covers a range of interesting topics, including the risks associated with artificial intelligence, the right-wing politics of major e-commerce platforms like Shopify, and criticisms of Silicon Valley and its right-wing ideological influence on US policy.

In the past, Marx has strongly criticised the billionaire tech tycoon and owner of X, Elon Musk, who enabled far-right ideologies by re-tweeting misinformation during the recent UK riots. Marx argues that governments should be cautious about supporting Musk’s “vision” for the future, suggesting that the billionaire’s plans for supposedly “advancing humanity” are driven solely by his own self-interest.

Companies like Space-X are not self-sustaining

I have previously voiced my criticism of Musk as a figure seen as a promoter of human development and innovation, such as in this previous article in Left Horizons about his company, Space-X. Where Marx and I may have differed in the past is in how influential we judge Elon Musk to be in the public space, and whether his companies could ever outdo the established capitalist institutions that have long dominated the economy.

I still hold my opinion that, despite all the hyperbole that Musk brings to his narcissistic and incoherent rants on Twitter, or in interviews, he will never be able to truly dominate the economy, as so many of his projects rely upon government subsidies. Companies like Space-X and Tesla are ultimately not self-sustaining. Although Musk would never publicly acknowledge it, it is the government which has made him the second-richest man in the world, not his malfunctioning luxury electric cars or his flawed space rockets.

In a recent episode of his podcast entitled, Is Social Media Fueling Far-Right Riots?, Marx discussed with Hussein Kesvani the role of social media in the recent UK attacks by the far-right on asylum-seekers’ hotels. Kesvani is a journalist who has himself been critical of technology on his own podcast, Trashfuture.

At around the 20-minute mark in the interview, Marx asks Kesvani whether social media has predominantly fuelled hatred towards refugees and migrants, or whether other media outlets had been building this narrative of hatred before the events. Both Marx and Kesvani concur that while social media has played a crucial role in organising far-right riots (thanks to platforms like Telegram, which allow users to remain anonymous, and X, where agitators can openly express racist views), this phenomenon did not emerge in isolation.

Groundwork for bigotry and hate

To take a current example of how X fosters a right-wing agenda here in the UK, the BBC reports that the wife of a Tory councillor was convicted in a British court of stirring up racial hatred on X, calling, among other things, for hotels housing asylum-seekers to be “set on fire”. Yet, astonishingly, she was found not to have violated the ‘rules’ of the social media platform.

But the groundwork for such bigotry and hate has long been laid by right-wing mainstream media, which has been stoking such sentiments for decades. Notably, their focus on Islamophobia intensified in the aftermath of 9/11, driven by the hysteria surrounding “Muslim terror” and the subsequent invasion of Iraq.

This longstanding media narrative has created a fertile environment for far-right ideologies to flourish, both online and in the streets. Cast your minds back to 2015, when the Sun printed a blatantly racist headline (left), claiming that nearly 1 in 5 British Muslims had sympathy with the terrorist organisation ISIS.

Incidentally, the regulator Ipso suggested the headline was “misleading”, and The Sun conveniently took the article down, but not before millions of copies had been distributed and sold. This example demonstrates not only the blatant racism of the right-wing media in the UK, but also the failings of the regulatory bodies that are supposed to hold the media to account.

“Misleading” does not begin to describe that headline. It was a false, racist headline that should have seen The Sun shut down for its callous words. The Sun, like other red-top newspapers, has a history of reporting misinformation, such as the false information it published about the Hillsborough disaster. Regulatory media bodies have allowed media outlets to spread lies and misinformation for far too long, providing a basis for the growth of right-wing groups.

When it comes to the circulation of right-wing media propaganda on social platforms, it is frustrating that some on the left claim that traditional printed media is declining in influence, while independent online media is thriving. While it may be true that print circulation is declining, the same right-wing media outlets have successfully distributed and promoted their articles through social media advertising.

Advertisements by the tabloids on social media

The Sun website, for example, reached over 30 million hits a month and had nearly nine million daily readers in 2023. A significant portion of social media traffic comes from advertisements by The Sun and other tabloids, through posts that are promoted by the algorithms of the social media platforms.

Interestingly, the Sun stopped publishing its yearly print figures in 2020, possibly due to declining readership of the paper version. But it is still important to acknowledge that the right-wing media has firmly established itself as an important part of the social media landscape, thanks to promotion by platform algorithms.

An algorithm is simply a set of ‘rules’ written into a programme, to be followed in calculations and other problem-solving operations, and algorithms are common in IT and computer functioning in general. But social media companies also use algorithms, in their case to determine what content an individual user is shown.

If, for example, a user ‘engages’ with a particular Facebook post, or a linked article, or YouTube video, then the established algorithms will take into account how the user engaged with that content and then promote more posts or videos based on the same kind of content. If you ‘like’ a racist or homophobic post on Facebook, you will see a lot more to ‘like’, and so your views, if they are already skewed, are reinforced.
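As a rough illustration of this feedback loop (a simplified sketch, not any platform’s actual code, with invented topic labels), the logic can be expressed in a few lines of Python: every engagement strengthens a user’s profile for a topic, and the feed is then ranked by that same profile.

```python
from collections import Counter

def update_profile(profile: Counter, engaged_topics):
    """Record the topics of content the user 'liked', watched or clicked."""
    profile.update(engaged_topics)

def rank_feed(profile: Counter, candidate_posts):
    """Score each candidate post by the user's past engagement with its
    topics, then show the highest-scoring posts first."""
    def score(post):
        return sum(profile[topic] for topic in post["topics"])
    return sorted(candidate_posts, key=score, reverse=True)

# Hypothetical user who engages twice with political content, once with gaming
profile = Counter()
update_profile(profile, ["gaming"])
update_profile(profile, ["politics"])
update_profile(profile, ["politics"])

posts = [
    {"id": 1, "topics": ["gaming"]},
    {"id": 2, "topics": ["politics"]},
    {"id": 3, "topics": ["cooking"]},
]
feed = rank_feed(profile, posts)
# The 'politics' post now tops the feed: engagement is fed back into
# promotion of more of the same content, reinforcing existing views.
```

The point the toy example makes is structural: nothing in the ranking step asks whether the content is true or harmful, only whether it resembles what the user engaged with before.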

Threads, the Facebook-owned rival to X

If a user watches a YouTube video in which a person reviews a video game, the algorithm should suggest other similar videos for them to watch. However, the reality is often different. In my personal experience, I have been an avid YouTube video watcher since I was a teenager, and I watch content on left-wing politics, video games, and film reviews. If I watch a video from a left-wing YouTube channel, I am immediately recommended a video from a right-wing news channel. The same happens when I watch video game and film reviews. It feels as if the algorithm is attempting to push me towards more right-wing thinking.

The Institute for Strategic Dialogue, a research company based in London, recently conducted a study in which it created mock accounts focused on four topic areas on YouTube: gaming, male lifestyle gurus, mommy videos discussing lifestyle, and Spanish-language videos. All of the accounts were recommended right-wing and Christian videos, including content from far-right influencers like Jordan Peterson and the obnoxious misogynist, Andrew Tate.

If an algorithm is profitable it will continue to function

Both Peterson and Tate have been spreading conspiracy theories about the far-right riots in the UK. Professor Jessie Daniels, the head of the study, suggested that Google, the owner of YouTube, was more interested in continued engagement and making a profit by allowing extremist right-wing videos to remain, rather than taking the videos down or changing the algorithm to provide a balanced viewing experience on the platform.

According to Professor Daniels, if the algorithm is profitable, the company will continue to prioritise it, even if it means exposing children and young people to racist and far-right views.

The inability to confirm whether the algorithm promotes right-wing content is primarily due to tech companies not being prepared to release statistics or data on the issue. The creation, structure and monitoring of algorithms are kept well hidden from the public eye.

Ironically, the platform whose algorithm changes have been most openly discussed is Elon Musk’s X. When his own tweets were not getting the attention he wanted, he had the algorithm changed so that all users would see his tweets. In many ways, Musk embodies the very problem of social media and algorithms. While he was always a defender of capitalism, his shift towards the right suggests that the right-wing content promoted on platforms like X can radicalise even their owners.

In response to Musk’s public support of conspiracies and far-right figures like Stephen Yaxley-Lennon (AKA Tommy Robinson), some Labour MPs, journalists and other political figures are starting to leave X and are moving to platforms like Threads.

Threads is a platform similar to X, but run by Meta, which also owns Facebook. Meta’s boss, Mark Zuckerberg, previously said that he wouldn’t endorse a presidential candidate in the upcoming US election, but then he praised Trump after the failed assassination attempt. Trump raising his fist after he was shot was, Zuckerberg said, “one of the most badass things I’ve seen in my life.”

Facebook has also faced numerous issues with far-right and conspiracy content. I even had posts by Holocaust-denial groups recommended to me back in 2020, and this led me to leave Facebook temporarily. Zuckerberg originally said that he would allow such content on the platform, before reversing his decision after a backlash.

Whether it is X, Facebook, Threads or any other, it is completely misguided to imagine that any of these platforms, or the people who run them, are fundamentally any different from one another.

Like many on the left, I have grappled with whether or not to use social media. During the pandemic, I deleted my social media accounts for a period of time, only to eventually return, because they became the primary means of staying connected with friends and family.

However, I am aware of the negative aspects of these platforms, such as the spread of conspiracy theories and far-right ideologies, as well as their potential to cause feelings of isolation, body dysmorphia and mental health problems. Although there has been some preliminary research on this, the findings have been largely circumstantial. This topic has been discussed insightfully by Michael Hobbes and Peter Shamshiri in their podcast If Books Could Kill, to which I highly recommend subscribing.

Social media has the potential to be of great benefit

As social media has become ingrained in our daily lives and social interactions, the question arises, how do we change it? From the point of view of the technology involved, it is potentially an enormously beneficial and positive development, an integral part of human society and communication. But only if the platforms are taken out of the hands of profiteers and sharks and they become public enterprises – working to provide a service, not a profit.

In such a scenario, all of the algorithms could be made public rather than be hidden from view. In so far as ‘regulation’ is necessary, this could be done by a democratically constituted body, assisted by IT experts, and with real powers, to manage and monitor the use of algorithms. Such a regulatory authority is one that users would actually trust.

In this situation, all political parties, campaign groups, charities, and ‘influencers’ in fashion, news, ideas, science, art and literature would be able to promote content. In a situation where social media platforms were publicly-owned and not slaves to profits, it would be the users who ultimately ‘moderated’ content, rather than some multi-billionaire who has a personal axe to grind and a personal agenda he wants us all to follow.

[All pictures included, except Sun headline, are from Wikimedia Commons, eg here]
