YouTube algorithms exposing young men to the ‘manosphere’: Reset Australia

Lobby group Reset Australia and the Institute for Strategic Dialogue have published research indicating that YouTube’s algorithm appears to be actively serving young men with videos containing hateful and misogynistic attitudes toward women that have the potential to lead young men into the ‘manosphere’.

According to Reset Australia, the term 'manosphere' describes "a loose collection of movements" such as 'incels', Men Going Their Own Way (MGTOW) and men's rights activists (MRA), that are "marked by their overt and extreme misogyny". These groups are primarily situated online and champion an explicit rejection of feminism, with terminology and tactics that often overlap with those of other far-right extremist groups.

Screenshot of side-bar recommendations to the blank account for ‘Sigma male’ content in the Reset Australia report.

The report’s methodology involved short-term qualitative research analysing the algorithmic recommendations and trajectories provided to 10 experimental YouTube accounts. These accounts included:


  • Four boys under 18 who followed content at different points along the ideological spectrum, from more mainstream to extreme sources and influences.
  • Four young men over 18 who followed content at different points along the ideological spectrum, from more mainstream to extreme sources and influences.
  • Two blank control accounts that did not deliberately seek out or engage with any particular content, but instead followed the videos offered by YouTube’s recommendations.

The report found that YouTube's algorithms, and the new YouTube Shorts feature in particular, served unprompted anti-women content to every account, that this content became more extreme over time, and that in some cases the platform recommended incel, neo-Nazi and white supremacist content.

In addition, Reset Australia commissioned YouGov polling of 500 Australians aged 16 to 17 years old. It found that 46% of respondents identifying as White had been exposed to these narratives online, compared with 29% of those identifying as Black, Brown, Asian or from other non-White backgrounds, and 19% of those identifying as being of Aboriginal or Torres Strait Islander background.

Reset Australia’s board director and executive director of the Gradient Institute, Dr Catriona Wallace, said: “The results of this study are shocking, disturbing and alarming. And, with Web3 and platforms such as the Metaverse just around the corner, the threats to women and the vulnerable are only going to get worse, very quickly.”

“This is yet another example of the business model of Big Tech placing profit ahead of safety – women’s safety. We need to urgently put in place regulation that forces social media platforms to be transparent about the risks of their algorithms and redesign how they promote content so that they align with the feminist futures we want to create.”

Commenting on Reset Australia’s findings, a YouTube spokesperson said: “We have Community Guidelines in place to remove hate speech, harassment, violent or graphic content, and certain types of misinformation. In addition to content policies, we use recommendations and search results to point people to the most authoritative information available, while reducing the spread of content that is borderline. Today, consumption of borderline content that comes from our recommendations is significantly below 1%.”

According to YouTube, some papers published by independent researchers looking into the impact of tech platforms on the consumption of this kind of content have suggested that YouTube's recommendations are not actually steering viewers towards extremist content, but that consumption more generally reflects personal preferences. Here the platform pointed to a study from researchers at Harvard and the University of Pennsylvania that found no evidence that "echo chambers" are caused by YouTube recommendations, as well as an Anti-Defamation League (ADL) report that found YouTube's recommendation algorithm discouraged radical content by favouring mainstream media and cable news content over independent YouTube channels.

With gender equity set to be a key election battleground, Reset Australia has proposed that Australia needs to reconsider the regulatory framework governing digital platforms to ensure that systemic, community risks such as those posed by YouTube’s algorithm are adequately addressed.

Some of the recommendations made by the lobby group on this front include:

  • The expansion of the definition of 'online harms' to address gender-based violence and community threats, as opposed to individual harms.
  • Regulation of systems and processes, not just content moderation.
  • Platform accountability and transparency, including algorithmic auditing or data access for researchers and regulators to assess the effects of platform systems.
  • Enforced government regulation.

Hunter Johnson, CEO of The Man Cave, an emotional intelligence charity that has worked with 25,000 young men across Australia, added:

“While alarming, the findings of the study are not surprising. Through our work with thousands of young men on the frontline we’re seeing that, particularly after two years of the pandemic, they are feeling socially isolated, confused and disengaged. It’s imperative that we invest in preventative initiatives to positively re-engage boys in the physical and digital worlds that they inhabit, otherwise we risk the further radicalisation of their belief systems.”

The full Reset Australia report, 'Algorithms as a weapon against women: How YouTube lures boys and young men into the Manosphere', can be viewed here.
