Tate, TikTok and Toxic Masculinity: Is Social Media to Blame for This Generation's Violence Against Women?
Abstract
Content Warning: This research contains graphic language, themes and descriptions of sexual assault, sexual violence, rape, physical violence, incest, paedophilia and misogyny.
The research into online misogynistic content and behaviours showed that there is a widespread issue of misogynistic content being perpetuated in user-generated content on social media. Research was carried out using a mixed-methods approach, drawing on over 1,000 gathered pieces of social media data and on quantitative and qualitative data from a survey of 121 participants of mixed genders and ages. The social media data analysis showed the prevalence of misogynistic content shared in online user interactions, much of it containing themes of rape, incest, paedophilia, sexual violence, physical violence and generalised misogyny. Survey results showed that, across all platforms, 94.2% of participants had experienced content on social media that they would classify as misogynistic, 97.5% had seen sexist jokes or memes, 76.9% had seen harassment or threatening behaviour towards women and girls online, 98.3% had seen content stereotyping women, 97.5% had seen objectifying content about women and 60.3% had seen sexually violent content. Research also showed that 81.9% of participants were concerned that misogynistic content online is influencing younger people to make the same kind of content on social media. Qualitative data also reported that participants were most concerned about the influence that online influencers like Andrew Tate may have on young boys. The data clearly showed that connections between misogynistic attitudes and user-generated content online are alarmingly prevalent and widespread, influenced by incel cultures, misogynistic attitudes and the consumption of violent pornography. A combination of intersectional measures is recommended: comprehensive consent, sex and relationship education should be implemented for all children, along with education on the dangers of harmful behaviours and misogynistic attitudes.
Stronger regulation of social media platforms regarding the kind of content and language permitted should be considered, as well as legislation to better regulate the pornography industry and the accessibility of its content, in order to limit children's consumption of violent and degrading material. Additionally, there should be more support for victims of online misogyny in the aftermath of exposure.
Author: Georgia Lucas, BA (Hons) Criminology and Sociology, 2024
Table of Contents
- Introduction
- Literature Review
- Methodology
- Results and Findings
- Discussion
- Reference List
Chapter 1: Introduction
Social media has changed the way people communicate, share information, and form communities. Platforms like TikTok, with its widespread reach, have become significant in shaping public opinion and behaviour. However, alongside the benefits of connectivity and creative content, there has been rising concern regarding user-generated content that perpetuates harmful ideologies, including misogyny and violent fantasies. This dissertation explores the critical issue of whether social media, particularly TikTok, has an impact on violence against women by examining the prevalence and impact of misogynistic content online.
Violence against women and girls (VAWG) is a critical issue in the UK, affecting women across all demographics and socio-economic backgrounds. This prevalent problem encompasses a range of abuses, including domestic violence, sexual assault, FGM, and coercive control (Ellsberg et al., 2015). According to recent statistics, approximately 97% of women and girls in the UK have experienced sexual assault or harassment in public spaces (UN Women, 2021). The impact of VAWG goes beyond immediate physical harm, leading to long-term psychological trauma, social isolation, and economic instability for victims (Chrisler & Ferguson, 2006). In recent years, there has been growing recognition of the systemic nature of VAWG, influenced by deeply ingrained gender inequalities and societal norms that condone or overlook abusive behaviours. Efforts to address this issue have included legislative reforms, such as the Domestic Abuse Act 2021, and the implementation of strategies to improve support services and protection for victims (Domestic Abuse Act 2021). However, the rise of digital platforms has introduced new challenges, with online harassment and abuse becoming increasingly prevalent. As such, addressing VAWG in the UK and globally requires a comprehensive and intersectional approach, spanning both offline and online spaces, ensuring comprehensive protections and support for all women and girls.
To understand the scope and impact of online misogyny, this dissertation utilises both quantitative and qualitative survey data as well as social media data. The social media data examines themes within online user interactions and trends in user-generated content, supported by survey data examining the prevalence of misogynistic content online and participants' opinions and concerns about the impact of online misogynistic behaviours. This dissertation seeks to answer a pressing question: Is social media to blame for this generation's violence against women? By investigating the link between user-generated misogynistic content and real-world behaviours, this research aims to contribute to the broader discourse on social media's role in shaping societal norms and individual conduct. The findings show the urgent need for an intersectional approach to addressing online misogyny, involving stricter platform regulations, enhanced public awareness, and proactive educational efforts.
1.1 Research Questions and Aims
The following research questions and aims were developed to guide comprehensive research and support a full understanding of the issue of online misogyny in this dissertation:
- To what extent does the content on TikTok contribute to the normalisation of violence against women, and how is this reflected in user engagement and interactions?
- How do gender-based stereotypes and biases manifest in user-generated content on TikTok, and how do they contribute to the perpetuation of violence against women on the platform?
Chapter 2: Literature Review
2.1 Pornography and Problematic Sexual Behaviours In Children
The impact of pornography on relationships and mental health, together with its ethical implications, is a long-contested subject. With the widespread accessibility of social media platforms and digital media, children's access to explicit content is easier than ever. Concerns have escalated regarding the potential impact of pornography on children's body image, views on relationships and behaviour towards others, especially women and girls. The definition of what counts as pornography differs societally and culturally. However, most agree that the term pornography refers to sexually explicit material, including images, videos, literature and auditory content, created with the primary purpose of arousing sexual desire in those consuming it (McKee et al., 2020). More specific definitions may vary depending on cultural and philosophical opinions.
A report by the Children's Commissioner for England highlighted that the average age of first exposure to pornographic content online was 13 years old (for both girls and boys), with 28% of 16-21 year olds surveyed the previous year intentionally accessing online pornography 'every day or more often'. Male participants' responses to this question greatly outweighed those of their female counterparts, with males accounting for 21% and females for 7% of that 28% (Children's Commissioner, 2023). Additionally, while popular traditional pornography websites such as Pornhub receive billions of views annually (upwards of 40 billion), Twitter was the most popular platform for accessing pornographic content among the surveyed group of 16-21 year olds (Children's Commissioner, 2023; Neufeld, 2021). The Children's Commissioner's report on harmful sexual activity and experiences also highlights the importance of the theoretical framework of 'sexual script theory' when examining how children's access to and viewing of online pornography may influence problematic behaviours and views.
Sexual script theory is a sociocultural framework used to understand and analyse human sexual behaviour. Developed by sociologists John Gagnon and William Simon, it theorises that individuals develop mental frameworks, or 'scripts', that guide their sexual interactions and behaviours (Simon & Gagnon, 2003). These scripts are shaped by societal norms, cultural values and personal experiences, and they may influence individual perceptions, expectations and perceived norms in sexual situations, including the viewing of online pornography. Because of the influence of societal norms and learnt behaviours, different genders may experience universal differences in sexual scripts, such as differences in sexual communication (Machette & Montgomery-Vestecka, 2023). According to sexual script theory, sexual scripts include a sequence of events, roles and behaviours that individuals are expected to follow in various sexual encounters. This becomes increasingly problematic when large amounts of the pornographic content that children are viewing include acts of physical aggression, humiliation and coercion. In an analysis of Achieving Best Evidence (ABE) transcripts of police interviews with children who have been harmed, or who have harmed other children, 50% of ABE interviews recorded at least one instance of physical aggression commonly seen in pornography; this ranges from acts such as strangulation, slapping and sexually violent name-calling ('slut', 'whore', 'bitch') to acts like image-based abuse, ejaculation on the face and being abused whilst drugged (Children's Commissioner, 2023). This can be linked back to, and seen as a direct influence on, children's sexual scripts through the analysis of frequent themes in the titles of pornographic content.
A study conducted by Vera-Gray et al. (2021) found that, for first-time visitors to mainstream pornographic websites, 1 in 8 content titles included references or themes that could be classed as sexually violent. Some of the most frequent sexually violent references included themes of incest, explicit sexual aggression or assault, image-based abuse such as 'spy cams', and coercion (Vera-Gray et al., 2021). This phenomenon raises concerns due to evidence of a sixfold increase in the probability of an individual committing an act of sexual violence or aggression if they have previously sought out pornography that includes depictions of sexual violence (Ybarra et al., 2011). This may further concern researchers given the implications for violence against women and girls, as research suggests that between 91% and 97% of children who display some form of violent or sexually aggressive behaviour are male (Taylor, 2010; Vizard et al., 2007; Hackett, 2014).
While Ybarra et al. (2011) is a good indicator of the impact that violent pornographic content has on children's development and likelihood of future criminal and violent behaviour, it is important to assess the limitations of the research in order to have a well-rounded understanding of the topic. While the researchers put measures in place to maximise honest reporting, there may have been a degree of socially desirable responding. Due to the perceived morality of the subject matter, participants may have responded in ways that better align with socially acceptable behaviour and views, rather than providing an accurate and honest account, because of pressure, perceived or real, to present themselves in a more favourable light. Additionally, participants may have felt pressure to give a more socially acceptable answer if parents or carers were present in the room while they filled out their report sheet. Furthermore, because of the self-report methodology, participants needed a good level of self-awareness of their behaviours and actions, something made more difficult by their young age (10-15 years old). Self-awareness enables participants to reflect on their thoughts and behaviours, leading to a more accurate portrayal of their experiences, and requires proper education on what counts as sexually violent behaviour. Despite this, Ybarra et al. (2011) conducted an impressive study into the implications of children viewing pornography, one that can fill some gaps in the literature surrounding social media and online spaces' perpetuation of violence against women and girls.
2.2 Artificial Intelligence (AI) and Deepfake Pornography
In recent years, advancements in artificial intelligence have given rise to a concerning phenomenon known as deepfake technology. Smith & Mansted (2020) described deepfake technology as a type of artificial intelligence that uses deep learning algorithms to manipulate audio, video and/or images to depict individuals doing or saying anything the creator wishes. Until recently, the societal understanding of deepfakes was of humorous videos of celebrities and politicians. For example, filmmaker and director Jordan Peele created a deepfake video of former US President Barack Obama in which Obama appears to call then-President Donald Trump ''a total and complete dipshit'' (Vincent, 2018). However, artificially created deepfakes existed long before then. Now more than ever, academics and activists are alarmed at the rise in the misuse of deepfakes and deep learning technology, with online users producing non-consensual, artificially manipulated pornography. Despite not matching the socially understood purpose of the technology, this was in fact its first recorded use, dating back to 2017, when users on the social media platform Reddit digitally replaced the faces of women in pornographic videos with those of female celebrities, including Taylor Swift and Maisie Williams. It wasn't until journalist Samantha Cole wrote an article entitled 'We Are Truly Fucked: Everyone Is Making AI-Generated Fake Porn Now' for Vice that individuals realised deepfake technology was being used in more ways than one. Deepfake pornography began to be shared by users on Reddit, who taught each other how to swap the faces of female celebrities onto any video they wanted (Cole, 2018).
The psychological impact on victims is severe, often resulting in anxiety, depression, and trauma. Additionally, the invasion of privacy and the loss of control over one's image can lead to feelings of helplessness and violation (Faragó, 2019). This type of digital abuse frequently targets women, exacerbating gender-based violence and reinforcing harmful power dynamics (Wagner & Blewer, 2019). The reputational damage caused by deepfake pornography can be devastating: individuals may find their personal and professional lives irreparably harmed. Careers can be ruined, relationships destroyed, and social standing tarnished. The stigmatisation associated with being depicted in explicit content, even when fabricated, can lead to social exclusion and intense personal humiliation (Short et al., 2017).
From a legal standpoint, deepfake pornography presents significant challenges. Existing laws often struggle to address the complexities of digital impersonation and non-consensual content creation (Meskys et al., 2020). This legal grey area leaves victims with limited options for justice, adding to their distress. The lack of robust legal protections shows the need for updated legislation that can effectively combat the misuse of deepfake technology. Moreover, the widespread availability and potential misuse of deepfake technology undermine trust in digital media. As deepfakes become more sophisticated and harder to detect, they erode confidence in the authenticity of visual content. This not only affects individual victims but also poses broader societal risks: the ability to manipulate images and videos easily can be weaponised for disinformation campaigns, cyberbullying, and blackmail.
2.3 Incels, The Manosphere and Violent Extremism
'Incel' is shorthand for 'involuntary celibate'. It refers to men who hold anger and frustration towards women as a whole gender due to their own perceived inability to engage in sexual or romantic relationships (Bates, 2020). Incels typically attribute their 'misfortunes' to a combination of factors: their physical appearance, social class and status, and societal norms and expectations. However, 'incel' did not always denote men who hold misogynistic beliefs; originally, incels were a small community founded in 1997 by a woman known only as Alana, who took to the internet to share her dating experiences, creating 'Alana's Involuntary Celibacy Project' (Sparks et al., 2022). Eventually, Alana detached from the community and thought little more about it until years later, when the media picked up on the rising concern of the violent misogyny associated with 'incels'. When interviewed about her former community, Alana expressed concern at the current implications of the terminology she created: ''It feels like being the scientist who figured out nuclear fission and then discovers it's being used as a weapon for war'' (Kassam, 2018). Gone are the days of Alana's ''friendly community''; now thousands of incel forums are active across various social media platforms, running rampant with violent misogyny and extreme far-right views (Taylor, 2018). The 'manosphere' refers to a loosely connected network of online communities, forums and social media groups encompassing sub-sections like incel communities, pick-up artists and Men Going Their Own Way (MGTOW), all of which promote toxic and harmful behaviour towards women (Lin, 2017).
In 2020, feminist activist, popular author and founder of the Everyday Sexism Project Laura Bates released her book 'Men Who Hate Women', about her year-long undercover experiment in online communities of incels, the manosphere and pick-up artists. Bates' research uncovered the extremism stemming from massive online forums and websites, the most popular of which boasts close to 9,000 members, between 2-3 million individual posts and over 350,000 threads (Bates, 2020). Threads are an easy way for users to engage in community discourse, but on many incel forums they are used as a way of promoting violent misogyny, rape fantasies and extremist pipeline ideology. Threads that have appeared on incel forums include ''Why I support the legalisation of rape''. Another thread included a video of a man beating a woman with no audio; this received responses such as ''I want to hear her screams'', with another user stating that he wants to ''rape a bitch just so I can get a 10 page thread on here''. Another thread, ''Should women be considered human?'', received an overwhelmingly conclusive result of 'no', with one user replying, ''I hardly consider them an alive body let alone entitled to having human rights'' (Bates, 2020).
Modern incel communities have developed specific terminologies (Pelzer et al., 2021). For example, 'femoid' or 'foid', short for 'female humanoid', dehumanises women, reflecting the intense misogyny prevalent in these groups (Baele et al., 2024). Key concepts like 'red pill' and 'black pill' have also emerged, originating from the 1999 film 'The Matrix'. Taking the 'red pill' symbolises awakening to a supposed reality in which societal structures and norms favour women at the expense of men (Kelly, 2020:2). The 'black pill' represents a more extreme, nihilistic belief that these societal and biological disadvantages are beyond all hope, leading to a sense of hopelessness (Lindsay, 2022). Followers of black pill ideology often believe that their situation is fixed, developing deeper despair and sometimes inciting harmful attitudes and behaviours towards women. These communities and their ideologies perpetuate toxic narratives that exacerbate feelings of alienation and hostility, contributing to harmful online discourse and occasionally manifesting in real-world violence (Speckhard et al., 2021).
On 23rd May 2014, 22-year-old Elliot Rodger killed 6 people and injured 13 more in Santa Barbara, USA. In his memorialised YouTube videos, including one titled 'Elliot Rodger's Retribution', Rodger explains that because 'hot girls' in Santa Barbara rejected him, he wants to ''punish [them] all for it'' (Kelly, 2020:1). In addition to posting videos detailing incel ideology, Rodger wrote a 137-page manifesto, which he printed out prior to the murders, showcasing his months of planning and immersion in 'the manosphere' and incel communities (Dewey, 2021; Hill, 2014; Woolf, 2015). While it is the most well-known case attributed to inceldom, the mass shooting in Santa Barbara is not an isolated incident; since then, many other shootings, sprees and murders have been attributed to incel ideologies (ABCNews, 2009; Saul, 2015; Johnson et al., 2015; Tierney, 2018; Burch & Mazzei, 2018; Fonrouge & Brown, 2018; Cecco, 2019; Donovan-Smith, 2019).
Other sub-sections of misogynistic ideology, such as pick-up artists, function in a similar way to incel communities. All exist within 'the manosphere', have their own terminology used to communicate with other members of the community, and exhibit extreme levels of entitlement over women and women's bodies. Many believe that it is their 'God-given right' to have sex with women (Bates, 2020).
Manosphere ideology is not, however, limited to Reddit. The rise of influencers like Andrew Tate highlights an online trend in which individuals with controversial views gain significant followings and substantial influence. Tate, known for promoting toxic masculine ideals and misogynistic rhetoric, used platforms like TikTok to reach a broad audience, particularly young men (Leeming, 2023). His content, which often glorifies dominance over women, is amplified on these platforms due to its provocative nature, leading to widespread distribution (Rhymes, 2023). The impact of promoting misogynistic messages on mainstream platforms like TikTok is deeply concerning: the algorithm-driven nature of these platforms means that content which generates high engagement, often through controversy, is prioritised (Boucher, 2022). As a result, Tate's messages repeatedly appear in the feeds of impressionable young users.
Chapter 3: Methodology
3.1 Overview and Justification for a Mixed Methods Approach
The primary research utilised a mixed methods approach, combining social media and survey data. The survey primarily included quantitative questions, with a few open-ended questions for qualitative insights. Research indicates that mixed methods enhance study depth and understanding by offering diverse data forms, providing a comprehensive view of the topic (Venkatesh et al., 2013). This approach allows for a nuanced understanding of complex subjects, balancing the strengths and limitations of each method to improve overall validity (Johnson & Onwuegbuzie, 2004). Including social media data alongside survey data offers a distinct perspective, crucial for analysing how violence against women and girls manifests in user-generated content and behaviours. This real-time content analysis provides an accurate scope of the issue, complementing theoretical frameworks and user experience reports.
3.2 Social Media Data
3.2.1 Overview of method
In research, social media data encompasses the large amount of content created and shared by users on various platforms, including text, photos, videos, links, and user interactions like likes, shares, comments, and messages. This can be analysed to understand online community interactions, attitudes, and behaviours (Kane et al., 2014).
For this study, over 1,000 unique pieces of user-generated social media content were collected to analyse trends in language and publicly shared content. This included comments on posts and videos from the platform TikTok. Importantly, no specific keywords or phrases were targeted during data collection. All data was gathered randomly to maintain the study's validity and authenticity. This approach aimed to reflect the likelihood of encountering misogynistic content during normal social media use, ensuring an accurate representation of typical user experiences.
3.2.2 Participants
All participants in the social media data collection are genuine users of social media platforms who have posted comments or made videos viewable to anyone in the general public. Of the platforms involved, TikTok is the only one that requires members of the public to create a free account to gain access to the content on the app. All other platforms can be viewed and accessed by a member of the general public without an account (although it is important to note that this access is limited).
In the case of user-generated comments left on social media platforms, the identities of the participants are unknown, including their age, gender identity, race, and nationality. Many accounts include profile photos and names that may identify them; however, it cannot be assumed that any of this information is accurate. In the case of videos made and uploaded by users, the individual making the video is, the majority of the time, visible in the content, providing some insight into their identity.
3.2.3 Ethics
The ethics of using social media data involve considerations of privacy, consent, data ownership, and potential risks. In this study, informed consent could not be obtained; instead, data collection relied on users' acceptance of platform terms and conditions, acknowledging that publicly posted content is accessible to the general public. No private content was included, and all participants were assumed to meet the platforms' minimum age requirements.
All collected social media data was anonymised: usernames and profile pictures were removed from comments and videos. It was important to use TikTok as a platform for this research because, unlike on other platforms, one cannot search for a comment and find the user's profile. This ensured participant anonymity.
Data privacy was strictly maintained. Social media data was collected and anonymised using a private, password-protected phone, transferred to a secure university OneDrive account, and immediately deleted from the phone. This was crucial, given the graphic and sensitive nature of the content, including violent misogyny and references to paedophilia. This ensured the privacy and security of all data.
3.2.4 Strengths and limitations
Social media data offers a deep and current understanding of trends on social platforms, making it an increasingly popular tool in social sciences. It provides valuable insights into human behaviour and societal trends on various topics, from public opinion on politics to brand perceptions.
In this research, social media data allowed for large-scale data collection, unlike traditional surveys and interviews with limited participants. It provided a diverse demographic sample, enabling the examination of language trends, user interactions, communication dynamics, and online behaviours globally, overcoming language and physical barriers. This approach resulted in over 1,000 unique pieces of data, achievable within a short timeframe and offering real-time updates on online content trends. Social media’s evolving language makes it essential for tracking current trends, which would not be possible with outdated studies or second-hand information.
However, there are limitations. While social media data covers a broad demographic, it may not be entirely accurate. Certain groups might be overrepresented, and others underrepresented, particularly those with limited internet access. As of January 2024, only 62.3% of the global population had social media access (Statista, 2024), with women and girls significantly underrepresented, especially in South Asia and sub-Saharan Africa, due to digital literacy gaps (Kashyap et al., 2020). Additionally, the lack of informed consent in social media data collection is a key limitation, though minimised by anonymising all data and working under the assumption that participants are willingly posting public content.
3.3 Survey
3.3.1 Overview of method
Combining quantitative and qualitative methods in research surveys enables researchers to gather comprehensive data on a specific topic. Closed-ended questions gather quantitative analysis, while open-ended questions provide qualitative insights, enhancing understanding. Quantitative data offers analysis of statistics, while qualitative data adds depth and context to participants’ experiences and perspectives. Integrating both types of data enhances validity and reliability, leading to robust interpretations of findings.
In 2021, a UK study on male violence towards women and girls was praised for its thorough analysis, finding that 99.7% of participants reported experiencing multiple instances of male violence (Taylor & Shrive, 2021). This study informed the framework for the quantitative questions in the survey, emphasising the importance of specific, detailed questions for recalling experiences. Dr. Jessica Taylor, the lead researcher, highlighted the necessity of detailed questions to facilitate participant recall (Taylor, 2020). The survey on online misogynistic behaviours similarly used specific questions to gain better accuracy in answers. Qualitative questions allowed participants to provide nuanced responses, enriching the survey data beyond statistical analysis and supporting the social media data.
3.3.2 Participants and recruitment
Having originally hoped to recruit between 50 and 100 participants, the survey into online misogynistic behaviours and content had a total of 121 participants at the end of the intended research period. The survey was open to anyone who spoke English and was over 18 years old. It was also stated that the target demographic for the study was people who frequently used social media, though it was open to all.
The survey was posted on various social media platforms, such as Instagram, Twitter and LinkedIn, as this would effectively reach participants whose age and social media usage aligned most closely with the target demographic. Additionally, the survey was posted to a survey-sharing platform called SurveyCircle, which was deemed the most efficient way of recruiting a larger number of participants: SurveyCircle is a large platform on which users are actively encouraged to participate in other users' research in order to gain points, which in turn move their own surveys up a ranking of studies, making them more visible to potential participants. 45 of the 121 participants were recruited from social media platforms and the remaining 76 through SurveyCircle.
The most common age range of participants was 18-21 years old (39.7%), and cisgender woman was the most frequently reported gender identity (77.7%).
3.3.3 Ethics
Ethical considerations were crucial for the survey on online misogynistic content and behaviour. Informed consent was prioritised; participants had to read a participant information sheet and consent form and tick a box to give consent before answering any questions. These documents explained the survey’s purpose, risks, benefits, and participant rights. Participants were informed about the sensitive nature of the topics and provided with contact details in case they wished to reach out for information like mental health services and support organisations like Rape Crisis. It was emphasised that participation was voluntary, and participants could withdraw at any time while taking the survey.
The survey was created and analysed using JISC Survey, an approved platform. To ensure confidentiality and anonymity, no identifying questions were asked, such as where participants went to school or worked. Data security was maintained by accessing and storing all data on password-protected devices. A thorough risk assessment was conducted before starting the research, ensuring that all ethical considerations were addressed and upheld to the highest standards.
3.3.4 Strengths and limitations
The mixed methodology approach in this research combines quantitative survey data with qualitative insights, allowing for a comprehensive exploration of online misogynistic behaviours and content. This method provided participants with the opportunity to elaborate on their thoughts, offering a more nuanced understanding of their experiences with online misogyny. The survey included 121 participants, exceeding expectations and enhancing the robustness and generalisability of the findings. Recruiting through multiple platforms, including social media and a survey-sharing website, increased participant diversity. To ensure engagement and accessibility, the survey was designed to be user-friendly, with clear language and contact details for translation or other accessibility needs. Ethical considerations were prioritised, including informed consent, confidentiality, and minimising risks to participants, thereby strengthening the study’s integrity and trustworthiness.
However, the survey had limitations. Reliance on online platforms for recruitment may have introduced sampling bias, as participants were likely from similar social groups and networks, potentially limiting the diversity of perspectives. Self-selection bias might have occurred, with those interested in the topic or affected by it more likely to participate. Additionally, while questions were broken down to capture specific instances of online behaviours, ethical constraints prevented questions about participants’ victimisation experiences, possibly affecting the accuracy of the data. The survey also did not ask about participants’ ethnic, racial or religious backgrounds, limiting the intersectionality of the findings and understanding of how these factors might influence experiences with online misogyny.
Chapter 4: Results and Findings
4.1 Social Media Data on User-Generated Content and Attitudes
Over 1000 individual screenshots of user-generated social media content were collected from TikTok as part of the research into misogynistic online content and behaviours. Nearly all of the social media data collected included graphic language and themes such as misogyny, sexual assault, rape, incest and paedophilia.
To avoid triggering harmful-language rules and having a comment or post taken down, many users self-censor their language. For example, instead of saying ‘rape’ they will say ‘grape’ or use the grape emoji; instead of saying ‘sexual assault’ they will say ‘SA’; instead of saying ‘dead’ or ‘kill’ they will use the skull emoji. Users will often replace individual letters with symbols, such as ‘a’ with ‘@’ or ‘i’ with ‘!’. Additionally, there are many internet slang words and phrases that users employ, often in the context of hateful or discriminatory comments and posts. These will all be looked at in the following presentation of findings and explored in more detail later in the discussion.
In an analysis of themes in the social media data collected, generalised misogynistic language and themes of sexual assault, violence and harassment were the most common. The table below shows the frequency of themes found in online user interactions, content and behaviours. An individual piece of data may have contained one or more identified themes.
Rape | Incest | Paedophilia | Sexual Violence | Physical Violence | General Misogyny |
305 | 16 | 314 | 392 | 114 | 404 |
Table 1: Themes in social media data
The social media data collected could be broken down into six main themes: threats or mentions of rape (including mentions of coercion and statutory rape); incest; paedophilia (including sexually charged comments left about people who are identifiably under the age of 18, including babies, toddlers and very young children); mentions or threats of sexual assault, violence and harassment; death threats or threats of physical violence towards women; and generalised misogyny (including language that stereotypes women and girls, common misogynistic online phrases and any other content that could be deemed misogynistic but does not fit into any other category).
4.1.1 Themes of Rape
Data identified as containing themes of rape often included users substituting words or letters so that their comments would not be picked up by the harmful and graphic language detectors that many social media platforms use to limit hateful speech. The most common example is using the word ‘grape’ or the grape emoji in place of the word ‘rape’. These detectors, however, do not always work, and many users use both the term ‘rape’ and substitutions without the comment being flagged. Below are comments left publicly by social media users on the platform TikTok; many of these comments have thousands of likes from other users.
Comments Left by Social Media Users; Themes of Rape |
‘Im gonna grape you so bad you’ll have to create a new type of feminism’ |
‘join the rape rates sex doll’ |
‘it is all men. I would [grape emoji] you too and that’s normal I am tired of hiding it’ |
‘Lemme [grape emoji] u’ |
‘ofc from all the things u think the rape is worst [straight face emoji]’ |
‘honest question is rape such a bad thing’ |
‘There can’t be rape inside a marriage tho, it makes no sense. Marriage vows ensure lifelong consent’ |
‘Rape is a reproductive strategy for men. It has been throughout all of history and we’re not the only species to do it.’ |
‘Yep, you’re added to the rape list’ |
‘I’m not a cop, but if anyone wants to be treated like this, hmu’ (comment left on a post about a woman who was raped by a police officer in jail) |
‘All these woman saying they’d kill themselves, ill just go and retrieve your body no big deal’ (comment left on a post about being the only woman on an island, where women commented that they would commit suicide for fear of being raped) |
‘The women in my life have left me with scars, im glad your Grapist is free’ |
‘I’m gonna know what shade of pink it is whether you like it or not’ |
‘always wished the female r@pe stats were higher’ |
‘like her ugly ass would get raped’ |
‘One day I’m gonna feel how tight it is whether you like it or not’ |
Table 2: Comments left on social media including themes of rape 1
The phrase ‘know what shade of pink it is’ is most commonly seen in user-generated content containing themes of paedophilia and will be explored in that section. A common type of post features a blank screen, or the user standing in frame, encouraging other users to reply in the comments with their answer to a prompt such as, ‘what would you do if you were in school and time froze’ or ‘pov [point of view]: the school shooter leaves you alive in a room full of dead girls’. These posts can garner thousands, sometimes hundreds of thousands, of likes and comments. Below is a table of comments posted by users on these videos on TikTok.
Comments Left by Social Media Users; Themes of Rape |
‘sadly I don’t live in America’ |
‘sure ill call the police…but not for the shooter [devil emoji]’ |
‘the boys will be proud of you’ |
‘was always on my mind’ |
‘they already gone it wouldn’t even matter’ |
‘[full name of their teacher]’ |
‘the police arrive staring at you…’im cutting in move over’’ |
‘the one girl that was playing dead [skull emoji]’ |
‘they still warm fr (for real)’ |
‘only the girls in class 5 or 6’ |
‘who gon stop me?’ |
‘1st step: get a school shooter’ |
‘time to shine’ |
‘my school teacher [devil emoji]’ |
‘always dreamed of it’ |
Table 3: Comments left on social media including themes of rape 2
These comments, left by users on similar posts, all alluded to or directly stated that the commenters fantasise about raping female classmates and teachers, with thousands of other users liking the above comments and replying with the names of the women and girls they are having these sexually violent thoughts about, described in graphic detail.
Additionally, with the prevalence of artificial intelligence, many users have created and shared sexually graphic AI images of celebrities, random women and even animals. The image below, originally created and shared by a user on X (formerly Twitter), depicts two bears wearing the jerseys of two opposing ice hockey teams that were set to play a match just after the image was created. In the image, the bears appear in the middle of an ice hockey rink in front of crowds of people, while the bear made to appear female, dressed in a tutu skirt and carrying a handbag, is crying while being depicted engaging in sexual activity. The image has been viewed on the creator’s account alone by over 1 million users.
Image 1: Ice Hockey Bears; Twitter
4.1.2 Themes of Incest
There were 16 pieces of social media data collected that contained themes of incest. All users who had made public comments or content with themes of incest appeared to be male and over the age of 18 years old. The table below shows some of the content generated online.
Comments Left by Social Media Users: Themes of Incest |
‘I mean it’s not a big deal if a father wants to be his daughters first body. He just wanted an extra tight fit [smirking emoji] [smirking emoji]’ |
‘if my daughter did this I would get her pregnant’ |
‘me looking at my son/daughter after he just called me daddy’ |
‘nah if they can see my internet history they’ll separate me from my sister [laughing emoji]’ |
Table 4: Comments left on social media including themes of incest
Many of the user-generated comments that include themes of incest are sexually charged comments from men towards their daughters or sisters. While it is not clear whether these users actually have children or siblings, the comments nonetheless direct graphic and sexually violent misogynistic language towards women and girls and express sexual fantasies about women in their families. Frequently, users sexualise common, harmless and often parental terms, such as ‘daddy’ (as seen above), weaponising them against the women and girls they may be related to in order to view them in a sexual manner. Additionally, there were user-generated comments alluding to previously having made sexual comments about female family members or having searched for incestuous content online, with users posting that if their internet and social media history were searched by authorities or external companies, they would be separated from female family members.
4.1.3 Themes of Paedophilia
The theme of paedophilia was among the most common, with over 300 pieces of user-generated content gathered containing text and behaviours that could be considered paedophilic in nature. The majority of this data consisted of adults commenting on videos or photos of babies or children aged 10 or under, alluding to wanting to ‘have sex’ with the child, openly fantasising about what that would feel like and about the colour of the child’s genitals. The table below features a few of the many similar comments that include themes of paedophilia.
Comments Left by Social Media Users: Themes of Paedophilia |
‘ik it’s bubble gum pink’ |
‘she a baddie ( the child )’ |
‘under 18 is the only way lol let me stop before the FBI comes for me’ |
‘mad potential’ |
‘im tryna pre order’ |
‘Ngl (not going to lie) I would hit that shit’ |
‘shes an investment’ |
‘I wish I could give her a good pounding to satisfy her day’ |
‘shes fresh’ |
‘bro I want to rent the one in purple’ |
‘why they never look so good when I’m out on the prowl’ |
‘hear me out’ |
‘id love to screw her’ |
‘smash, next question’ |
‘I did say the truth tho u cant deny that, little girls are pretty unlike older chicks’ |
‘sell her [praying emoji]’ |
‘damn your underage you sure?’ |
‘you are by far the SEXIEST, most beautiful stunning young woman ever, and OMG what an A$$$$$ [fire emoji] [fire emoji]’ |
‘She needs a second chance with a side of dick’ |
‘ik it’s grippy’ |
‘if she’s old enough to count she’s old enough to mount’ |
Table 5: Comments left on social media including themes of paedophilia
All comments in the table above were publicly posted by users on TikTok on videos of girls who had specified that they were under the age of 18 or visibly appeared to be children. While there were many unique comments, as seen above, the most common words and phrases in the data were comments like, ‘I know it’s pink’ and ‘grippy’. Both of these comments were collected hundreds of times during the research and make up most of the data set in this theme. Comments about ‘it being pink’ refer to the colour of a young girl’s genitals supposedly signifying her age and her anatomy’s child-like appearance. Comments using the phrase ‘grippy’ allude to the ‘tightness’ of a child’s vagina and refer to a misogynistic stereotype of adult women being less sexually desirable due to having had multiple sexual partners.
4.1.4 Themes of Sexual Violence
Almost 400 user-generated comments collected as part of the sample contained sexually violent language or threats of sexual assault. While the theme of rape was previously identified and could be included as part of the theme of sexual violence, data placed in that category mostly contained explicit mentions or threats of rape; data placed in this category contains no mention of rape, instead covering other forms of sexual violence. As seen in other categories, users will abbreviate or substitute words or letters to get past harmful-language algorithms.
Comments Left by Social Media Users: Themes of Sexual Violence |
‘explain why I shouldn’t be able to do whatever I want to women’ |
‘I get the urge to SA pathetic little girls like u everytime I’m near one’ |
‘maybe try not to look sexually appealing then? Like I don’t get what you want us men to do.’ |
‘if a women ever fausley accuses me of touching her, imma do it to her for real’ |
‘put yourself in a stupid situation and itll happen’ |
‘I have personally been accused of sa multiple times for going against feminist agenda’ |
‘who cares who got a link to the video’ (on a post about a female prisoner of war being sexually assaulted and tortured) |
‘if you’re a heavy sleeper I can basically do anything id like to you while you’re asleep’ |
‘bitch if you’re dressed half naked in public, don’t get mad wen I touch your body without consent cuz I believe that wat you wants’ |
’14 year old shouldn’t dress so skimperly around young impressionable boys it is corruption of the mind’ |
‘hey cutie let me sa you’ |
Table 6: Comments left on social media including themes of sexual violence
The abbreviation ‘SA’ is commonly used instead of ‘sexual assault’ across many social media platforms, including TikTok and was seen frequently throughout the social media data collection.
4.1.5 Themes of Physical Violence
There were 114 mentions of physical violence within the collected data, including threats of death or physical harm towards women. While this was one of the smallest categories, the user-generated comments within it were often the longest, featuring graphic fantasies of physically assaulting women.
Comments Left by Social Media Users: Themes of Physical Violence |
’Men can wipe women out if it comes to that and there’s nothing u can do to stop it. So better humble urself and enjoy the freedom men gave u’ |
’If us men collectively decided that women should be our slaves and have no rights, none of you would. There is a power indifference by design.’ |
’yall sometimes forget we could kill u and do all types of horrible messued up shit to you if we wanted be we don’t’ |
‘men want to commit shootings cuz of women bullying them or women in general’ |
‘did you forget that every single man within a square mile of you could absolutely pulverize you until the point you’re unrecognizable?’ |
’gonna beat my wife for this post’ |
‘wow another woman empowering tiktok, looks like I have to go out tonight and stalk and strangle a defenceless women to show dominance [laughing emoji] [laughing emoji]’ |
‘why wont we hear his side. Some women can provoke you to do what you can never imagine to do’ (left on a post about a man who killed his wife because she didn’t make food on time) |
’whenever I start panicking around a hot girl I just tell myself ‘‘why be afraid of a being that I can kill with my bear hands?’’ [skull emoji]’ |
‘if men wanted you all dead you would be dead remember we let you lady’s live on this planet’ |
‘saying you have never wanted to hit a woman in your entire life doesn’t make you a hero. It just makes you a liar’ |
Table 7: Comments left on social media including themes of physical violence
Many of the comments left in this category contained vivid fantasies about men ‘dominating’ women and being able to kill and assault any woman they come across. There is also the notion among the men having these fantasies that they are in the majority and that all men share these same thoughts about women. These users constantly dehumanise women, often referring to them in an objectifying way; one user in the data above refers to ‘hot girls’ as ‘a being’.
4.1.6 Themes of Generalised Misogyny
Over 400 user-generated comments were classed as containing themes of misogyny that could not be placed into another category. Much of this data consisted of words or phrases that may seem innocent without knowing the context, or to someone unfamiliar with the phrase. Additionally, user-generated comments within this category are more likely to be explained away as jokes, and data in this category are therefore less likely to be picked up by harmful-language software or to be taken down after being reported.
Comments Left by Social Media Users: Themes of Misogyny |
‘U need to chill shes an independent woman [laughing emoji]’ (left on a post about a woman who was gang raped and tortured) |
‘skill issue’ |
‘lmao a soldier that can’t defend herself?’ |
‘womp womp’ |
‘W kill + bonus points + 2K XP points’ |
‘if my chats got leaked im going to the war crime tribunal’ |
‘imagine they heard all of our football locker room conversations back in HS’ |
‘I think it’s gross how young girls push out their periods and don’t hold them and wait until theyre older! They don’t need to have periods yet of they are 13! Why cant they wait to have them until they are 18?’ |
‘every man wants to see every woman naked. As long as shes not completely hideous. Even relative. [laughing emoji] Testosterone etc…’ |
‘plus it’s just our hormones, we cant easily control that Scheisse. It’s like y’all when y’all are your periods and act out’ |
‘marrying a virgin woman must be the same excitement as you’re buying a brand new car, not a second hand one’ |
‘woman don’t be president because there emotional creatures. Therefore they will throw a tantrum because there’s Starbucks order is wrong’ |
‘I hate when women drive’ |
‘no we don’t want you sexually experienced most of you can’t cook for shit, and domestic duties are fucking easy’ |
‘especially ugly fat ones wasting money on make up’ |
‘Did daddy mistreat you?’ |
‘we run the planet while y’all struggle to have brunch together without a fight’ |
‘Getting ‘’raped’’ is 2020s version of college lesbianism and bra burning. This girl is clearly a BPD slut who needs a toxic relationship, but then dolchstoslegende’d her boyfriend for kicks and virtue points. Like how are you ‘’raped’’ by boyfriend? Lmao. You already let him hit!?!?’ |
Table 8: Comments left on social media including themes of generalised misogyny
The first 5 pieces of data in the table above were posted by users in response to a TikTok post about a news article on a female soldier who was taken as a prisoner of war and raped and tortured. A few phrases in those comments are frequently seen on social media; ‘she’s an independent woman’ is often used in response to a woman being physically or sexually assaulted, implying that women do not deserve to be helped when being assaulted because women fought to be ‘independent’. Users who leave these comments believe that women are at fault for any assault they may face because they wanted independence from men.
Additionally, the phrase ‘womp womp’ has gained traction across social media and is now one of the most commonly used new slang phrases, with the phrase also becoming popular outside of social media in face-to-face conversations, mainly among teenagers and younger people. The phrase is an onomatopoeia referring to the ‘sad trombone’ sound often used in television shows when someone loses something or something goes wrong. It has been popularised on social media as a way of sarcastically or disingenuously saying ‘too bad’, ‘oh well’ or ‘I don’t care’. Another phrase commonly seen in user-generated comments is the term ‘bop’. This term refers to a woman who regularly performs oral sex on a male partner, with the term coming from the head motion. However, it has now come to be used on social media as a derogatory phrase for any woman or girl perceived as provocative, sexually active or attractive, under the preconceived notion that she engages in sexual activity with multiple partners. ‘Gyat’ or ‘gyatt’ is another term seen frequently on social media. Originally an abbreviation of ‘goddamn your ass thick’, it is used on its own to refer to a woman who is curvy. The phrase is often used by itself or as a replacement for another word like ‘got’: ‘I have gyatt to see this’, ‘I have gyatt something in my eye’. This phrase was seen being used on videos and photos of women and girls of all ages, including toddlers wearing nappies.
4.2 Quantitative Results on the Prevalence of Misogynistic Content
There were 121 individuals who participated in the survey on online misogynistic content and behaviour. Of these, 39.7% were aged 18-21, 33.1% were aged 22-27, 9.9% were aged 28-35, 9.9% were aged 36-45, 5% were aged 46-55, 2.5% were aged 56-65 and no participants were aged 66-75 or 76+.
Graph 1: Age of Participants Graph 2: Gender Identity of Participants
Additionally, 77.7% of participants identified as cisgender women, 13.2% as cisgender men, 0.8% as trans women, 0.8% as trans men, 4.1% as non-binary and 3.3% described their identity as other; no participants described their identity as gender-fluid. It was important to gather results using gender identity rather than ‘female’ and ‘male’, as misogyny intersects with gender identity in many different ways and participants’ experiences of online misogyny may differ depending on identity. For example, trans people may have different experiences before and after socially transitioning, so it is important to explore the relationship between various gender identities and misogynistic behaviours.
107 participants (88.4%) reported being active users of social media, with 1 person (0.8%) saying they were not and 15 (12.4%) saying they were somewhat active users. Instagram was the most common platform among participants, with 88.4% saying that they regularly used it. Reddit was the least used platform, with 26.4% regularly using it. TikTok was the third most used platform among participants, with 15.2% using it. The 18-21 age category reported using TikTok more than any other age group: 48.72% of participants using TikTok were aged 18-21.
Graph 3: Social Media Regularly Used
Despite not being the most used platform, TikTok was rated the worst platform for misogynistic content, with 53.3% considering it the worst. Additionally, 98.72% of participants who regularly use TikTok reported seeing content they described as misogynistic, the highest proportion of any social media platform. Participants also deemed TikTok the worst platform in every category when asked about seeing different types of misogynistic content online.
Graph 4: Platform with the Most Misogynistic Content
100% of participants reported seeing sexist jokes or memes and stereotypes of women and girls on TikTok, and 84.62% reported seeing content that would be classified as harassment or threatening towards women and girls. Furthermore, 72.15% of participants reported seeing sexually violent content towards women and girls on TikTok. TikTok was also rated one of the worst platforms for its effectiveness in addressing misogynistic content, with 42.5% saying they thought the platform was ‘Not effective at all’.
60.3% of participants reported seeing sexually violent content online. Transgender people were the most likely to have witnessed content online that would be classified as sexually violent towards women and girls, with 100% of both trans men and trans women reporting yes. 80% of non-binary people reported seeing sexually violent content, as did 65.26% of cisgender women. Cisgender men were the least likely to have seen sexually violent content towards women and girls, with 25% saying yes and 43.75% saying they had never seen such content.
Graph 5: Gender and Sexually Violent Online Content
Graph 6: Gender and Frequency of Sexually Violent Content
31.25% of cisgender women reported seeing sexually violent content on social media between once a week and more than once a day, compared to 17.64% of cisgender men.
81.9% of participants believe that online misogynistic content and behaviours are influencing younger people to make similar kinds of content and to harbour misogynistic attitudes towards women and girls; only 1.7% believe they are not. Additionally, 79.2% of participants believe that misogynistic content they have seen was made and uploaded to social media by a minor, with 9.2% saying that when they come across misogynistic content it is almost always or frequently made by someone who appears to be under the age of 18. 94.2% of participants reported being concerned about the impact that misogynistic content is having on young people, and 89.1% are concerned about this content having real-world consequences for women and girls. 70.8% also believe online misogynistic content and behaviours are becoming more normalised.
Graph 7: Concern of Impact of Misogynistic Content
4.3 Qualitative Results on Societal Opinions
The participants were asked 4 qualitative questions as part of the survey to gather wider information on their opinions and experiences of online misogynistic content and behaviour. When asked about their thoughts on social media companies doing more to protect users and young people from online harmful content, the general consensus was that companies should be doing more than they already are. Many were not sure of what companies could be doing, but were still not happy with the amount of online content that displayed misogynistic language and themes.
What Should Social Media Companies be Doing? |
‘I’m not too sure. But it is certain that such content exists and should be dealt with accordingly.’ |
‘I think it’s so normalised now that it’s almost taken as a “it’s just what it’s like online, just scroll past and ignore it.” This is so frustrating because young people are being exposed to these harmful, misogynistic ideas, but social media platforms don’t seem to worry about the repurcussions. I think they should do more to help protect women online, making them feel like it’s a safe space for them to use.’ |
‘If not for our generation, for the next. We are always left fighting for feminine rights, look at the chaos that unravelled at the golden globes, Barbie, Taylor Swift, reduced to blondes with boobs instead of women with drive. I hate it.’ |
‘Being an woman because a pressure when seeing these kind of posts. I feel disgusted, ashamed about it’ |
‘Microaggressions and jokes online can very quickly snowball into action in real life. It’s not just the fringes of the internet and society that perpetuate incel/pick up artist/sexist culture.’ |
‘There’s a difference between ‘free speech’ and amplifying dangerous attitudes towards women. On a less extreme level, lower level but constant sexist ‘jokes’ condition uses to think that it’s normal and influence unconscious thoughts and believes about the value of women and acceptable behaviour towards them.’ |
‘They’re literally brainwashing boys’ |
‘Because social media algorithms seem to not only not stop misogynistic content, but actually propagate it and make it reach more people. Their inaction has had a negative effect on many young teenage boys’ perspective of women.’ |
‘As a trainee teacher I have seen children as young as 4yrs influenced by people such as Tate. Social media companies should do more to make social media a safer and more comfortable environment for all.’ |
‘Because thats not the problem. People are focusing on the completely wrong thing. If you wanna adress a problem, you go for the root not the branches. If you really wanna learn about how men work, behave and feel what they feel, you need to start from the other end. Men have for the past years been demasculinized which is a greater threat to society than strong, masculine alpha males. If you wanna learn more, start with reddit https://www.reddit.com/r/NoFap/ https://www.reddit.com/r/Semenretention/ There is a war going on against men and it has been going on for a long period of time. This groups shows you the few procent that are trying to recover from the root suffering to all of mens problems. Start there’ |
‘I can’t really say, as I suspect their filters are giving this content to people who engage with this type of content – hence I don’t see it, and can’t say what they should do more or how big of an impact it’s having’ |
Table 9: Responses on what social media companies should be doing
With over 80 responses, many participants felt strongly about the harm that misogynistic online content is having, both in normalising misogynistic behaviour and on their own mental health and self-worth, reporting that this kind of content makes them feel ‘ashamed’ and ‘disgusted’. Teachers and people who work with children reported their concern about the influence of online misogyny on children as young as 4 years old. There were also responses from one participant who believed that the real issue was men who have been ‘demasculinized’ and thought that research such as this contributed to a ‘war on men’; in one comment, they included links to Reddit forums on semen retention and the benefits of men refraining from masturbating.
There were 91 responses to the question asking what types of content participants believe are influencing younger people. The majority of responses cited podcasts and TikTok videos by men who describe themselves as ‘alpha males’.
Type of Content Influencing Young People |
‘Alpha male content is big just now with male youtubers and tiktok stars. as well as the podcasts/youtube videos that are literslly just rating girls – and that the girls thst go along with ir are always favoured’ |
‘“Alpha males”, men promoting incel behavior, videos of women being “proper women” and “homely”’ |
‘Things like Andrew Gate’s podcasts and videos which are breeding “toxic masculinity” back in to young boys, leading them to perceive girls negatively/like objects/like trophies. The whole “male mental health” epidemic which gets discussed only when women are talking about how they suffer under the patriarchy, which just seems to tell women that they need to pander to men’s needs and issues before they can talk about their own. Also the ease of finding sexualised content of young girls on TikTok – it comes up on my for you page and I don’t ever interact with this sort of content.’ |
‘Tate makes women seem inferior. Golden globes antagonising women AGAIN.“A man can react, however a woman can only overreact” she said it ALL.’ |
‘comedy stand up posted online (Matt Rife is a great example of this) as well a people disguising misogyny in jokes especially on tiktok which makes the younger generation believe it is just funny without seeing the context and severity of what is being said. Plus so many people are bystanders in this and take no action in correcting it therefore normalising it even more’ |
‘Videos aimed towards younger boys, especially surrounding football and women’s sport in general’ |
‘It used to be videos/podcasts 100% but now i think its so commonly done via tweeting, its honestly insane how far its gotten where people will so openly do it via tweets yet have no punishments.’ |
‘Kids see memes and parrot them like they’re their own ideas – jokes have the most power with them.’ |
‘Jokes that are not obviously misogynistic but use stereotypes, micro aggressions etc. TikToks, podcasts’ |
‘Shorts, like the TikTok style videos. Misogynistic videos have been propagated so much more rapidly with TikTok-like platforms gaining relevance. An important aspect is the comment sections of these videos, under nearly every reel which has a woman in it on instagram is some misogynistic comment which is played for laughs’ |
Table 10: Types of misogynistic content
Podcasts, Instagram Reels and TikToks were the forms of content that participants most frequently cited as containing misogynistic material. Short-form content, or clips from long-form content that can be shared and viewed quickly, appears to carry the most misogynistic content.
When asked if there were any specific people or content creators that participants were concerned about, 54 out of 74 responses named Andrew Tate as a person of concern. Matt Rife, Donald Trump, Kanye West and JustPearlyThings (Pearl Davis) were also mentioned frequently.
Content Creators Of Concern |
‘The obvious, Andrew and Tristan Tate but there are others. Jordan Peterson, whilst not the most harmful figure by himself is often the start of a rabbit hole leading to more extreme right wing misogynist views and Twitter accounts like “Women posting their Ls” and other such creators often publish content harmful to women with the “it’s just a joke!! Stop being so sensitive!!” Kind of tone.’ |
‘I think Andrew Tate has been one who’s blown up due to his controversial opinions.’ |
‘Andrew Tate is obviously a big one, but aside from this it typically comes from a certain category of straight, cis white men, or ‘incels’, though specifics individuals is hard as I actively avoid the content’ |
‘I am not the type of person who follows a specific content creator, even less so one that does this kind of content, as I simply do not like to watch that content. HOWEVER, Tate appeared for a while on 9GAG, and it was everywhere, I really dislike watching his content and it was indeed very sexist and misogynistic. I am concern with his content since the magic boom it made means that a lot of people started watching, but this stopped, I haven’t seen posts of him in a good time’ |
‘Logan Paul, Andrew Tate, Ben Shapiro’ |
‘I’ve heard real anecdotes about men who started listening to Andrew Tate as a joke with their mates only to end up with misogynistic beliefs in their heads because the way he expresses them sounds like a statement of fact rather than an opinion.’ |
‘Andrew Tate – it feels people perceive him as just a “meme” but he’s so much more dangerous than that.’
Table 11: Content creators of concern
Many participants appear to be concerned with creators who market their content as ‘jokes’, believing that this framing causes more harm than good. This survey was also conducted shortly after comedian Matt Rife went viral online following the release of his comedy special on Netflix. Within the first ten minutes of the special, Rife made a joke about a waitress who had a black eye, suggesting that she would not have had a black eye if she knew how to cook. This may have increased the number of participants naming Rife.
Finally, the participants were asked for any final comments.
Additional Comments or Insights |
‘I follow only close friends (around 100 people) so my answers are bias towards that. I’m selective on who I follow because indeed I don’t want to see any content that is harmful for me or others.’ |
‘“Why men hate women” by Laura Bates as a lot of information about misogynistic content on social media, looking at incels, pick up artists, men’s rights activists etc. Social media use is not regulated enough, particularly for young people who tend to be more impressionable. Online pornography is also a concern as often times it is not regulated on social media websites and perpetuates misogynistic views. Porn containing misogynistic actions and speech is posted rapidly and in high volume, often containing the abuse/acted abuse of women and girls which is becoming normalised.’ |
‘Content creators like these create a ‘safe space’ for abuse and misogyny – it’s deemed as acceptable.’ |
‘I think that the porn Industry being on the internet has promoted the sexualisation of women anytime they are on the internet’ |
‘It’s so easy for people without strong views to see a few jokes and think that they’re learning what the rest of the world thinks, and so what they should think too. Social media is so compartmentalized and bubbled, you get into the wrong bubble and the algorithms will only show you people who think the same way. It’s a huge problem for contemporary discourse on every level, but I think it’s doing irreparable harm to a generation who will grow up without reading newspapers or watching political TV shows.’ |
‘It’s dangerous and influencing younger individuals to the point it is normalized in societies’ |
‘I also think the negative impact misogynistic content has on young boys is essential to prioritise, as all children face the consequences of misogyny within society and from their environments.’ |
‘it’s too accessible for young people to see anything online, and it’s too easy to post anything you want without consequence’ |
‘These mysognistic videos have reached boys as young as 10, one of my mates is a primary school teacher, and spoke of the absolute hatred of women which these Andrew rate viewers seem to display. Scary stuff’ |
‘I think while there are actually misogynist online, there are some contents that are taken out of context and aren’t misogynistic but might be labelled as one. So I want people to clearly be able to differentiate before calling anyone or categorizing a content as misogynistic’ |
‘I’d imagine that misogynistic individuals are more likely to see this content, which will further strengthen their misogynistic ways and create the problems of violence as you’re suggesting. I suspect it’s an echo chamber type of effect, hence I don’t experience it personally’ |
‘The line between censorship and preventing online hate is very thin,we should be careful to do the right thing.’ |
Table 12: Additional Comments or Insights
Chapter 5: Discussion
5.1 Analysis of Results
The social media data and survey results have highlighted the pressing issue of online misogynistic content and the intersecting societal factors contributing to violence against women and girls. VAWG remains a pervasive and deeply ingrained issue in society, with widespread real-world consequences for all individuals and communities.
Prevalence of misogynistic behaviours experienced by participants:
94.2% had experienced misogynistic content on social media.
97.5% had seen sexist jokes or memes.
76.9% had witnessed harassment or threatening behaviour towards women and girls.
98.3% had seen content stereotyping women.
97.5% had encountered objectifying content about women.
60.3% had seen sexually violent content.
The majority of participants had experienced various forms of misogynistic content online. This overwhelming majority shows the pervasive nature of such content and suggests that misogyny is a common feature of online interactions for most users; it indicates a significant issue that needs addressing to create safer online spaces. Additionally, the near-universal presence of sexist humour online normalises derogatory attitudes towards women, potentially desensitising users to gender discrimination and reinforcing harmful stereotypes. The high rate of witnessed threatening behaviour reflects a serious issue: a significant portion of users observe direct aggression towards women, contributing to an environment of fear and intimidation that discourages women from participating fully in online spaces. The almost universal exposure to content that reinforces traditional gender roles and stereotypes highlights the extent to which social media perpetuates outdated and harmful views of women, limiting the perception of women’s roles in society and undermining gender equality. Furthermore, the prevalence of content that reduces women to mere objects for male pleasure is indicative of a broader cultural issue; objectification can contribute to the dehumanisation of women and be used to justify gender-based violence and discrimination. Participants’ exposure to sexually violent content, while slightly lower than for other forms of misogynistic content, is nonetheless alarmingly high. This type of content not only traumatises those who encounter it but also perpetuates a culture that trivialises or even condones sexual violence.
The qualitative responses from the survey provide a diverse range of opinions regarding online misogynistic content and the role of social media companies in addressing it. Most participants agreed that social media companies need to take more action to curb the spread of misogynistic content. Despite this consensus, there was uncertainty about the specific measures that could be implemented, with many participants expressing frustration and a sense of helplessness and acknowledging the normalisation of misogyny online. However, participants mostly agreed that platforms need to create safer spaces, especially for women and young users. Participants reported that exposure to misogynistic content negatively impacts their mental health and self-esteem. They expressed feelings of disgust and shame when encountering such content, and some responses highlighted the emotional toll and the sense of being pressured or demeaned by misogynistic posts. Additionally, teachers and individuals working with children expressed concern about the early age at which children are exposed to these harmful ideas. Participants identified various types of content that they believe influence younger audiences, particularly “alpha male” content, podcasts, and TikTok videos. Andrew Tate was frequently mentioned as a content creator of concern. Participants also noted that content which seems harmless to some, like jokes and memes, can perpetuate harmful stereotypes and normalise misogynistic attitudes.
The analysis of social media comments reveals alarming trends in misogynistic content, particularly around themes of rape, incest, paedophilia, sexual violence, physical violence, and generalised misogyny. The data points towards a culture of hostility and violence against women and girls online. It includes numerous instances where users employ euphemisms like “grape” or emojis to avoid content filters, and fantasies of sexual assault are likewise expressed in coded language to evade detection. The use of abbreviations like “SA” for sexual assault not only highlights the prevalence of misogynistic attitudes but also indicates a level of knowledge in evading moderation, pointing to the need for more sophisticated content moderation systems. These comments trivialise rape and assault, contribute to a culture that normalises sexual violence, and reflect deeply ingrained misogynistic beliefs. The popularity of these comments, evidenced by thousands of likes, suggests a disturbing acceptance and even encouragement of such attitudes within certain online communities. Comments related to incest often involved sexually explicit and violent language directed towards family members; these posts typically came from adult males and included disturbing sexual fantasies about daughters and sisters. Additionally, paedophilic content was among the most common, with comments often left on videos or photos of young children. The language used is explicit and disturbing, indicating a normalised acceptance of sexual violence against children, and the frequency and explicit nature of these comments highlight an urgent need for better protection and moderation to safeguard minors online. Although fewer in number, comments detailing violent fantasies about assaulting and dominating women reinforce the notion of male superiority and entitlement to commit violence against women.
The general misogynistic comments were often disguised as jokes or casual remarks but carried harmful stereotypes and degrading views about women. These comments contribute to a broader culture that diminishes the seriousness of violence against women and perpetuates harmful stereotypes.
The issue of Violence Against Women and Girls (VAWG) is deeply rooted in societal norms and attitudes that perpetuate male entitlement to women’s bodies and sexuality. From a young age, boys are socialised to view women as objects, leading to a sense of entitlement and control over women. This is heightened by the prevalence of misogynistic content on platforms like TikTok, where graphic and dehumanising rhetoric is normalised. Easy access to violent pornography further contributes to the normalisation of misogynistic attitudes and behaviours, especially among young boys who consume such content. Mainstream media and popular culture also play a role in reinforcing harmful stereotypes and toxic masculinity, which perpetuate VAWG. Additionally, online communities like incel forums provide validation for misogynistic beliefs as well as influencers and content creators on social media platforms that often promote toxic ideals of masculinity, influencing impressionable audiences, particularly young boys. This normalisation of misogynistic attitudes online has real-world consequences, as evidenced by survey data and reports of young children exhibiting misogynistic behaviour. Overall, misogynistic attitudes are becoming normalised and perpetuated online, posing significant risks to the safety and well-being of women and girls.
5.2 Recommendations for Addressing Online Misogyny
Addressing online misogyny requires a comprehensive approach that includes technological, educational, and policy-based measures. First, social media platforms must enhance their content moderation systems. This includes utilising advanced algorithms to detect and remove misogynistic content swiftly and investing in human moderators to handle nuanced cases. Platforms should also implement stricter policies and penalties for users who perpetuate misogyny, ensuring consistent enforcement to deter repeat offenders.
Education plays a crucial role in combating online misogyny. Schools and community programmes should incorporate digital literacy and online behaviour into their curricula, teaching young people about the impacts of online harassment and the importance of gender equality. Comprehensive sex and relationship education should also be implemented, ensuring that consent and respectful behaviour are instilled in children as a priority, with a focus on raising awareness of misogynistic behaviours and violence against women. Additionally, public awareness campaigns can highlight the harmful effects of online misogyny and encourage bystanders to speak out against it.
Legislative action is also necessary. Governments should enact and enforce laws that specifically address online harassment and hate speech, providing clear legal consequences for perpetrators. Governments should also include misogyny and misogynistic language in hate crime laws. Strengthened laws and regulations should be put into place to protect children from accessing and viewing online adult content and violent pornography. Collaboration between governments, tech companies, and organisations can lead to the development of comprehensive strategies to tackle online misogyny.
Finally, support systems for victims of online misogyny should be strengthened, offering accessible reporting mechanisms, psychological support, and legal assistance.
Chapter 6: Conclusion
The investigation into social media, specifically platforms like TikTok, and the prevalence of VAWG reveals a disturbing trend enabled by user-generated misogynistic content and behaviours. The qualitative and quantitative survey data provide compelling evidence that a significant majority of participants have encountered misogynistic content online. Additionally, many express deep concern over the influence of figures like Andrew Tate on young boys, who are at an important stage in developing their views on gender and relationships. These findings demonstrate the urgent need to address the pervasive nature of online misogyny and its real-world consequences: such content perpetuates harmful stereotypes and reinforces gender hierarchies, contributing to a culture where violence against women is trivialised or condoned.
The survey data highlights a critical perspective on this issue: the direct experiences of individuals who consume content on social media. Many respondents reported not only encountering misogynistic content but also feeling its impact on their perceptions and interactions. Additionally, the qualitative data shed light on the ways that online misogyny manifests.
The responsibility of social media platforms should not be overlooked. Their algorithms and content moderation policies significantly influence the type of content that users are exposed to. Platforms like TikTok must take proactive measures to identify and limit the spread of misogynistic content. This includes improving their content moderation process and ensuring that their policies are effectively enforced. Additionally, there is a need for greater accountability and transparency, regarding how platforms handle harmful content and the measures they are taking to protect users. Educational initiatives also play an important role in reducing the impact of online misogyny. Schools, parents, and community organisations must work together to educate young people about the dangers of internalising harmful gender stereotypes and the importance of respecting all individuals regardless of gender.
In conclusion, the evidence from this study shows the significant impact of user-generated misogynistic content on social media and its contribution to the perpetuation and normalisation of VAWG. Figures like Andrew Tate demonstrate the power of online influencers in shaping young minds and attitudes. Addressing this issue requires an intersectional approach, including stricter regulation and moderation of content on social media platforms, robust educational efforts, and a societal commitment to challenging and dismantling misogynistic rhetoric. Only through concerted and sustained efforts can we hope to create a safer and more equitable online environment for all users.
Bibliography
Baele, S., Brace, L. & Ging, D. (2024) A diachronic cross-platforms analysis of violent extremist language in the incel online ecosystem. Terrorism and Political Violence, 36 (3), 382-405.
Bates, L. (2020) Men who hate women.
Boucher, V. (2022) Down the TikTok rabbit hole: Testing the TikTok algorithm’s contribution to right wing extremist radicalization. Queen’s University (Canada).
Burch, A. D. S. & Mazzei, P. (2018) Death toll is at 17 and could rise in Florida school shooting, 14 February [Online]. Available at: https://www.nytimes.com/2018/02/14/us/parkland-school-shooting.html.
Cecco, L. (2019) Toronto van attack suspect says he was ‘radicalized’ online by ‘incels’, 27 September [Online]. Available at: https://www.theguardian.com/world/2019/sep/27/alek-minassian-toronto-van-attack-interview-incels.
Children’s Commissioner, (2023a), ‘A lot of it is actually just abuse’: Young people and pornography. Available online: https://assets.childrenscommissioner.gov.uk/wpuploads/2023/02/cc-a-lot-of-it-is-actually-just-abuse-young-people-and-pornography-updated.pdf.
Children’s Commissioner, (2023b), Evidence on pornography’s influence on harmful sexual behaviour among children. Available online: https://assets.childrenscommissioner.gov.uk/wpuploads/2023/05/Evidence-on-pornographys-influence-on-harmful-sexual-behaviour-among-children.pdf [Accessed: Apr 12, 2024].
Chrisler, J. C. & Ferguson, S. (2006) Violence against women as a public health issue. Annals of the New York Academy of Sciences, 1087 (1), 235-249.
Cole, S. (2018) We are truly fucked: Everyone is making AI-generated fake porn now.
Das, S. (2022) How TikTok bombards young men with misogynistic videos. Available online: https://www.theguardian.com/technology/2022/aug/06/revealed-how-tiktok-bombards-young-men-with-misogynistic-videos-andrew-tate.
Dewey, C. (2021) Inside the ‘manosphere’ that inspired Santa Barbara shooter Elliot Rodger, 26 October [Online]. Available at: https://www.washingtonpost.com/news/the-intersect/wp/2014/05/27/inside-the-manosphere-that-inspired-santa-barbara-shooter-elliot-rodger/.
Domestic abuse act 2021. (2021) King’s Printer of Acts of Parliament.
Donovan-Smith, O. (2019) He pledged to kill ‘as many girls as I see’ in mass shooting. After second chances, he’s going to prison.
Ellsberg, M., Arango, D. J., Morton, M., Gennari, F., Kiplesund, S., Contreras, M. & Watts, C. (2015) Prevention of violence against women and girls: What does the evidence say? The Lancet, 385 (9977), 1555-1566.
Faragó, T. (2019) Deep fakes–an emerging risk to individuals and societies alike.
Fonrouge, G. & Brown, R. (2018) Alleged school shooter was abusive to ex-girlfriend: Classmate.
Hackett, S., (2014), Children and young people with harmful sexual behaviours. Available online: https://tce.researchinpractice.org.uk/wp-content/uploads/2020/05/children_and_young_people_with_harmful_sexual_behaviours_research_review_2014.pdf.
Helmus, T. C., (2022), Artificial intelligence, deepfakes, and disinformation: A primer. RAND Corporation. Available online: http://www.jstor.org/stable/resrep42027.
Hill, K. (2014) The disturbing internet footprint of Santa Barbara shooter Elliot Rodger. Available online: https://www.forbes.com/sites/kashmirhill/2014/05/24/the-disturbing-internet-footprint-of-santa-barbara-shooter-elliot-rodger/.
Internet and social media users in the world 2024. (2024) Available online: https://www.statista.com/statistics/617136/digital-population-worldwide/.
Johnson, K., Hughes, T. & Madhani, A. (2015) Oregon community college shooter was bitter.
Johnson, R. B. & Onwuegbuzie, A. J. (2004) Mixed methods research: A research paradigm whose time has come. Educational Researcher, 33 (7), 14-26.
Kane, G. C., Alavi, M., Labianca, G. & Borgatti, S. P. (2014) What’s different about social media networks? A framework and research agenda. MIS Quarterly, 38 (1), 275-304.
Kashyap, R., Fatehkia, M., Tamime, R. A. & Weber, I. (2020) Monitoring global digital gender inequality using the online populations of Facebook and Google. Demographic Research, 43, 779-816.
Kassam, A. (2018) Woman behind ‘incel’ says angry men hijacked her word ‘as a weapon of war’, 26 April [Online]. Available at: https://www.theguardian.com/world/2018/apr/25/woman-who-invented-incel-movement-interview-toronto-attack.
Kelly, C. R. (2020a) Incel rebellion: Fascism and male autarky. In Apocalypse man. Ohio State University Press, 83-104.
Kelly, C. R. (2020b) The red pill: The new men’s rights rhetoric. In Apocalypse man. Ohio State University Press, 59-82.
Leeming, M. C. (2023) Radicalised masculinity: Ontological insecurity, extremist ideologies and the rise of Andrew Tate.
Lin, J. L. (2017) Antifeminism online; MGTOW (men going their own way). In Urte Undine Frömming, Steffen Köhn, Samantha Fox and Mike Terry (eds) Digital environments. Transcript Verlag, 77-96.
Lindsay, A. (2022) Swallowing the black pill: Involuntary celibates’ (incels) anti-feminism within digital society. International Journal for Crime, Justice and Social Democracy, 11 (1), 210-224.
Machette, A. T. & Montgomery-Vestecka, G. (2023) Applying sexual scripts theory to sexual communication discrepancies. Communication Reports, 36 (2), 123-135.
McKee, A., Byron, P., Litsou, K. & Ingham, R. (2020) An interdisciplinary definition of pornography: Results from a global Delphi panel. Archives of Sexual Behavior, 49 (3), 1085-1091.
Meskys, E., Kalpokiene, J., Jurcys, P. & Liaudanskas, A. (2020) Regulating deep fakes: Legal and ethical considerations. Journal of Intellectual Property Law & Practice, 15 (1), 24-31.
Neufeld, D. (2021) The 50 most visited websites in the world. Available online: https://www.visualcapitalist.com/the-50-most-visited-websites-in-the-world/.
ABC News (2009) Pa. gunman ‘hell-bent’ on killings, had 4 guns. Available online: https://abcnews.go.com/US/story?id=8255530&page=1.
Oxford english dictionary. Available online: https://www.oed.com/ .
Oxford learner’s dictionaries. Available online: https://www.oxfordlearnersdictionaries.com/ .
Pelzer, B., Kaati, L., Cohen, K. & Fernquist, J. (2021) Toxic language in online incel communities. SN Social Sciences, 1, 1-22.
Rhymes, J. (2023) Scrolling is not extended mind-wandering: How TikTok’s ‘For You’, Andrew Tate, and the attention economy are jeopardizing user autonomy online.
Saul, H. (2015) Teenager Ben Moynihan sentenced to 21 years for attempted murder of three women because he could not lose his virginity, 7 March [Online]. The Independent. Available at: https://www.independent.co.uk/news/uk/crime/teenager-ben-moynihan-sentenced-to-21-years-for-attempted-murder-of-three-women-because-he-could-not-lose-his-virginity-10091277.html.
Short, E., Brown, A., Pitchford, M. & Barnes, J. (2017) Revenge porn: Findings from the harassment and revenge porn (HARP) survey–preliminary results. Annual Review of CyberTherapy and Telemedicine, 15 161-166.
Simon, W. & Gagnon, J. (2003) Sexual scripts: Origins, influences and changes. Qualitative Sociology, 26.
Smith, H. and Mansted, K., (2020), What’s a deep fake? Australian Strategic Policy Institute. Available online: http://www.jstor.org/stable/resrep25129.6.
Sparks, B., Zidenberg, A. M. & Olver, M. E. (2022) Involuntary celibacy: A review of incel ideology and experiences with dating, rejection, and associated mental health and emotional sequelae. Current Psychiatry Reports, 24, 731-740.
Speckhard, A., Ellenberg, M., Morton, J. & Ash, A. (2021) Involuntary celibates’ experiences of and grievance over sexual exclusion and the potential threat of violence among those active in an online incel forum. Journal of Strategic Security, 14 (2), 89-121.
Taylor, J. and Shrive, J., (2021), ‘I thought it was just a part of life’: Understanding the scale of violence committed against women in the UK since birth. Available online: https://irp.cdn-website.com/4700d0ac/files/uploaded/I+thought+it+was+just+a+part+of+life+%C2%A9VictimFocus.pdf.
Taylor, J. (2018) The woman who founded the ‘incel’ movement, 30 August [Online]. Available at: https://www.bbc.co.uk/news/world-us-canada-45284455.
Taylor, J. F. (2010) Children and young people accused of child sexual abuse: A study within a community. Journal of Sexual Aggression, 9 (1), 57-70.
Tierney, A. (2018) Edmonton man uses ‘Involuntary celibacy’ as excuse in stomping death.
UN Women, (2021), Prevalence and reporting of sexual harassment in UK public spaces. Available online: https://www.unwomenuk.org/site/wp-content/uploads/2021/03/APPG-UN-Women-Sexual-Harassment-Report_Updated.pdf.
Venkatesh, V., Brown, S. A. & Bala, H. (2013) Bridging the qualitative-quantitative divide: Guidelines for conducting mixed methods research in information systems. MIS Quarterly, 37 (1), 21-54.
Vera-Gray, F., McGlynn, C., Kureshi, I. & Butterby, K. (2021) Sexual violence as a sexual script in mainstream online pornography. The British Journal of Criminology, 61 (5), 1243-1260.
Vincent, J. (2018) Watch Jordan Peele use AI to make Barack Obama deliver a PSA about fake news. Available online: https://www.theverge.com/tldr/2018/4/17/17247334/ai-fake-news-video-barack-obama-jordan-peele-buzzfeed.
Vizard, E., Hickey, N., French, L. & McCrory, E. (2007) Children and adolescents who present with sexually abusive behaviour: A UK descriptive study. Journal of Forensic Psychiatry & Psychology, 18, 59-73.
Wagner, T. L. & Blewer, A. (2019) “The word real is no longer real”: Deepfakes, gender, and the challenges of AI-altered video. Open Information Science, 3 (1), 32-46.
What is gender-based violence? European Commission. Available online: https://commission.europa.eu/strategy-and-policy/policies/justice-and-fundamental-rights/gender-equality/gender-based-violence/what-gender-based-violence_en.
Woolf, N. (2015) Chilling report details how Elliot Rodger executed murderous rampage, 20 February [Online]. Available at: https://www.theguardian.com/us-news/2015/feb/20/mass-shooter-elliot-rodger-isla-vista-killings-report.
World Health Organisation (2024) Violence against women. Available online: https://www.who.int/news-room/fact-sheets/detail/violence-against-women.
Ybarra, M. L., Mitchell, K. J., Hamburger, M., Diener-West, M. & Leaf, P. J. (2011) X-rated material and perpetration of sexually aggressive behavior among children and adolescents: Is there a link? Aggressive Behavior, 37 (1), 1-18.