Digital Battlegrounds
Myanmar Witness
25 Jan 2023
Report published: Gendered hate speech report on the politically-motivated abuse of Myanmar women online
Against the backdrop of increasing political repression, gaps in accountability on the part of social media platforms operating in Myanmar, and abuses highlighted by women’s rights groups, Myanmar Witness launched a mixed-methods study into politically-motivated online abuse targeting women in and from Myanmar and its impact on their lives. The research was carried out in partnership with Sisters to Sisters, a grassroots organisation supporting women in and from Myanmar.
The research looked at: i) the scale and nature of the abuse; ii) the perpetrators of abuse; iii) the online and offline impact of the abuse on women’s lives; and iv) social media platform accountability. The findings are based on:
A quantitative study of 1.6 million Telegram posts by 100 Myanmar-language Telegram channels.
A qualitative analysis of 220 posts, predominantly from Facebook and Telegram.
Five in-depth case study analyses investigating the relationship between online and offline abuse, and abuse targeting prominent Myanmar female politicians.
Five in-depth interviews with survivors of politically-motivated online abuse.
The time period assessed falls between February 2021 and December 2022.
Update: Myanmar Witness and the BBC made contact with social media platforms and shared the report findings and examples of abusive posts and channels with them. As of 25 January 2023, Telegram and Meta appear to have taken down the majority of abusive posts and channels identified during this investigation.
Warning: While this report has made every effort to minimise use and exposure to graphic imagery, it does contain content relating to racist, misogynistic and homophobic abuse that some readers may find distressing.
Summary of Key Findings
Since the military coup of 1 February 2021, women in and from Myanmar have used social media powerfully as a means of expressing their political views. In doing so, they have faced growing levels of online abuse and harassment.
Quantitative analysis of 1.6 million Telegram posts found that politically-motivated online abuse of Myanmar women was at least five times more prevalent at the end of 2022 compared with the weeks following the coup. The overall prevalence of abusive posts targeting women on Telegram was up to 500 times higher than international baselines for abuse prevalence on social media, where these exist.
Qualitative case study analysis and survivor testimony speak to the volume and severity of online abuse. However, without full access to platform data it is impossible to accurately assess the true scale or prevalence of abuse. This is particularly relevant for Myanmar’s most widely used social media platform, Facebook, where Meta’s data access policy prevented large-scale quantitative analysis.
Politically-motivated abuse occurs within a wider online environment of abuse and privacy violations targeting women and girls. Myanmar Witness’ quantitative study uncovered up to 8,338 abusive Telegram posts targeting women with hateful rhetoric and up to a further 15,000 doxxing posts, many of which appear to target women for their political beliefs and activities.
The overwhelming majority of abusive posts were authored by male-presenting profiles supportive of Myanmar’s military coup and targeted women who opposed the coup.
90% of abusive posts in the qualitative analysis were authored or shared by pro-SAC (State Administration Council) accounts. Just under 80% of abusive posts in the quantitative analysis were authored by pro-SAC channels.
83% of posts analysed in the qualitative investigation were directed at women who support the Myanmar National Unity Government (NUG) or People’s Defence Forces (PDF).
Male-presenting accounts were responsible for 70% of abusive posts in the qualitative study.
Doxxing is the main form of abuse and appears linked to offline violence and arrests targeting women who oppose the SAC. There is some evidence of coordination among online abusers, and between abusers and Myanmar security forces, to facilitate violence and arrests.
At least 50% of the abusive posts in the qualitative study were doxxing women. Doxxing was also prevalent within the quantitative dataset and survivor testimony. Case-study analysis found women were targeted for doxxing attacks at a considerably higher rate than men with the same political profile and visibility online. Many targeted women were not well-known, and appear singled out simply for positively commenting on pro-PDF or NUG posts.
28% of all doxxing posts analysed in the qualitative study include an explicit call for the targeted women to be punished offline. Almost all of these called on Myanmar military authorities to arrest the targeted woman and/or seize her property.
There was evidence of coordination of doxxing campaigns across pro-SAC Telegram channels through the frequent sharing and mutual amplification of doxxing posts. Some pro-SAC Telegram channels appear to be coordinating with the SAC itself: doxxing women who oppose it, proactively alerting the SAC, and celebrating news of the women’s arrests.
Language that sexualises women is used to shame and humiliate them in an attempt to silence them. Sexualised disinformation narratives are used to undermine politically active women, consistent with official SAC media narratives portraying pro-opposition women as morally corrupt and racially impure.
Sexualised disinformation narratives depict female PDF and NUG supporters as morally corrupt and promiscuous, and as sexual prey for PDF and ethnic armed organisation (EAO) leaders and foreigners. The online attacks are often in coded slang that is extremely vulgar and perpetuates attitudes that normalise and trivialise sexual abuse.
These narratives are perpetuated and endorsed by official SAC media. This plays into a moral panic around PDF groups implicitly tied to religious values and paternalistic views of purity, drawing on ultranationalist rhetoric.
Use of dehumanising sexualised language and imagery mirrors tactics known to have been used by the Myanmar military to dehumanise the Rohingya population.
There is evidence of abuse targeting women from minority ethnic or religious backgrounds, women perceived as too lenient towards Muslim minority groups, and women targeted because of their perceived sexuality.
Qualitative analysis showed that sexualised, anti-Muslim rhetoric aligned with ultranationalist narratives was used in an attempt to discredit prominent pro-democracy women. This included claims, made in explicit language, that the women were having sexual relations with Muslim men.
The level of anti-Muslim hate directed towards politically active women in pro-SAC groups on Telegram was approximately 25 times higher than within pro-democracy groups. However, within the quantitative and qualitative analysis, the overall number of posts containing anti-Muslim hate directed at women, or appearing to target women from minority ethnic or religious backgrounds, was low. This likely underrepresents the scale of anti-Muslim rhetoric in the social media ecosystem and may stem from a research design intended to limit the number of false positives in the quantitative findings.
6% of posts within the qualitative study contained anti-LGBTQIA+ rhetoric. This was the only category of abuse analysed which was more often carried out by pro-democracy accounts. The majority of anti-LGBTQIA+ hate identified targeted SAC politician Thet Thet Khine.
Online abuse and doxxing attacks are having a silencing effect and causing women to retreat from public life. Survivors report attacks on their views, person and dignity, as well as threats of rape, death and violence, with severe emotional and psychological impacts.
In-depth interviews with survivors found that online abuse, doxxing attacks in particular, is leading women to retreat from public life and censor themselves in public discussions both on and offline. Survivors report living in fear; facing difficulties with friends and family; and experiencing feelings of shame, depression and distress.
There is some further evidence of this in the qualitative study, with a number of female survivors of doxxing removing their accounts or making changes that reduce their public presence online. However, it is unclear how many of the women who were doxxed knew this had occurred before taking measures to protect themselves.
The findings of this report are likely the tip of the iceberg in terms of the scale and severity of the abuse affecting women in Myanmar.
The findings in this report are based on publicly available posts on platforms of interest. Interviews with survivors of abuse highlighted significant abuses occurring in other forums.
Abuse in these other forums included: i) the posting of images and contact details to adult sites; ii) abusive messages and threats sent in private groups, direct messages and via messaging apps such as WhatsApp; and iii) the existence of Telegram channels dedicated to sexually explicit material (excluded from this study due to safeguarding concerns).
Survivors feel strongly that social media platform moderation practices have been inadequate in stopping the abuse. The majority of abusive posts analysed were in clear violation of platform terms and conditions and driven by a relatively small number of highly active accounts and channels.
In the quantitative study, just four Telegram channels were responsible for 50% of abusive posts detected. Across the qualitative and quantitative studies, 13 Telegram channels with a combined following of more than 150,000 were responsible for a majority of the abuse.
A majority of prolific doxxing channels remain active on Telegram despite acting in clear breach of platform policy. Takedowns of a small number of channels are having limited positive effect. This is due to the prevalence of backup channels and a lack of swift action to tackle new channels created by known, malicious actors.
Telegram’s policy fails to capture doxxing, misogynistic and racist posts that implicitly call for violence or retaliation against an individual, as well as abuse that may happen in closed groups.
185 of the 220 abusive posts identified through the qualitative study remained live on social media platforms for at least six weeks. 100 (79%) of 126 posts on Telegram violated Telegram’s Terms of Service. 71 (90%) of 79 Facebook posts violated Facebook’s Community Standards.
In some cases, abusive posts may be evading detection through coded language and the use of GIFs/moving images, memes or image filters. 42% of qualitatively analysed abusive posts on Facebook had features that make moderation difficult without specialist monitoring.
Summary of Recommendations
Social Media Platform Accountability
Commit to a collaborative, specialist and survivor-led approach to tackling online abuse. This will involve dedicating more resources to monitoring Myanmar-language content, in consultation with Myanmar women’s rights organisations and survivors with insight into the evolving nature and impact of the abuse. It would also require platforms to make data accessible to affected communities so that they can work with platforms to track abuse and the effectiveness of countermeasures.
Review existing policies and platform features to encompass types of abuse not currently captured in their terms of use which may aid in the doxxing of women.
Improve response times to threats reported on the platform, reducing the time it takes to remove abusive accounts following reports of threatening activity.
Survivor Support
Dedicate resources to women and women’s groups providing support to online abuse survivors and campaigning to increase awareness about the problem of gender-based violence on and offline.
Support efforts that help destigmatise the topic of abuse through partnerships with Myanmar media, NGOs, and support groups.
Avenues for Future Research
Proactively monitor public Telegram and other channels and groups engaging in doxxing to better understand the relationship between online and offline abuse and the impact of doxxing.
Appoint specialist teams to proactively document sexual abuse online, including potential cases of child sexual exploitation and human trafficking which co-exist in spaces dedicated to sexual abuse online.
Expand data collection of abusive posts to enable robust comparisons between different groups of women.
Work with research practitioners, policymakers, NGOs and first responders to further develop a framework for ethical online research.
To read the full report, download the PDF with the button below. If you are experiencing difficulty downloading the full report, please email social@myanmarwitness.org to request a copy by email.