Human Rights in China (HRIC) has partnered with Ranking Digital Rights and ARTICLE 19 to conduct a Community-Led Assessment of Rights Impacts in the Tech Industry (CLARITI) on X, formerly known as Twitter, Inc. The CLARITI methodology, available for use at claritihria.net, provides civil society with a structured tool to assess tech companies’ human rights practices in line with the UN Guiding Principles.
As one of the predominant online platforms for open and uncensored Chinese speech, X’s enforcement of its moderation policies has a significant impact on Chinese communities in diaspora and online. HRIC has found that consistent under- and over-enforcement of moderation policies, exacerbated by AI, as well as inconsistent identity verification policies, have had a disproportionately negative impact on Chinese human rights defenders, members of persecuted communities, and all those who seek to express themselves freely outside of China’s Great Firewall. In particular, account suspensions, phishing, harassment, and other restrictions resulting from X’s actions or inaction have prevented these individuals and groups from exercising their rights to free speech, access to information, and freedom of association.
Read more below, or download the full report here:
Full Report - Community-Led Human Rights Impact Assessment on X (Twitter)
Human Rights in China (HRIC) is a nongovernmental organization founded in March 1989 by overseas Chinese students and scientists. HRIC’s mission is to support and strengthen domestic civil society actors through the advancement of international human rights and the institutional protection of these rights in the People’s Republic of China, including the Hong Kong Special Administrative Region (HKSAR), and among overseas Chinese in diaspora.
For this project, HRIC worked closely with human rights defenders and dissidents working on China issues who have been and are using X (formerly known as Twitter) as a mode of communication while circumventing the Great Firewall of China. For the purposes of this assessment, this target group of X users will be referred to as “rightsholders.” Platforms like X allow rightsholders to access and share information on the Internet that may otherwise be censored on Chinese news sites and social media. X is also a key platform for facilitating free expression and communication among the Chinese-speaking community, including around sensitive topics such as human rights.
In undertaking the assessment, we addressed the following key concerns: 1) X’s content moderation policies and their enforcement, which is overly reliant on AI and has resulted in under- and over-moderation of content, in turn leading to arbitrary account suspensions; and 2) X’s inconsistent verification system, including the recently revamped Blue Checkmark, which allows impersonation and misinformation campaigns, such as coordinated spam, harassment, and bots, to flourish. These issues significantly hinder rightsholders’ ability to use X to express their opinions freely, to share information on crucial human rights issues happening within the mainland that may otherwise never reach a global audience, and to access important information that would be censored by the Great Firewall of China. These activities are protected by Article 19 of the Universal Declaration of Human Rights, which declares that “[e]veryone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”
The importance of X for Chinese human rights defenders cannot be overstated. There are no real alternatives to X for these users: other Chinese messaging platforms and social media apps like WeChat or Weibo are heavily censored, surveilled, or government-linked, while non-Chinese alternatives like Mastodon or Bluesky do not have significant usage or reach. X’s arbitrary and non-transparent decisions about content moderation and account suspension create far-reaching consequences for China-based users’ rights and safety, especially their right to freedom of expression. As a result, defender communities that are already marginalized in China are disproportionately affected.
For this HRIA, HRIC used the CLARITI (Community-Led Assessment of Rights Impacts in the Technology Industry) methodology to conduct an assessment of X (formerly Twitter). The methodology was developed by Ranking Digital Rights in 2023 with support from ARTICLE 19 under the Engaging Tech for Internet Freedom Initiative (ETIF). The assessment is intended to address salient issues experienced by rightsholders, such as arbitrary account suspensions, coordinated spam and attacks, and problems with the Blue Checkmark, i.e., the lack of transparency in verification and impersonation efforts via Blue Checkmark accounts. The assessment is intended as the start of an important process and conversation with X to identify and address key issues, and, ultimately, to enable rightsholders to continue to access X freely and without encumbrance in the long run, in turn securing their right to freedom of expression (including the right of access to information) and right to privacy.
This HRIA assesses X’s content moderation and content visibility. Our target country is mainland China: though X remains banned and has no physical office presence in the country, rightsholders in the mainland continue to use the platform through Virtual Private Networks (VPNs). The timeframe of the assessment is July to December 2024.
The CLARITI methodology applies the International Bill of Human Rights as its baseline to define human rights and the UN Guiding Principles on Business and Human Rights (UNGP) as a guideline to assess the content moderation systems and practices of the company. Other legal and non-legal requirements, such as the Global Network Initiative (GNI) Principles, the Santa Clara Principles, and the EU General Data Protection Regulation (EU GDPR), are also applied in the analysis of soft law guidance available regarding how tech companies can respect human rights, as well as the broader legal and regulatory context in which the company operates.
In addition to undertaking extensive secondary research (see Appendix A), HRIC directly consulted 20 rightsholders. These interviewees are prolific Chinese human rights defenders and dissidents, based inside and outside of mainland China (in diaspora), including grassroots groups, lawyers, journalists, students, and other civil society actors. We also consulted another stakeholder, an ex-Twitter employee with direct familiarity with X’s human rights processes, to gain better insight into the technical, legal, and human rights issues surrounding X’s operations and influence in mainland China.
The human rights impact assessment was conducted in line with UNGP Principles 12, 13, 14, 18, 19, 23, and 24, and with impact assessment best practices. The assessment highlighted the following impacts and causes, which must be addressed by X:
There is no public information on what steps X is taking to address these crucial issues. Further, X does not have a dedicated human rights unit to address their impact, and the evidence shows that X has deprioritized human rights across its policies, practices, and public commitments, such as by disbanding its entire Trust and Safety Council in December 2022.
We make the following recommendations to X to uphold its responsibility to respect human rights and mitigate the adverse human rights impacts identified above:
We look forward to collaborating with X to strengthen how it protects against, respects, and remedies adverse human rights impacts on its users in China and the diaspora.
Contact: [email protected] or [email protected].