Deleting X: Why SIGDOC Left the Platform


How we decided to delete our X account.

By Morgan C. Banville

Posted May 30, 2025

[Image: exit icon next to a smartphone displaying the X logo]

As the Social Media and Communications Manager for the Special Interest Group on Design of Communication of the Association for Computing Machinery (ACM SIGDOC), I felt it necessary to re-evaluate which social media/communication platforms are representative of our membership. I have been in the role since December 2023, and, like any “good” designer might do, I approached making a decision with our members, rather than for them.

ACM SIGDOC brings together professionals and researchers who shape how people interact with digital systems and information. We’re a community of user experience (UX) designers, technical communicators, content strategists, and educators—all focused on making communication clearer and more human-centered. From research to practical application, SIGDOC supports our members with an annual conference, a peer-reviewed newsletter (Communication Design Quarterly), yearly cutting-edge proceedings, and online workshops (launching Fall 2025).

So . . . how did we go about deciding to delete our X account?

To gather member feedback, I distributed a survey through our listserv in March 2025. Members wrote some of the following feedback in response to a question asking, “Should SIGDOC leave any of our current social media platforms? If so, please share which one(s) and why.”

  • “X and Facebook — privacy/AI training issues.”
  • “X as I no longer find it worthwhile to scroll through page after page of figurative trash to find useful information.”
  • “I don’t use these platforms, so easy for me to say but I would support SIGDOC in rejecting X and Facebook on the basis that they embrace communication design that is antithetical to our values.”
  • “X—I don’t think the current editorial policies share the same goals and values of SIGDOC.”

Please note that these are selected responses, not an exhaustive list of everything members wrote. The themes are largely the same, though: X does not embrace communication design, privacy, and trust. For example, the November 15, 2024, update to the X terms of service had a severe impact on user privacy. To summarize, all users of the platform must now agree to let their posts be used to train artificial intelligence models, including X’s own service, Grok. Prior to the update, it was possible to opt out of this agreement. Highlighted below is the language used in the terms of service:

…For clarity, these rights include, for example, curating, transforming, and translating. This license authorizes us to make your Content available to the rest of the world and to let others do the same. You agree that this license includes the right for us to (i) analyze text and other information you provide and to otherwise provide, promote, and improve the Services, including, for example, for use with and training of our machine learning and artificial intelligence models, whether generative or another type; and (ii) to make Content submitted to or through the Services available to other companies, organizations…

The revision to the terms of service not only limits users’ ability to choose (opt out), but it also reinforces a broader trend of social media platforms using user data for profit without informed consent.

The X platform has become an echo chamber for disinformation, hate speech, conspiracy theories, and amplified division (see reporting from Reuters, UC Berkeley, and others). Beyond the toxic rhetoric, many other organizations have joined the “great” exodus: Forbes wrote that Best Buy, Target, and NPR have quit X. Meanwhile, brands such as Disney, Apple, and IBM paused their ads on X after the ads were found running alongside extremist content, according to CNN. Such extremist content rose in volume after the results of the 2024 U.S. Presidential Election; it also fueled growing concern about the “block” feature being “removed” (for lack of a better word) from the platform. With the changed feature, blocked users can still view the posts and profile page of the user who blocked them, although they can’t follow, direct message, interact with, or reply to that user. This change ultimately contradicts what the “block” feature is intended to do (and whom it is intended to protect).

So, why did SIGDOC leave X? My short answer is in the form of a question (classic rhetorician): Why as an organization would we make the choice to remain on a social media platform that is the very antithesis of what we embody and hope to promote?

If you or your organization would like to remove your account, I recommend following the steps outlined by James and Quinlan (2025). Furthermore, if, like SIGDOC, your organization cares about privacy, ethics, communication design, and people, then I also recommend beginning the process of listening to your members as well as any additional stakeholders (through soliciting feedback) to explore which communication approaches align with your goals.

Morgan C. Banville

Morgan C. Banville is an Assistant Professor of Humanities at Massachusetts Maritime Academy. Her research areas are defined by the intersection of technical communication and surveillance studies, often informed by feminist methodologies.


© 2025 Copyright held by the owner/author(s).
