Harmonizing the Cacophony: An Affordance-aware Framework of Audio-Based Social Platform Moderation

Clubhouse is an audio-based social platform that launched in April 2020 and rose to popularity amidst the global COVID-19 pandemic. Unlike other platforms such as Discord, Clubhouse is entirely audio-based and is not organized around specific communities. Following Clubhouse's surge in popularity, there has been a rise in the development of other audio-based platforms, as well as the addition of audio-calling features to existing platforms. In this paper, we present a framework (MIC) for analyzing audio-based social platforms that accounts for unique platform affordances, the challenges they pose to both users and moderators, and how these affordances relate to one another via MIC diagrams. Next, we demonstrate how to apply the framework to preexisting audio-based platforms and Clubhouse, highlighting key similarities and differences in affordances across these platforms. Using MIC as a lens to examine observational data from Clubhouse members, we uncover user perceptions and challenges in moderating audio on the platform.




1. Introduction

In March of 2020, the global COVID-19 pandemic forced people to self-isolate, work from home, and limit in-person interactions altogether. Amidst the isolation of the ongoing global pandemic, a new audio-only social media application called Clubhouse surged into the mainstream (solans2020rise). Clubhouse was launched in March 2020 and described as a “drop in chatting” platform, where users could participate in group audio conversations. Despite being invite-only and iOS-only (up until May 2021), Clubhouse has garnered over 10 million users since its launch (erprose_2021). Platforms for audio conversations existed prior to Clubhouse’s creation: many video games have supported in-game voice chat since the early 2000s (loguidice2014vintage), and applications such as WhatsApp (https://www.whatsapp.com/) and Discord (https://discord.com/) have supported audio communication alongside text since their inception. Clubhouse, on the other hand, is an audio-only platform and provides no way besides voice for its users to communicate inside the platform.

Figure 1. A timeline of popular audio-based technologies and social platforms. Clubhouse appears to mark the beginning of an audio-based “boom” in platform development.

1.1. The Rise of Audio-based Social Platforms

Clubhouse’s popularity was accompanied by a rise in other audio-focused platforms, as well as extensions to existing platforms (radcliffe2021audio). Twitter began beta-testing a group audio chat feature called Twitter Spaces in November 2020 and officially made the feature available to all users in May 2021 (soni_2021). Facebook announced the development of a similar feature in the spring of 2021, with plans to launch sometime that summer (pardes). Spotify acquired the parent company of Locker Room, an audio-only sports-centered app, in March 2021 (spotify_2021; steele_2021), and re-branded and re-launched it as Spotify Greenroom two months later (carman_2021). Less in the mainstream, another voice-chatting app called Sonar launched in January 2021 (sonar_2021). Other popular platforms such as Reddit (peters_2021), Telegram (wilson_2021), Slack (sarwar_2021), and Discord (tech2_2021) have announced Clubhouse-esque features and affordances to allow for live public audio rooms.

Though these newer platforms all utilize audio as their primary medium for content and communication, they offer distinct affordances. Clubhouse is audio-only, while Spotify Greenroom incorporates a live text-based chat box in each audio room. Twitter Spaces is built into Twitter, a text-based social media platform where tweets are not always organized into topic-specific chats. Voice rooms on Clubhouse, Spotify Greenroom, and Twitter Spaces are used only for communication, whereas Sonar allows users to build 2D worlds in voice rooms while communicating with others. Discord separates communities using “servers,” but Clubhouse, Spotify Greenroom, Twitter Spaces, and Sonar all allow users to discover voice rooms from communities that they are not necessarily members of.

While these audio-based platforms may appear novel, their affordances are often reminiscent of older or more established technologies. The drop-in audio aspect of Clubhouse, Spotify Greenroom, and Twitter Spaces is reminiscent of the party lines of the late 19th century (channel). Spotify Greenroom’s live chat is reminiscent of the live chat on Twitch (https://www.twitch.tv/p/en/about/). Sonar’s world-building aspect resembles classic virtual world-building games such as Minecraft (https://www.minecraft.net/en-us). Both Spotify (https://www.spotify.com/) and SoundCloud (https://soundcloud.com/) have hosted audio content like music and podcasts since the late 2000s.

1.2. Challenges in Moderating New Technologies and Non-textual Content

Understanding the challenges of online moderation and developing tools to facilitate it has been the subject of a large body of research (kiesler2012regulating; jhaver18; myers2018censored; gilbert2020run; seering17; seering2019moderator; roberts2016commercial; lampe04; jiang19; kiene16; kiene19; schlesinger17; chandrasekharan2017you). The majority of moderation-related research focuses on moderating textual content. Since the posts and comments supported by most platforms are text-based, creating automated NLP-based tools to help human moderators flag abusive posts has become the norm (e.g., (chandrasekharan2019crossmod)). Furthermore, textual content allows for the development of datasets documenting abusive and antisocial behavior that help the research community better understand and design for challenges in moderating text (aprosio2020creating; chancellor16). More recent research focuses on moderating platforms that utilize audio (jiang19; kiene19). However, the best practices and challenges of moderating online audio require further exploration.

Grimmelmann’s taxonomy, an important piece of moderation literature, lists four basic techniques that moderators can use to govern their communities—exclusion, pricing, organizing, and norm-setting (grimm). Grimmelmann’s work defines the purpose of moderation and provides general terminology for studying it. But the taxonomy does not account for the nature of the platforms a community utilizes (i.e., its affordances) or even the medium of communication—all of which are key factors affecting how online communities are moderated (jiang19; blackwell2019harassment; sabat2019hate; lamerichs2020user). Furthermore, when Jiang et al. (jiang19) studied the challenges that moderators face when tasked with moderating voice on Discord, they concluded that the nature of audio prevents “moderators from using the tools that are commonplace in text-based communities, and fundamentally changes current assumptions and understandings about moderation.”

Based on these observations and findings, it is clear that understanding moderation on a newer and more complex (i.e., not text-based) platform, such as Clubhouse, must begin with understanding the platform and its affordances.

1.3. Our Contributions

The contributions of this paper are three-fold. First, we present a novel framework that accounts for key platform-level affordances to represent audio-based social platforms (ABSPs). We define an ABSP as follows: a social networking site (SNS) is an ABSP if audio-based content (e.g., voice-based communication, audio-based posts, etc.) is among the primary types of content hosted on the SNS. (SNSs are defined as “web-based services that allow individuals to (1) construct a public or semi-public profile within a bounded system, (2) articulate a list of other users with whom they share a connection, and (3) view and traverse their list of connections and those made by others within the system” (boyd2007social).)

Next, we use this framework as a lens to dynamically examine relationships between affordances, and how each affordance and its relationships with others play a role in moderation on the ABSP. Finally, we use observational data from community members to uncover unique challenges related to moderating audio on Clubhouse, and reflect on implications for governing ABSPs.

For CMC and CSCW theory, our framework provides a new analytic lens for identifying and understanding the various affordances of ABSPs and the relationships between them. Using this framework, we can explore how these affordances and relationships shape ABSPs at large, how they contribute to moderation challenges, and how they can be used to develop mechanisms for moderation. Our framework is useful not only for moderation researchers but also for stakeholders of ABSPs, like platform designers and moderation practitioners; the framework allows them to represent a platform and use this representation to aid the development of moderation tools and strategies. It also allows these stakeholders to adapt successful tools and strategies from other ABSPs that have similar affordance and relationship ecosystems.

1.3.1. Affordance-Aware Framework for Representing ABSPs

Our framework, MIC, is made up of platform affordances that fall into three categories: Members, Infrastructure, and Content. These categories are derived from Grimmelmann’s (grimm) definition of an online community. (Grimmelmann defines online community broadly, and in doing so states that a platform in its entirety can be considered an online community; this is the view we take of platforms.) Since the affordances of a platform seldom function independently of one another, we also formalize a notion of inter-affordance relationships that allows us to examine how affordances affect one another, and how these relationships can affect moderation.

Together, these components of MIC create a graph-like representation of a platform (its MIC diagram) that shows its affordances and relationships. We describe MIC and its components in Sections 3.1 and 3.2.

1.3.2. Methodology for Investigating ABSPs

The MIC diagram then functions as a dynamic tool for studying moderation on ABSPs. More specifically, we can re-frame moderation research for an ABSP as annotating and updating its MIC diagram with new research findings, observations, or platform updates. Annotations and updates to the MIC diagram could involve changing the classifications of affordances, changing or adding relationships, or highlighting affordances to indicate their importance. We can also use MIC diagrams to easily compare ABSPs, which could help us formulate questions to guide research studies.

Our proposed methodology for using MIC diagrams is shown in Figure 2. A description of how to update MIC diagrams can be found in Section 3.3. We will explicitly demonstrate how to use this methodology in the next contribution.
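For illustration, these update operations can be sketched as simple mutations of a diagram data structure. The encoding below is our own hypothetical sketch — the paper prescribes no implementation — with affordance abbreviations following those introduced later in Section 3.1, and the recorded finding paraphrasing Jiang et al. (jiang19).

```python
# Hypothetical helpers illustrating the three update operations described
# above: reclassify an affordance, add a relationship, and highlight an
# affordance as salient to a new finding. All names are illustrative.

diagram = {
    "platform": "Discord",
    "affordances": {"Ep": {"value": "ephemeral", "highlighted": False}},
    "relationships": [],  # (source, target, note) triples
}

def reclassify(d, abbrev, new_value):
    """Change an affordance's classification (e.g., after a platform update)."""
    d["affordances"][abbrev]["value"] = new_value

def relate(d, src, dst, note):
    """Record an inter-affordance relationship as an annotated edge."""
    d["relationships"].append((src, dst, note))

def highlight(d, abbrev, finding):
    """Mark an affordance as important and attach the motivating finding."""
    d["affordances"][abbrev].update(highlighted=True, finding=finding)

# e.g., annotate the diagram with Jiang et al.'s finding about ephemerality,
# and the relationship between synchronicity (Sy) and ephemerality (Ep):
highlight(diagram, "Ep", "ephemeral audio hinders evidence collection")
relate(diagram, "Sy", "Ep", "synchronous audio is typically ephemeral")
```

Re-framing moderation research as a sequence of such annotations is what makes the diagram "dynamic": each study or platform change becomes a diff against the current representation.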

Figure 2. A high-level outline of our proposed methodology for investigating moderation on ABSPs using MIC. We will use this methodology for investigating moderation on Clubhouse starting in Section 4.

1.3.3. Investigating Moderation on Clubhouse

We use MIC and our proposed methodology to understand moderation on the Clubhouse app. In Section 4.1 we begin by creating an MIC diagram for Clubhouse using observations. In Section 4.2 we report participatory data, and in Section 4.3 we compare Clubhouse’s MIC diagram to those of Discord and Spotify. In Section 5 we use these data and comparisons to formulate the following two research questions:


  • RQ1: How do users perceive moderation on Clubhouse, and what insights does this give us about the platform’s MIC affordances and relationships?

  • RQ2: What types of antisocial behavior occur on Clubhouse, and what insights does this give us about the platform’s MIC affordances and relationships?

We also describe the qualitative analysis approach we use to answer these questions in Section 5. In Sections 6 and 7 we execute this analysis to answer the research questions and update Clubhouse’s MIC diagram accordingly.

2. Background

Before detailing our framework for representing ABSPs, we introduce the platform affordances that we account for in MIC and review related work that motivated each of these affordances. First, we describe the high-level structure of these affordances, which was inspired by Grimmelmann’s work (grimm).

Grimmelmann defines an online community using three features: the community’s members, the content that is shared among the members, and the infrastructure used to share it (grimm). We use these features to motivate the three main categories for affordances that we include in our MIC framework. Now we discuss how each of these categories impacts the four basic techniques for moderation listed by Grimmelmann. Exclusion is the act of excluding problematic or unwanted members from the community. Another closely related technique is pricing, which controls the participation of community members by introducing barriers to entry. Both exclusion and pricing are mandated by the infrastructure and members of the community: infrastructure provides the tools for exclusion or pricing, while members are involved in using these tools. Organizing is a technique that involves “shaping the flow of content from authors.” This technique is closely tied to the nature of content within the community. It is also tied to infrastructure and the type of “shaping” capabilities that are provided to the members of the community. Finally, the fourth technique listed by Grimmelmann is norm-setting, which involves the creation and articulation of community norms to establish the types of behavior that are acceptable within the community. Norm-setting can be done through the other techniques, and is therefore impacted by all three categories of community features and affordances.

Next, we discuss each category of affordances included in our framework and review related work examining these affordances, with a particular emphasis on research related to moderation and ABSPs.

2.1. Member-related Affordances of ABSPs

Through interviews with volunteer moderators of Discord servers, Jiang et al. (jiang19) found that server owners create custom user roles. The moderator role is a common facet of online communities and is often assumed by volunteers on platforms that rely on distributed moderation (seering2019moderator; wohn2019volunteer; jiang19; gilbert2020run). Each role can be assigned specific permissions, thereby limiting the access of certain users to certain channels on the server. User roles and access are thus closely related, and constitute two of the three member-related components in our framework.

The third member-related component in our framework is anonymity. Anonymity has been explored in a variety of contexts. Schlesinger et al. studied how anonymity affects content on Yik Yak, a social media application that allowed users to make anonymous text posts grouped by location (schlesinger17). In general, anonymity has been found to have both positive and negative effects on social interactions (christopherson07). Outside the context of online social spaces, anonymity was found to remove status markers that prevent members from participating in discussions on collaborative systems (Hayne97; mcleod1997comprehensive; weisband1993overcoming). Prior work examining the role of anonymous voice-based interactions in online games found that in some cases anonymity was lost due to the nature of voice-based communication, causing some players to feel uncomfortable (wadley2015voice). In fact, this loss of anonymity was deemed one of the main reasons gamers abandoned the game being studied.

2.2. Infrastructure-related Affordances of ABSPs

The main infrastructural component in our framework considers the various modalities of a platform, i.e., whether audio is the only major type of content or there are other types. Modalities often impact how a platform is structured, i.e., how users and subcommunities of the platform are situated. On Twitch, a live video streaming platform, text chats are associated with specific live streams, and live streams are separated into different Twitch channels; different channels have different moderators. Seering et al. studied how moderators encourage pro-social and discourage antisocial behavior in Twitch text chats (seering17). Discord also supports more than one modality—text and audio channels. In certain cases, the lack of certain modalities and organizational structures within a platform might force community members to use other platforms to overcome these deficiencies. This type of inter-platform dependence can be seen in Kiene et al.’s (kiene19) work studying how moderators of Reddit communities use both Reddit and Discord to host their communities, and the resulting challenges moderators must tackle in doing so.

Other integral parts of the infrastructure of ABSPs include the rules and guidelines of platforms and the communities they host. Prior work has examined the rules that moderators of both Reddit and Discord outline for their communities, as well as guidelines specified by the platforms themselves (kiene19; jiang19). Rules and guidelines, both community-defined and platform-specified, often describe the different roles members can play within the community (e.g., both Discord and Reddit have pages dedicated to defining what the role of a moderator entails). Rules and guidelines have also been shown to shape community norms (kiene16; cialdini1998social; triandis1994culture). Platforms also have different signalling or marking mechanisms, such as emoji reactions or up- and down-votes on content. In the context of ABSPs, markers can provide relevant cues to indicate whether a user wishes to speak (a challenge that is often characteristic of video- or voice-based communication (isaacs1994video; olson1995mix)).

Our final infrastructural component represents moderation purview, i.e., how much of the audio-based content generated within a community can be moderated—by either human or automated moderators. Jiang et al. found that one of the biggest struggles in moderating Discord voice channels is collecting evidence to prove that members were indeed engaging in antisocial behavior. The main reason behind this challenge was limited moderator time and availability—moderators were unable to be present in voice channels at all times of use (jiang19). Even when content is of other modalities and is non-ephemeral (e.g., text posts and comments on Reddit), human moderators may not be able to review every piece of content due to similar restrictions (chandrasekharan2019crossmod). Prior work has explored how volunteer moderators employ a variety of mechanisms for moderating content, and moderation typically involves a large amount of time and effort to keep up with the massive amounts of content generated within social platforms (matias2019civic; kiene19). As a result, automated and human-machine collaboration tools are being developed to assist moderators on text-based platforms like Reddit (jhaver2019human; chandrasekharan2019crossmod). Video-hosting platforms like YouTube use algorithmic moderation, which allows them to have a larger moderation purview without burdening human moderators (gorwa2020algorithmic; roberts2016commercial).

2.3. Content-related Affordances of ABSPs

Much audio-based communication occurs in real time. This has always been the case with voice communication over telephone, and it is a common theme of the audio-based communication that occurs in group voice chats for gaming (ackerman97; tang09; wadley2015voice). Ackerman et al. (ackerman97) studied how users viewed and used Thunderwire, a collaborative audio-only real-time communication system modeled after the telephone “party lines” of the late 19th century. Wadley et al. (wadley2015voice) studied real-time audio communication in online multiplayer games and virtual worlds during game play. Research has also examined voice-based communities in India that use asynchronous audio for communication (patel10; vashistha15). From these works, it is clear that the synchronicity of audio content is a defining characteristic of various ABSPs and affects moderation capabilities.

A feature of audio closely tied to synchronicity is ephemerality; ephemeral audio is often a consequence of synchronous or real-time audio. The communities studied by both (ackerman97) and (wadley2015voice) used ephemeral content. Prior work on ephemerality in social platforms has largely focused on the ephemerality of text posts, links, or images (schlesinger17; bernstein20114chan; Xu16). Jiang et al. studied the challenges of moderating voice on Discord and found that the ephemerality of audio-based content was a large factor contributing to the challenges moderators face (jiang19).

3. MIC: A framework for representing ABSPs

In this section, we formalize MIC, which comprises two types of components: affordances and the relationships between them. Affordances are properties of ABSPs that can be used to represent platforms and that play a role in moderation. We have identified three categories of affordances, related to members, content, and infrastructure. Together, these components can be used to create MIC diagrams (see Figures 3 and 4) that represent our current understanding of certain ABSPs. These MIC diagrams can be updated to reflect new findings or information about affordances and relationships on these ABSPs, as well as to investigate sources of challenges or potential strategies for moderation.
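Concretely, a MIC diagram can be thought of as a small typed graph: nodes are affordances tagged with one of the three categories, and edges are inter-affordance relationships. The encoding below is our own illustrative sketch (the paper defines the diagram visually, not programmatically); the sample classifications for Spotify follow the descriptions in Section 3.1.

```python
from dataclasses import dataclass, field

@dataclass
class Affordance:
    abbrev: str    # short label used in the diagrams, e.g. "Sy", "Ep"
    name: str
    category: str  # "Members" | "Infrastructure" | "Content"
    value: str     # platform-specific classification

@dataclass
class MICDiagram:
    platform: str
    affordances: dict = field(default_factory=dict)
    relationships: list = field(default_factory=list)  # (src, dst, note)

    def add(self, a: Affordance):
        self.affordances[a.abbrev] = a

    def relate(self, src: str, dst: str, note: str):
        self.relationships.append((src, dst, note))

# Partial sketch of Spotify's diagram, per the classifications in Section 3.1.
spotify = MICDiagram("Spotify")
spotify.add(Affordance("Sy", "Synchronicity", "Content", "asynchronous"))
spotify.add(Affordance("Ep", "Ephemerality", "Content", "non-ephemeral"))
spotify.add(Affordance("Ur", "User Roles", "Members", "listeners/creators"))
spotify.add(Affordance("Ac", "Access", "Members", "restricted publishing"))
spotify.relate("Ur", "Ac", "only creators may publish audio")
```

The graph form makes the two framework operations explicit: representing a platform is populating nodes and edges, and studying it is reading off (or annotating) those nodes and edges.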

We will use Discord, Spotify, and occasionally SoundCloud as working examples of ABSPs to help us describe affordances and relationships. The affordance classifications and relationships in these examples were formed using participatory observations provided by the first author, as well as some prior work. (The first author is an avid and experienced user of all three platforms; they also have both types of creator accounts for Spotify, and are therefore able to experience all types of features on each platform.) We construct MIC diagrams representing Spotify (Figure 3) and Discord (Figure 4) through our framework. Using insights from prior research on Discord moderation (jiang19), we annotate the MIC diagram for Discord. We provide high-level descriptions of these working-example ABSPs below.


  • Discord. A messaging platform that allows users to communicate via text, voice, or video. The audio on Discord is the voice-based communication that occurs in voice channels. Discord’s infrastructure is composed of “servers,” which can be thought of as landing pages for the individual communities that use the platform. Servers can contain topic-specific text channels or voice/video channels. Server owners can create custom roles for server members, and can associate specific permissions with each role.

  • Spotify. An audio-streaming service that hosts both music and podcasts. Spotify’s audio consists of music, podcasts, and other types of voice messages that only specific types of users can upload. The two main types of Spotify users are listeners (those who use the service to stream content) and creators (those who use the service to upload content). Listeners are able to follow both creators and other listeners, and can view the latter’s playlists and listening history. Creators must use other Spotify services to upload content, such as Spotify for Artists (https://artists.spotify.com/) for musicians and Anchor (https://anchor.fm) for podcasters.

  • SoundCloud. A music-sharing website that allows all users to post audio (music, podcasts, random noises, etc.). Users are also able to comment on audio files and re-post others’ audio posts onto their own feeds.

Figure 3. MIC diagram for Spotify constructed using observations made by the first author. A key for understanding the arrows and abbreviations can be found in Table 1.
Figure 4. MIC diagram for Discord constructed using observations from the first author and Jiang et al. (jiang19). Annotations represent additional findings from (jiang19).

3.1. MIC Affordances of ABSPs

For each affordance, we describe what the affordance is and its variations. We also discuss how some of these affordances could play a role in moderation on platforms.

Synchronicity (Sy)

Synchronicity refers to whether or not content is created in real time. Audio on Discord is synchronous, while audio on Spotify is asynchronous. Synchronous audio content can often preclude moderation, since not all moderators or moderation mechanisms can be present at the time audio is produced and shared. Asynchronous audio content provides a larger window of opportunity for moderation mechanisms to detect and report antisocial behavior.

Ephemerality (Ep)

Ephemerality refers to whether or not content must be consumed in real time. On Discord, audio is ephemeral, since recording voice channels can violate Discord’s Terms of Service (TOS). On Spotify, audio is not ephemeral. Studies have shown that users behave differently when interactions are ephemeral and leave no record or trace (bernstein20114chan; schlesinger17). Furthermore, when audio content is ephemeral, it becomes difficult for moderators to collect robust evidence of antisocial behavior in order to remove bad actors (jiang19).

User Roles (Ur)

ABSPs may distinguish between types of user roles, and may even have designated roles that allow users to act as moderators. On Discord, server owners and administrators can create custom roles for users, and one such role is typically assigned to “Moderators”. On Spotify, only users with Spotify for Artist accounts are able to publish music—these users are typically musicians and artists. All users are able to create Anchor accounts to publish podcasts. Spotify has no designated “Moderator”-like role assigned to users on the platform.

Access (Ac)

Access refers to the types of permissions and restrictions users have for creating and consuming audio-based content on an ABSP. On Discord, though individual servers and channels might prohibit or limit the access provided to users, all users are able to create their own voice channels, allowing free access to the creation and consumption of audio content. Spotify allows all users to consume audio, but only allows creators (musicians or podcasters) to create and publicly post audio content. Since Anchor is a free service for users who wish to become podcasters, there is no restriction on posting podcasts. However, users cannot publish music to Spotify directly—they must use a music distributor. Popular musicians are often signed to record companies or labels that will either act as or employ a distributor. Independent artists, those without the backing of a record company, can use online music distribution services like DistroKid (https://distrokid.com/) to publish music on Spotify. These services are never free, and therefore access to publishing music on Spotify is restricted. SoundCloud, on the other hand, allows all of its users to post audio content, and only limits the amount of audio a free user can upload before requiring a paid SoundCloud Pro account. The barriers to access on Spotify and SoundCloud are examples of the pricing moderation technique outlined by Grimmelmann (grimm).

Anonymity (An)

Users on ABSPs may be anonymous or use pseudonymous usernames to mask their identity. On Discord, users typically adopt custom usernames or handles and/or pseudonyms, so users in voice channels might not be associated with any actual means of identification. On Spotify, listeners can, and often do, create account usernames with their actual identity (typically by linking Spotify to their Facebook account), though some adopt custom usernames that obscure their identity. Creators may publish audio content under stage names. Anonymity has been found to both enable and discourage negative behavior in online social spaces (Hayne97), and anonymity appears to break down when using voice-based communication (wadley2015voice).

Modalities (Md)

An ABSP can be unimodal, supporting only audio as its primary medium for content, or multimodal, allowing for other types of content. Discord is multimodal, since servers can contain text channels in which users post text-based messages; Discord also allows users to use video along with audio inside voice channels. Spotify is unimodal, since audio is the only type of content supported by the platform. The existence of other modalities (besides audio) on an ABSP affects moderation on the platform, since having more than one modality typically requires a broader set of policies and tools for moderation (matias2019civic; jiang19; kiene19).

Structure (St)

The structure of an ABSP refers to the way in which audio and other content is organized, situated, and discovered on the platform. Audio on Discord is situated inside voice channels, which exist within servers. Users can use Discord’s Server Discovery feature or Explore page to look for popular public servers to join, or create their own public or private servers. Not all large servers are necessarily public or searchable using Server Discovery. In these ways, the audio-based interactions that occur on Discord are much more confined, since they are situated inside designated servers, most of which are not easily discoverable or indexed. By contrast, the vast majority of audio content on Spotify is indexed and publicly available to every user of the service. (The only exception is the Voice Messages that listeners can send to podcast creators who use Anchor.) Typically, audio on Spotify is organized by artist, genre, or podcast, or in user- or algorithmically curated playlists (some of which are private). Users can discover all public audio content via search or Spotify’s various discovery and recommendation mechanisms. In this way, Spotify is more free-range compared to Discord.

Rules and Guidelines (Rg)

Most ABSPs have some combination of platform-wide terms of service (TOS) and community-specific guidelines to govern user behavior. These terms and guidelines establish high-level rules that all users are expected to abide by. In addition to community guidelines and TOS, Discord has platform-level rules that clearly define the roles of moderators on servers. At the community level, Discord servers can publish their own sets of rules and guidelines, typically tailored to the type of community the server hosts. Spotify has separate guidelines and TOS for listeners and for content creators who use Spotify for Artists and Anchor. The rules and guidelines of ABSPs help establish a baseline for both platform-wide and community-specific norms and conditions for exclusion (e.g., suspensions or bans (chandrasekharan2017you)). Rules and guidelines play a key role in moderation, as seen in Grimmelmann’s work—norm-setting and exclusion make up two of the four common techniques for moderation (grimm).

Signals and Markers (Sm)

Signals and markers refer to the various types of visual cues or indicators that could be applied to audio-content and users on an ABSP. On Discord, different user roles can have different colors associated with them. For example, if a “moderator” role is associated with the color red on a Discord server, we know that a user’s handle (i.e., username) appearing in red indicates that the user is a moderator. Such markers help other members identify the official moderators of a server, and depending on what other roles the server defines, could help identify different types of users. Discord also provides indicators that show whether participants of a voice call have their mic muted or their video on; this information can be seen without having to actually join the voice-call. On Spotify, artists can have a verified blue-check on their profile which indicates that the identity of the owner of the artist page has been officially verified by Spotify. This signal indicates to users that the content posted on this artist’s page is coming from an official source. Spotify also displays the number of times a song has been listened to and the number of users who have liked a playlist. Such signals and markers help moderation by allowing users to determine if the audio-content is popular or verified before choosing to engage. Signals and markers also help distinguish between certain types of users and moderators, and determine the credibility of the audio-content’s source.

Inter-Platform Dependence (Ipd)

The way users of one social platform (audio-based or otherwise) utilize other platforms is an aspect that is often overlooked when discussing moderation on SNSs in general. We say that an ABSP is dependent on other SNSs if it is more common for community members to use other platforms alongside the ABSP than to have the ABSP as the sole SNS in the community's ecosystem. Discord is minimally dependent on other SNSs. Discord servers are known to be used alongside other platforms (such as Reddit (kiene19)), but are also commonly used alone. Discord users will occasionally use other, more free-range platforms such as Twitter and Reddit to discover and advertise private servers. Spotify, on the other hand, is often used alongside other platforms to embed music. For instance, Instagram users can add music directly from Spotify to their story posts, or link to their Spotify playlists. In this way, much of the non-audio content pertaining to audio-content from Spotify exists outside of Spotify, on other SNSs. As more SNSs become available, online communities may begin to use more than one SNS to host their communities. This affects moderation since bad actors may begin to use more than one SNS to engage in harassment against individuals, making moderation more difficult due to its cross-platform nature (jhaver18).

Moderation Purview (Mp)

The moderation purview on ABSPs refers to how much of the audio-content is moderated. On Discord, not all audio is moderated, since moderator presence is not a prerequisite for generating audio-content: audio interactions within voice channels do not need to occur in the presence of a moderator. This means that the moderation purview of Discord is limited. On Spotify, all audio-content can be moderated, since audio must first be uploaded to the platform and processed before it is hosted publicly. Spotify has mechanisms for algorithmic content moderation (as is the case with moderating content for copyright compliance (brovig2021remix)), and the existence of such mechanisms leads us to believe that all audio-content is moderated in some way. In this way, Spotify's moderation purview seems universal (since we do not know the inner workings of Spotify's design, we cannot claim this with absolute certainty). A limited moderation purview could allow abusive and antisocial behavior to go unchecked on an ABSP, and thereby negatively impact communities and their norms.

3.2. Relationships Between Affordances

Though we have defined a set of disjoint affordances, these affordances are often linked to each other in the larger platform ecosystem. For instance, on both Spotify and Discord, access is directly linked to user roles, since different types of roles constitute different types of access. Inter-affordance relationships are important to represent and understand, since any modification to one affordance may impact several others. Moreover, if a specific affordance has been identified as a contributor to moderation challenges, we can use inter-affordance relationships to identify other affordances that also contribute to these challenges.

We define four characteristics of relationships. These characteristics capture the nature of a relationship (whether or not it is causal) as well as its certainty (whether it has been verified or is yet to be verified). We define these characteristics below and provide examples. These relationships are also reflected in the MIC diagrams for Discord and Spotify (Figures 4 and 3).

Causal Relationship (→)

A causal relationship exists between affordance A and affordance B if A contributes to the cause of B, or if A enables B. For example, on Discord, the ephemerality of audio-content causes the moderation purview to be limited (EP → MP). On Spotify, the asynchronicity of audio communication enables its non-ephemerality (SY → EP), since asynchronous audio must be posted in order to be listened to. In MIC diagrams, this relationship is shown using a directed arrow.

Corresponding Relationship (↔)

A corresponding relationship exists between affordance A and affordance B if there is a correlation, or nearly one-to-one correspondence, between the two. On Discord, user roles have a corresponding relationship with certain markers (UR ↔ SM), since different user roles can be highlighted with different colors inside a server. As mentioned before, user roles on Spotify correspond to varying degrees of access (UR ↔ AC); this is also the case for Discord. In MIC diagrams, this relationship is shown using a bi-directional arrow.

Proven Relationship

A corresponding or causal relationship is proven if its existence has been verified, either through irrefutable observations or via findings from research studies. For instance, the causal relationships from synchronicity (SY → MP) and ephemerality (EP → MP, mentioned above) to the limited moderation purview of Discord were established by prior research (jiang19). Proven relationships are shown using solid arrows in MIC diagrams.

Potential Relationship

A corresponding or causal relationship is potential if it has not yet been verified and cannot be directly observed. For example, on Discord, there is a potential causal relationship between the structure of the platform and the nature of its inter-platform dependencies (ST → IPD): users may advertise servers on other platforms because Discord's structure provides no built-in way to do so. Most corresponding relationships are easily observable, so neither Spotify nor Discord had examples of potential corresponding relationships. However, such relationships may exist on other ABSPs, or may develop as platforms evolve. Potential relationships are shown using dotted arrows in MIC diagrams.
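To make the taxonomy concrete, the four relationship characteristics can be modeled as a small typed graph. The sketch below is our own illustration, not part of the MIC framework's formal definition; it encodes a few of the Discord relationships mentioned above.

```python
from dataclasses import dataclass, field
from enum import Enum

class Kind(Enum):
    CAUSAL = "causal"                # drawn as a directed arrow
    CORRESPONDING = "corresponding"  # drawn as a bi-directional arrow

class Status(Enum):
    PROVEN = "proven"        # drawn as a solid arrow
    POTENTIAL = "potential"  # drawn as a dotted arrow

@dataclass(frozen=True)
class Relationship:
    src: str
    dst: str
    kind: Kind
    status: Status

@dataclass
class MICDiagram:
    platform: str
    relationships: list = field(default_factory=list)

    def add(self, src, dst, kind, status):
        self.relationships.append(Relationship(src, dst, kind, status))

    def involving(self, affordance):
        """All relationships that touch a given affordance."""
        return [r for r in self.relationships
                if affordance in (r.src, r.dst)]

# Discord relationships taken from the text above
discord = MICDiagram("Discord")
discord.add("EP", "MP", Kind.CAUSAL, Status.PROVEN)        # ephemerality limits purview
discord.add("SY", "MP", Kind.CAUSAL, Status.PROVEN)        # synchronicity limits purview
discord.add("UR", "SM", Kind.CORRESPONDING, Status.PROVEN) # roles map to colored markers
discord.add("ST", "IPD", Kind.CAUSAL, Status.POTENTIAL)    # structure may drive dependence

print([f"{r.src}->{r.dst}" for r in discord.involving("MP")])  # ['EP->MP', 'SY->MP']
```

Keeping relationships as typed edges makes the later operations on MIC diagrams (updating, annotating, highlighting) straightforward queries over the edge list.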

3.2.1. Relationships in Spotify

In addition to the relationships mentioned above, we describe further relationships that are reflected in Figure 3. Spotify's lack of other modalities could be the reason it is heavily integrated into other platforms (MD → IPD); this relationship is only potential, since SoundCloud has similar inter-platform dependencies but more than one modality. Spotify's universal moderation purview is caused by the non-ephemerality and asynchronicity of its audio (SY → MP, EP → MP); this is verified by Spotify's user agreement, which explicitly states that Spotify may remove or edit any uploaded content that violates community guidelines (RG → MP). Asynchronicity prohibits ephemerality (SY → EP), since in order for audio to be asynchronous, it must be non-ephemeral.

3.2.2. Relationships in Discord

We now describe additional relationships between affordances on Discord; these are pictured in Figure 4. Discord can have unlimited custom roles for users because these roles are server-specific (ST → UR). Audio on Discord is ephemeral because conversations in voice channels occur synchronously (SY → EP).

3.3. Annotating and Updating MIC Diagrams

To investigate an ABSP using MIC, we represent it using an MIC diagram. This diagram can then be updated or annotated to reflect new findings about, updates to, or proposals for the platform. Below, we describe the different types of updates and annotations. We use some of these annotations to reflect the findings and proposals of Jiang et al. (jiang19) in Figure 4. A reference key of the affordances, relationships, and annotations can be found in Table 1.

Updating Affordances

We may find that some affordances we classified are incorrect. This could happen after uncovering information about the platform that was not clear before, or if the platform itself adds or removes features. For instance, if Spotify added a feature that allowed users to post text comments (like SoundCloud), we would change its modality affordance from unimodal to multimodal. Updating an affordance requires deleting the relationships that involve that affordance.
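The deletion rule above can be expressed in a few lines. This is a minimal sketch of our own, using the Spotify relationships from Section 3.2.1 as example data; the (src, dst) pair representation is an illustrative simplification.

```python
# Relationships as (src, dst) pairs; updating an affordance removes
# every relationship that involves it.
relationships = {("SY", "EP"), ("MD", "IPD"), ("RG", "MP")}

def update_affordance(rels, affordance):
    """Drop all relationships touching an updated affordance."""
    return {r for r in rels if affordance not in r}

# If Spotify's modality affordance changed, its MD relationships would go.
remaining = update_affordance(relationships, "MD")
print(sorted(remaining))  # [('RG', 'MP'), ('SY', 'EP')]
```

New relationships for the updated affordance would then be added back as they are established.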

Adding Relationships

If a platform has underlying relationships between affordances that are only revealed after more detailed analysis, these relationships can be added to the MIC diagram. This could also occur if we update an affordance, delete its former relationships, and establish new ones. Potential relationships can be upgraded to proven relationships, or removed entirely if they are disproved.

Proposed Relationship Annotation

Findings from research may imply a need for platform-level changes. Some platform changes can be thought of as linking two affordances so that they correspond, or so that one affects the other. To reflect this in the MIC diagram, we add a blue version of a proven relationship. This is a proposed relationship: a relationship that does not currently exist on the platform, but that could benefit the platform if added. This type of relationship serves as an annotation, since it is not (yet) part of the platform. If the platform is updated accordingly, these annotations can be converted to actual relationships.

For example, Jiang et al. (jiang19) suggest that platform-level terms of service or individual servers (via server rules) should require consent to record voice channels. We can represent this in Discord's MIC diagram by adding a proposed corresponding relationship annotation between the affordances involved (RG ↔ EP) to indicate this change.

Highlighting Affordances for Development

Another platform-level change could involve further development of a specific affordance. In addition to the aforementioned suggestion, Jiang et al. (jiang19) also propose that Discord be more explicit about the legality of recording in its terms of service and community guidelines, since there is currently no clarity as to whether Discord permits recording, or permits servers to require consent to record; this indicates the need for further development of its rules. We can annotate Discord's MIC diagram to reflect this using a bold blue outline (RG).

Highlighting Problematic Affordances

If an affordance is shown to create some type of moderation challenge, enable antisocial behavior, or prevent pro-social behavior, we can highlight it using a bold red outline. A concrete example can be found in the MIC diagram for Discord: the moderation purview and ephemerality affordances are highlighted in this way (MP, EP) to reflect findings from (jiang19). If an affordance is only potentially problematic, i.e., we do not yet have solid proof that it falls into this category, we can highlight it using a bold red dotted outline.

Component Type   Component Name          Diagram / In-Line Representation
Relationships    Causal                  Directed arrow (→)
                 Corresponding           Bi-directional arrow (↔)
                 Proven                  Solid arrow
                 Potential               Dotted arrow
Annotations      Proposed Relationship   Blue solid arrow
                 Development Highlight   Bold blue border (in-line: MP)
                 Problematic Highlight   Solid red border (proven); dotted red border (potential)
                 Updated Affordance      (in-line: MP)
Affordances      Synchronicity           SY
                 Ephemerality            EP
                 User Roles              UR
                 Access                  AC
                 Anonymity               AN
                 Modality                MD
                 Structure               ST
                 Rules and Guidelines    RG
                 Signals and Markers     SM
                 Moderation Purview      MP
Table 1. Reference key for MIC components and their diagram and in-line representations.

4. Representing and Investigating Clubhouse using MIC

Figure 5. The initial MIC diagram for Clubhouse, created using observations and participatory findings from Sections 4.2 and 4.1.

In this section, we start by describing the Clubhouse app (Section 4.1) and report participatory data and observations (Section 4.2). In the process, we point out affordances and relationships and create the MIC diagram in Figure 5. In Section 4.3, we compare the MIC diagram for Clubhouse to the MIC diagrams for Discord and Spotify.

4.1. Describing the Clubhouse App and Creating its MIC Diagram

Clubhouse (Clubhouse) is an audio-only SNS that launched in April 2020. It is self-described as a "Drop-in audio chat[ting]" platform (ch). Clubhouse has been available as an iOS application since its launch, and later expanded to Android users in June 2021 (erprose_2021).

4.1.1. Getting into Clubhouse

Clubhouse is an invite-only social platform, so new users must be invited to the app via their phone number (AC). Clubhouse users receive invites that they can distribute to friends or people in their network. Users can receive more invites by being active participants in the community, since the platform claims to "allocate invites automatically based on contribution to the community." For example, after the first three days of regular participation, one of the co-authors received four invites.

Clubhouse describes itself as a "real name service." In fact, the first rule listed in its community guidelines (RG) (ch-cg) states that users "must use a real name or identity on the service" (AN; RG → AN). Furthermore, accounts are associated with phone numbers, and new users receive invites via their phone numbers. Phone numbers are also used to notify a user's contacts about their account on the app, in order to build networks similar to the user's existing network.

Figure 6. Navigating Clubhouse. (Left) The Clubhouse homepage that shows users active rooms hosted by clubs they follow, or random active public rooms. Users can see the name of the room, the name of the club hosting it, if one exists, names of some users that are in the room, the number of participants, and the number of users on the stage. (Right) The Explore Tab contains 14 topic categories that link to relevant topic-specific clubs.

4.1.2. Navigating Clubhouse

Clubhouse is an audio-only ABSP, i.e., unimodal (MD), where users can participate in voice calls directly with other users, as well as in public or private group voice calls. These voice calls are referred to as "rooms." Rooms have titles that users can view to see the topic of conversation. Users can join any public room they find via the Home page or the Explore tab, which lists topic-specific pages and groups called "clubs" (as seen in Figure 6).

Clubs can be thought of as community-specific groups that can host and schedule rooms. Users can follow clubs to be notified when rooms associated with that club start. Currently, only the "most active members of the Clubhouse community" are able to create clubs (AC) to associate rooms with (it is unclear, however, how this designation is made, or what the threshold is to get this access). However, Clubhouse plans to eventually allow anyone to create public or private clubs (ch-kc).

Clubs can be private or public-facing, and public clubs can be followed by any user. Users can join any public room started on the app (regardless of whether they are associated with the room's respective club, or whether the person who started the room is in their network), as well as private rooms started by their followers or by a private club they follow. As such, Clubhouse has a free-range structure (ST).

4.1.3. Inside Clubhouse Rooms

Users can have one of three roles (UR) in a room on Clubhouse. The moderator role is given to the user who creates the room. This user can end the room, invite users to the stage to speak, mute speakers, and grant the moderator role to others. This means that every active room (i.e., every instance in which audio-content is generated on the app) has a moderator present (MP; UR → MP).

All other users who enter the room start out as listeners, and do not have the ability to speak in this role—they cannot unmute their mic (AC; UR ↔ AC). As a listener, a user can press the "raise hand" button and ask to be a speaker. If a moderator accepts a listener's request to speak, that listener gets moved up to the "stage," where they now have the role of speaker. As a speaker, they can unmute their own mic and be heard by everyone else in the room. User roles and access are both room-specific, which could be due to how Clubhouse is structured (ST → UR; ST → AC).

Clubhouse uses a circular green star symbol to mark the moderators of a room (SM; UR ↔ SM). All speakers inside a room have a marker showing whether their mic is muted. Speakers often toggle this marker on and off to indicate that they want a turn to speak (we noticed this when observing speakers in rooms). When users enter a room, a celebratory emoji appears by their icon and name to indicate that they are new to the room (SM). This can be seen in Figure 7.

All participants of rooms are required to follow Clubhouse's Community Guidelines (ch-cg) (RG). However, established clubs can publish a list of club-specific rules that apply to participants of rooms hosted by the club. All audio communication on Clubhouse is ephemeral (EP) and occurs synchronously (SY). If a user screen-records while in the app, the app notifies them that posting recordings without participants' express permission is not allowed (RG; RG → EP).

Figure 7. Views from inside a Clubhouse room. (Left) “Stage” that shows moderators and speakers. (Right) Listeners of a room, with new participants marked using a confetti emoji.

4.1.4. Clubhouse’s Guidelines for Moderators

There are novel etiquette guidelines spread across Clubhouse's various information pages (RG): the Knowledge Center (ch-kc), the New User Guide (ch-nug), and the Community Guidelines (ch-cg). The Knowledge Center contains a page with advice for "hosting" conversations. This page lists five "best practices for moderating conversations," only one of which references handling antisocial behavior in Clubhouse rooms: moderators should "Remove non-cooperative or disruptive participants when needed" and "Report disruptive users by tapping the three dots on the top right of their profile."

Another interesting observation from these pages is that the terms host and moderator appear to be used interchangeably—at the very least, the primary role of a room's moderator is to host events. In fact, the Community Guidelines (ch-cg) page states that the job of moderators is to "guide the conversation and have a strong influence on the content and style of conversation in the room."

4.1.5. Miscellaneous Markers

Clubhouse uses a marker as an implicit block-list mechanism. If User A is blocked by many users in User B's network (either users whom User B follows or those in User B's phone contacts, if the app has access to them), User A's profile will appear with a black shield symbol containing a white "!" mark (ch-cg). This is meant to indicate to User B that User A is blocked by many of their peers (SM). User A will not know that User B sees this symbol on their profile, and if User B does choose to block User A, then User A will be unable to join or participate in any room that User B creates or is speaking in. There is no public information available about the threshold number of blocks after which a user is given this symbol.
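Since Clubhouse does not publish the threshold, the mechanism can only be sketched under assumptions: `THRESHOLD` and the helper function below are hypothetical, intended only to make the in-network blocking logic concrete.

```python
# Hypothetical sketch of Clubhouse's implicit block-list marker.
# The real threshold is not public; THRESHOLD here is an assumption.
THRESHOLD = 3  # assumed number of in-network blocks that triggers the shield

def shows_shield(viewer_network: set, blockers_of_profile: set,
                 threshold: int = THRESHOLD) -> bool:
    """The shield appears on a profile if enough of the viewer's
    network (follows + phone contacts) have blocked that user."""
    return len(viewer_network & blockers_of_profile) >= threshold

# User B's network contains three users who blocked User A.
network_b = {"u1", "u2", "u3", "u4"}
blocked_a_by = {"u2", "u3", "u4", "u9"}
print(shows_shield(network_b, blocked_a_by))  # True
```

Note that the marker is relative to the viewer: the same profile may carry the shield for one user and not another, depending on who is in each viewer's network.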

4.1.6. Clubhouse outside of the app

Much of the commentary about Clubhouse interactions happens on other platforms (IPD). One platform heavily used by Clubhouse users for commentary is Twitter. Users often discuss what they are experiencing on Clubhouse on Twitter, and Clubhouse users often link to their Twitter profiles in the Clubhouse app. There are even subreddits dedicated to discussing Clubhouse (e.g., r/Clubhouse). These other platforms are also used to announce and publicize rooms or clubs and to invite new users to Clubhouse. The use of these platforms could stem from the fact that users cannot communicate using other modalities (MD → IPD), since there is no way of reaching new audiences inside the Clubhouse app besides the Explore page.

4.2. Participatory Observations of Clubhouse

We use participatory observations to understand the Clubhouse community in more detail; this also helps us fill in the MIC diagram. The first and second authors were invited to join Clubhouse around January 2021. Since then, both authors have been using Clubhouse and regularly following Clubhouse commentary posted on Twitter. During the week of June 10th through June 17th, 2021, the authors logged into the app at least twice a day. Each time, they noted the names of public rooms they encountered and the respective clubs these rooms belonged to. They also documented the number of users in each room and the number of speakers on each room's stage; this information is visible to users even before they enter (see Figure 7). After documenting this information, they joined the room and counted the number of speakers who were assigned the moderator role. They also checked for club rules, i.e., the list of rules associated with a specific club. Not all clubs they encountered had a rule list, but for those that did, screenshots of the rules were taken for later analysis. We ensured that Clubhouse's Community Guidelines and terms do not prohibit collecting this information via screenshots (rather than audio or screen recordings).

4.2.1. Types of Rooms and Clubs

Public rooms on Clubhouse cover a diverse range of topics. We highlight some of the most common or notable types of rooms encountered during our week on the app. Some rooms serve as "Office Hours" for specific topics, such as particular scientific domains or topics related to creating and running businesses. The speakers in these rooms are usually domain "experts," and users can enter the room to ask for advice or ask questions. These rooms are often associated with clubs related to the domain of interest, so users who follow the club are notified when they begin. Occasionally, these rooms have the word "REC" in the title, indicating that the room is being recorded by the hosts, to be posted for later consumption on other platforms such as YouTube or distributed among community members via a mailing list or private website.

Of the 63 rooms we collected data for, 50 were associated with clubs. Many of the club-less rooms had a question in the title; upon entering, users can listen in or participate in discourse. Other, similar rooms describe in the title a controversial event that occurred off the platform, which the room is expected to discuss. For example, a title could be "YouTube removed [User name]'s video - discuss." Since many of these rooms are public-facing and have no associated clubs, anyone can enter to listen or participate in the discussion, even if the event or question falls outside their awareness or interest.

Some clubs, often related to music or meditation, host regular listening-only rooms in which a single speaker plays music to large audiences. Some clubs host completely silent rooms, where users can join for quiet co-working time with other users, or meditate silently in the presence of others. We also encountered a room where users were doing a table read of a movie script: each participant's profile picture was that of the character whose role they were reading, and audience members listened as the assigned speakers read through the entire script.

4.2.2. Number of participants, speakers, and moderators in Clubhouse rooms

The rooms we encountered varied in size, speaker-to-moderator ratio, and participant-to-speaker ratio. The largest room we encountered was hosted by the official Clubhouse club: a Town Hall event that Clubhouse hosts weekly (ch-kc). This room had 4000 members, 4 speakers, and 1 moderator. The smallest room had 6 participants, 4 of whom were speakers, 2 of whom were moderators. This room had no associated club and a non-informative title; it was unclear, at least to the authors of this paper, what the room was used for. We provide information on a handful of other rooms with notable sizes or speaker-moderator-participant ratios in Table 2.

Room                          Club                         Room Subject                  Members  Speakers  Moderators
Largest                       Yes; Clubhouse HQ            Official Clubhouse Town Hall  4000     4         1
Smallest                      No                           [Unclear]                     6        4         2
Most Speakers                 No                           [Unclear]                     177      144       3
Most Moderators               Yes; Club about Meditation   Music Event                   609      134       27
2nd Largest w/ One Moderator                               w/ [Moderator]                429      2         1
2nd Largest # of Moderators   Yes; Club about dating       A Type of                     319      25        19
Table 2. Variety of rooms encountered by the first and second authors, and the total number of users in the room (Members), the number of members who are speakers (Speakers), and the number of speakers who are moderators (Moderators).

4.2.3. Club-specific Rules

Clubs might be motivated to publish rules to enforce norms for new users: because Clubhouse is a free-range platform, new users can enter a club's public rooms at random (ST → RG). Not all clubs choose to publish such a rule set. Of the 50 clubs we collected data from, 30 had club rules. Some clubs also use the club-rules space to post advertisements for external websites or businesses associated with the club.

Some common rule types we encountered were similar to those found in the app's Community Guidelines; for instance, rules such as "Be respectful and kind" were common. Some clubs required room speakers to have their Twitter or Instagram linked to their Clubhouse account so that moderators could further verify the identities of those they allowed on stage to speak. Some clubs' rules asked all room participants to follow the moderators, and made doing so a requirement to be allowed to speak.

One club we encountered had a rule asking men to be respectful of women when speaking. Another club had a rule asking speakers to mute and un-mute their microphone to indicate to moderators that they wanted to speak. This mute/un-mute behavior is something we encountered in many rooms, but it was not mentioned in other clubs' rules, nor is it present in the Community Guidelines (ch-cg) or the New User Guide (ch-nug).

A club that hosted rooms in which users often referenced outside research or news articles had a rule asking participants to send the names of, or links to, any research articles they mentioned to the club's Instagram or Twitter. Another club's rules mentioned the club's Instagram account (MP → IPD), which was used to host text discussion in the comments of room-specific posts while the room was going on (a type of live-chat occurring off the app). Some rules we encountered were very club-specific; for instance, one notable rule prohibited club participants from sharing alien-encounter stories.

4.3. Comparing Clubhouse to Other ABSPs

Now that we have MIC diagrams for Clubhouse, Discord, and Spotify, we can use them to identify similarities and differences. These comparisons, paired with the participatory observations from the previous subsection, set us up for Section 5, where we define research questions to gain more insight into moderation on Clubhouse and help us update its MIC diagram.

4.3.1. Clubhouse vs. Discord

While the voice channels of Discord servers may seem analogous to Clubhouse rooms, the two ABSPs are vastly different. An obvious difference is that Discord is multimodal while Clubhouse is unimodal. Another big difference lies in their structure: Discord is confined while Clubhouse is free-range, and rooms can be "stumbled upon" by users unfamiliar with the topic of the room (in fact, Clubhouse rooms do not need to be named at all, let alone given an informative title, whereas on Discord all voice channels must have a name). In this way, it is entirely possible for users to enter and participate in rooms about topics they have no knowledge of. There are differences in each ABSP's user roles as well. Clubhouse has only three types of user roles, one of which is moderator. On Discord, server admins or owners can create any number of server-specific custom roles, which may include a type of moderator role. While Clubhouse requires all its users to use their real identity, Discord allows users to pick any nickname, as well as server-specific aliases called "Server Nicknames." Another key, and arguably the most stark, difference between Discord and Clubhouse is that every active room on Clubhouse has a moderator present; this is not the case on Discord (jiang19).

One similarity between the two ABSPs is that both have some kind of inter-platform dependency. Discord servers are often linked to by channels on Twitch or by subreddits, and are often advertised on Twitter. Commentary, promotion, and invites for Clubhouse seem to occur almost exclusively on Twitter and Reddit. Since Clubhouse is unimodal, it appears to have a heavier dependence on other platforms than Discord does. Clubhouse and Discord are also similar in that audio on both is ephemeral and synchronous, for similar reasons (see the relationships between EP, SY, and RG in each platform's MIC diagram).

4.3.2. Clubhouse vs. Spotify

Though Spotify and Clubhouse seem vastly different, they have some major commonalities. The main similarity is the free-range structure of the two platforms. Users of both can explore and discover new content, and the topics on Clubhouse's Explore page are reminiscent of genres on Spotify. Another similarity is that both platforms seem to have a universal moderation purview, though their mechanisms for moderating audio may differ. Finally, both Clubhouse and Spotify are audio-only and rely heavily on other platforms.

A big difference between Clubhouse and Spotify is that audio on Clubhouse is synchronous, while audio on Spotify is asynchronous. Likewise, audio on Clubhouse is ephemeral, which is not the case on Spotify. Another difference lies in the types of access available to users of each platform: all users of Clubhouse have access to creating and consuming audio-content (AC), whereas on Spotify, only certain user types can create audio. There is some semblance of similarity in that Clubhouse limits the number of users on its app via its invite-only setup, while Spotify limits the number of creators by pricing certain creator accounts; this type of access falls slightly outside our definition of the access affordance. Clubhouse does not allow anonymity; Spotify does, but seems to encourage identifiability by allowing users to create accounts using Facebook.

5. Research Questions and Methodology for Investigating Moderation on Clubhouse

In this section, we motivate research questions about Clubhouse, and describe the methods we employ to answer these questions.

5.1. Motivations for Research Questions

Despite the fact that audio content on Clubhouse is synchronous and ephemeral, as on Discord, one particular affordance stood out during our MIC-based analysis. The moderation purview on Clubhouse appears to be universal, since every active room has at least one member with the “moderator” role present. Furthermore, Clubhouse defines this moderator role differently than Discord describes moderation (https://discord.com/moderation). Discord’s guidelines for moderators emphasize keeping community members safe by preventing antisocial behavior, with little emphasis on the pro-social aspects of moderation, while Clubhouse’s guidelines focus more on promoting pro-social behavior (i.e., hosting conversations). Spotify, on the other hand, also has a seemingly universal moderation purview, but no moderator user role. However, audio content on Spotify can only be posted by certain types of users, whereas any user can create audio on Clubhouse.

Given that all rooms contain moderators (whose roles are not canonically defined), and audio-content can be created and consumed by all users (access to the various user roles on Clubhouse is not limited), we would like to examine how moderation on Clubhouse is perceived. This motivates our first research question.

Topic | Keywords | Number of tweets
Tweets mentioning Clubhouse | clubhouse, @joinClubhouse |
Moderation on Clubhouse (RQ1) | moderation | 295
Misinformation on Clubhouse (RQ2) | misinformation | 322
Harassment on Clubhouse (RQ2) | harass, harassment | 136
Table 3. Summary of Twitter data used to answer RQ1 and RQ2. Each data set was generated by scraping public English tweets posted between April 1, 2020 and April 1, 2021 that contained one of two Clubhouse-specific keywords, then filtering on keywords specific to the topic of each RQ.

  • RQ1: How do users perceive moderation on Clubhouse, and what insights do these perceptions give us about the platform’s MIC affordances and relationships?

Another affordance of Clubhouse that differs from Discord is that the platform requires all users to use their real identities. Thus, users cannot interact with others anonymously; anonymity has been known to promote antisocial and abusive behavior (christopherson07). Also, on Clubhouse, members of a room must be assigned the speaker role before being allowed to unmute their mic and speak (or make noise) in the room. This is not the case on Discord, where anyone who can enter a voice room can start speaking right away. Jiang et al. (jiang19) identified specific types of antisocial behavior that occur on Discord (e.g., disruptions due to unpleasant noises), but many of them appear to be avoidable with Clubhouse’s restrictions on user roles and access. These differences motivate our second research question:


  • RQ2: What types of antisocial behavior occur on Clubhouse, and what insights do these behaviors give us about the platform’s MIC affordances and relationships?

5.2. Compiling Tweets about Clubhouse for Qualitative Analysis

Since Clubhouse users depend heavily on platforms like Twitter to discuss Clubhouse and the events and incidents that occur on it, we analyze tweets from Clubhouse users to answer our RQs. We use Twitter data instead of Reddit because Clubhouse users can link their profiles to their Twitter accounts, but not to Reddit, which leads us to believe that Clubhouse users are more likely to use Twitter than Reddit.

To address both RQs, we scrape public tweets posted between April 1, 2020 and April 1, 2021. First, we use the keywords “clubhouse” and “@joinClubhouse” to collect all tweets that could be about the Clubhouse app (this also captured tweets unrelated to the app). We refer to this as the Clubhouse tweet data set. Next, we use keywords specific to each research question to filter tweets pertaining to the topic of each RQ; these keywords were also more app-related, which helped isolate tweets that were definitely about the Clubhouse app. In particular, we used the keyword “moderation” to select Clubhouse-related tweets that discuss moderation to answer RQ1. Based on the themes that emerged in RQ1, we subsequently filtered tweets containing keywords like “misinformation”, “harass”, and “harassment” to answer RQ2.
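The two-stage keyword selection described above can be sketched as follows. This is an illustrative sketch, not our actual scraping pipeline; the tweet records, helper names, and sample texts are invented for demonstration.

```python
# Hypothetical sketch of the two-stage keyword filtering described above.
# Tweet records and sample texts are illustrative, not the actual data.
from datetime import date

def matches_any(text, keywords):
    """Case-insensitive substring match against any keyword."""
    text = text.lower()
    return any(kw.lower() in text for kw in keywords)

def filter_tweets(tweets, keywords, start, end):
    """Keep tweets in the date window that mention any keyword."""
    return [t for t in tweets
            if start <= t["date"] <= end and matches_any(t["text"], keywords)]

CLUBHOUSE_KEYWORDS = ["clubhouse", "@joinClubhouse"]
RQ1_KEYWORDS = ["moderation"]

tweets = [
    {"text": "Moderation on Clubhouse is hard", "date": date(2021, 2, 1)},
    {"text": "Going to the clubhouse after golf", "date": date(2020, 6, 5)},
    {"text": "@joinClubhouse needs better moderation", "date": date(2019, 12, 1)},
]

# Stage 1: all tweets that could be about the Clubhouse app
# (note the golf tweet slips through, illustrating the noise mentioned above).
clubhouse_set = filter_tweets(tweets, CLUBHOUSE_KEYWORDS,
                              date(2020, 4, 1), date(2021, 4, 1))
# Stage 2: narrow to the RQ1 (moderation) topic keywords.
rq1_set = [t for t in clubhouse_set if matches_any(t["text"], RQ1_KEYWORDS)]
```

Note how the stage-1 set still contains off-topic tweets, while the topic keywords in stage 2 trade recall for precision, as discussed below.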

We acknowledge that the keyword-based selection criteria we employ could result in low recall (i.e., we might miss tweets that discuss moderation-related topics without using the specific keywords we look for), and this is a limitation. However, given the massive amount of noisy data in our Clubhouse tweet data set, we opted for this high-precision keyword-based approach to select Clubhouse tweets that are specifically on-topic for RQ1 and RQ2. Descriptions of each of these data sets are presented in Table 3. After constructing the tweet data sets for RQ1 and RQ2, the first author hand-coded each tweet in each data set using an inductive approach (charmaz2006constructing), and these codes were used to uncover themes.

6. RQ1: Perceptions of Moderation on Clubhouse

Using the tweets about moderation on Clubhouse (described in Table 3), we performed qualitative analysis to uncover relevant themes, which we list in Table 4. The themes are grouped into three categories. For each category, we provide an overview of the themes it contains and examples of tweets that were tagged with these themes. All tweets have been paraphrased to reduce the risk of revealing the author’s identity. We then use these themes and any additional findings from the Twitter data sets to update and annotate our understanding of the MIC affordances and relationships on Clubhouse.

6.1. The Existence of and Need for Moderation on Clubhouse

Frequency Theme
15.6% Clubhouse does need moderation
9.2% Clubhouse has moderation
4.1% Clubhouse does not have moderation
4.1% Users want moderation
2.4% Clubhouse does not need moderation
28.5% Negative experience or opinion of moderation on Clubhouse
15.6% Positive experience or opinion of moderation on Clubhouse
6.1% Neutral experience or opinion of moderation on Clubhouse
11.8% Moderation on Clubhouse/audio platforms is difficult
9.8% Mentions of other platforms
7.46% Information, tips, and recommendations from moderators
4.7% Moderator role is a profession
3.7% Rooms discussing moderation
Table 4. Themes found in Moderation data set, and the frequency at which they occurred. Some tweets touched upon more than one theme, and so the total of frequencies exceeds 100%. Each section of themes represents one of the three broad categories for themes.

The themes from this category answer three overarching questions pertaining to user perceptions of moderation on Clubhouse. We report percentages based on the tweets that were analyzed from the moderation dataset (see Table 4 for a descriptive summary of themes we uncovered).
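Because a single tweet can carry multiple codes, per-theme percentages are computed against the total number of tweets, which is why the frequencies in Table 4 can sum to more than 100%. A minimal sketch of this computation, using invented codes rather than our data:

```python
# Illustrative multi-label theme-frequency computation; the codes below
# are invented examples, not the actual coded data set.
from collections import Counter

def theme_frequencies(coded_tweets):
    """Percentage of tweets tagged with each theme.

    Each element of coded_tweets is the set of themes assigned to one
    tweet; because tweets may carry several codes, the returned
    percentages can sum to more than 100%."""
    counts = Counter(theme for codes in coded_tweets for theme in codes)
    n = len(coded_tweets)
    return {theme: 100 * c / n for theme, c in counts.items()}

coded = [
    {"needs moderation", "negative experience"},
    {"has moderation"},
    {"needs moderation"},
    {"moderation is difficult", "mentions other platforms"},
]
freqs = theme_frequencies(coded)
# Six code assignments across four tweets, so the frequencies sum to 150%.
```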

6.1.1. Does Clubhouse have moderation?

There were two contradicting answers to this question. 4.1% of the tweets we analyzed claimed that Clubhouse did not have any type of moderation at all, while 9.2% expressed explicitly that Clubhouse already had moderation mechanisms in place. One tweet expressing that Clubhouse did not have moderation was from a user who had not joined Clubhouse and did not want to because there was no moderation:

No need to offer me an invite to the Clubhouse app. It looks like ground zero for hate speech and bad behavior, and there doesn’t seem to be any type of moderation.

6.1.2. Does Clubhouse need moderation?

As with the previous question, there were two conflicting types of answers. More tweets expressed that Clubhouse needed moderation (15.6%) than expressed that it did not (2.4%). Tweets expressing that Clubhouse needed moderation sometimes did not specify whether Clubhouse already had moderation; that is, it was not clear whether the tweet meant that Clubhouse needed moderation because it had none, or that Clubhouse needed to improve the moderation it already had. Tweets expressing that Clubhouse did not need moderation stated that, in real life, conversations between people are not moderated. Other tweets with a similar stance claimed that though moderation was needed, the responsibility to check bad behavior fell on all members of the community, not just the moderators:

Stop blaming bad behavior on bad moderation. Room participants need to take more responsibility for their behavior.

6.1.3. Who wants moderation on Clubhouse?

Some tweets expressed that the author wanted moderation on Clubhouse (again, it was rarely clear whether this implied that they wanted more moderation, or just for there to be moderation). Some tweets expressed that users at large wanted moderation on Clubhouse. One tweet with this sentiment expressed that users have been asking for more moderation, but Clubhouse appears to be ignoring these requests.

Users have been asking for more moderation and safety for months, but Clubhouse keeps giving us features we don’t want.

6.2. Experiences and Opinions about Moderation on Clubhouse

Many of the tweets discussed either some type of experience that a user encountered while using Clubhouse, or some type of opinion that a user holds about moderation on Clubhouse. Like the themes from the previous category, there were conflicting sentiments expressed by the tweets in this category. However, some tweets were neutral, or expressed both positive and negative sentiments.

6.2.1. Negative Experiences or Opinions

The majority of experiences and opinions of moderation on Clubhouse were negative (28.5%). Some users described experiences of bad moderation that made participating in the rooms challenging and unpleasant:

Too many people on stage makes me want to leave clubhouse rooms. I think its bad moderation if there are too many people on stage if it is unnecessary for the topic of the room.

Some tweets expressed that Clubhouse has a problem with moderation, sometimes going as far as stating that Clubhouse served as an example of a platform with bad moderation. One user explains that the “moderator” does not add value to the rooms:

I’m listening to [Twitter User] in @joinClubhouse. Clubhouse has a moderation problem. The “moderators” just fawn over the participants.

6.2.2. Positive Experiences or Opinions

Tweets that expressed positive sentiments (15.6%) about moderation on Clubhouse often referenced specific rooms and moderators. These tweets did not provide much detail as to how or why the experience was positive. Some tweets praised Clubhouse for its user experience and existing moderation.

[Twitter User] successfully hosted a room on a controversial topic last night. Good moderation prevents these rooms from turning in to yelling match’s.

Some tweets expressed that Clubhouse itself was doing a great job with moderation on the app. Again, these tweets rarely gave specific reasons or details as to why this was the case.

I think Clubhouse is the best app ever! I’m very impressed by the moderation they have, and it will only get better from here.

6.2.3. Neutral Experiences or Opinions

Neutral statements about moderation on Clubhouse (6.1%) merely mentioned or implied that moderation existed, or that the tweet author experienced instances of moderation on the Clubhouse app. Some tweets were marked as both positive and negative opinions and experiences, and therefore do not fall in this 6.1%. These tweets mention that the moderation in a room “makes or breaks” their experience, indicating that there have been both positive and negative experiences.

When moderation is good, the experience inside Clubhouse rooms is also good. If moderation is not good, the conversation is not enjoyable.

6.3. Commentary about Moderation

The final category of themes we uncovered was general commentary about moderation or the role of moderators. Unlike the previous two categories, there were no conflicting sentiments from this category. We will highlight the two most frequent themes that emerged in this category.

6.3.1. Moderating Audio is Hard

Many tweets (11.8%) described how moderating Clubhouse, and audio-based social platforms in general, is difficult. Some tweets mentioned that the synchronicity and ephemerality of audio make Clubhouse harder to moderate than text-based platforms:

It’s easier to stop bad arguments over text than on Clubhouse, because people may be so shocked in the moment that the incident can go by, whereas text allows someone to form a response after the fact.

Many tweets from this theme express that not every Clubhouse user is cut out to be a moderator, since moderation is a nontrivial task. These tweets did not specify why certain users lack these skills, beyond noting that moderating audio is difficult:

Not everyone has the skills to be able to moderate a room on Clubhouse

6.3.2. Moderation on Other Platforms

We found many tweets (9.8%) in this category mentioning other platforms, such as Discord, Twitter, Facebook, and YouTube. One tweet even drew a parallel between Clubhouse and telephone party lines (channel). Another tweet that mentioned other platforms stated that given how other SNSs are struggling with moderation, it was natural that there were moderation issues on Clubhouse, a relatively new platform that is entirely audio-based and ephemeral:

If YouTube, Facebook, and Twitter still have issues with moderation for text-based posts, how is Clubhouse going to moderate well when its entirely audio-based? The app sounds like a mess, filled with misinformation and hate.

There were also tweets that compared Clubhouse to the Parler app, a text-based social media platform touted as an “alternative” social media site where users could express free speech without fear of being deplatformed (aliapoulios2021early). Parler itself was eventually pulled from app stores after the storming of the U.S. Capitol on January 6, 2021 (peters_lyons_2021).

6.4. Discussion of Findings and Updating the MIC Diagram

We now discuss our findings from the analysis of the moderation data set. We use these findings to update the Clubhouse MIC diagram. The updated MIC diagram can be found in Figure 8.

6.4.1. Limiting Access to the Role of a Moderator

The biggest insight we gained from exploring RQ1 is that Clubhouse’s moderation purview is actually not universal (MP): those who assume the moderator role do not necessarily perform all the activities of a moderator. This could also mean that allowing all users to access the moderator role (by letting all users start rooms) could be negatively affecting moderation purview (AC → MP).

The themes we uncovered suggest that not all users should be given access to the moderator role, both because moderating a platform like Clubhouse is difficult (as is moderation in general) and because some users simply do not know how to be an effective moderator. A causal relationship already exists from user roles to access in the MIC diagram (shown in Figure 5). We propose a corresponding relationship in the other direction (AC → UR), similar to that of Spotify (shown in Figure 3). In other words, access to creating and moderating rooms should be limited using the types of user roles (UR), and there should be limits on which users can access certain roles (AC). Along these lines, introducing different levels (and types) of access on Clubhouse could help address some of these moderation challenges (AC).

6.4.2. Improving Rules and Guidelines for Moderators

Clubhouse’s platform rules and guidelines may be a contributing factor to why moderation purview is actually limited (RG → MP). This could be because the written guidelines for moderators offer no concrete strategies for discouraging or preventing antisocial behavior, apart from kicking offenders out of a room or ending the room entirely.

7. RQ2: Antisocial Behavior Prevalent on Clubhouse

In the moderation-related Twitter data analyzed in the previous section, several tweets expressed that Clubhouse has problems with misinformation, harassment, racism, anti-semitism, sexism, and other types of antisocial behavior. We decided to examine in detail the two problems most commonly discussed in our data set, misinformation and harassment, which were also more general categories of antisocial behavior than the others we encountered. The remaining types of antisocial behavior prevalent on Clubhouse should be explored in future work.

Next we present our findings from the qualitative analysis of the tweets about misinformation and harassment on Clubhouse (see Table 3 for dataset descriptions). By reflecting on the findings from each set of tweets, we update our MIC diagram for Clubhouse.

7.1. Thematic Findings from Tweets about Misinformation on Clubhouse

Notable themes emerging from our qualitative analysis of the 322 tweets about misinformation on Clubhouse (the misinformation data set) can be found in Table 5. In this section, we describe two key themes in detail.

Frequency Theme
57.8% Clubhouse has a misinformation problem
8.1% Misinformation exists on other SNSs
2.8% Clubhouse does not have a misinformation problem
17.1% Medical misinformation
7.7% Instances of user being bullied for trying to correct misinformation
5.9% Moderating misinformation
5.9% List of abuse Clubhouse is susceptible to
4.6% Celebrities/Influential people spreading misinformation
Table 5. Notable themes from the misinformation data set. Not all tweets were represented by these themes.

7.1.1. Clubhouse has a Misinformation Problem

Over half of the tweets (57.8%) in the misinformation data set expressed that Clubhouse has a problem with dealing with misinformation. Many of these tweets referenced specific instances of misinformation being spread on the app. The most prominent type of misinformation being spread was medical misinformation (which is one of the themes listed in Table 5), and more specifically, misinformation related to COVID-19. Some tweets expressed that using voice enabled bad actors to spread misinformation easily:

The misinformation I see on Clubhouse is insane. People say questionable facts with so much conviction that it can lead impressionable people to believe lies.

A handful of tweets point out that Clubhouse does not explicitly define what misinformation is, and that this could further enable the misinformation problem since, as other tweets pointed out, little distinction is made between information that is false and harmful and information that merely represents a difference of opinion.

7.1.2. Users who try to Correct Misinformation get Bullied

We found that several tweets (7.7%) described instances where influential users bullied less-influential users for trying to correct misinformation. One such instance was discussed most often. These tweets described how a well-known public figure, User A (who was not a medical professional), appeared to help spread information that turned out to be inaccurate (influential people spreading misinformation was another theme, prevalent in 4.6% of tweets in the misinformation data set), and subsequently shut down another user, User B (a verified medical professional), who tried to correct User A. User B was then heavily targeted, harassed, and eventually doxxed by supporters of User A.

Can’t believe [User A] was defending and spreading medical misinformation on Clubhouse. And when [User B] tried correcting the misinformation, they were severely harassed and doxxed! Disgusting.

7.2. Thematic Findings from Tweets about Harassment on Clubhouse

Some notable themes from our qualitative analysis of the 136 tweets in the harassment data set can be found in Table 6. We describe two of these themes in detail.

7.2.1. Clubhouse has a Harassment Problem

The most prominent theme was that Clubhouse has a problem with harassment (34.6%). Tweets that mentioned this theme also often mentioned specific instances of harassment that the tweet author witnessed or was a victim of. One tweet described harassers entering and exiting several rooms quickly to find the exact room their victim was speaking in and confront them.

Some of these tweets also described how harassment on Clubhouse spills over to other SNSs, which seems to amplify the amount of harassment victims receive, since it now comes from multiple platforms. In a similar vein, tweets described dogpiling and swarming aimed at specific individuals by supporters of prominent figures on Clubhouse (similar to observations by jhaver18) as also problematic, adding that this had occurred on more than one occasion.

Frequency Theme
34.6% Clubhouse has a harassment problem
31.2% Example or report of harassment on Clubhouse
8.8% Bans or suspensions on Clubhouse
16.2% Mentioning sexual harassment
Table 6. Notable themes from the harassment data set. Not all tweets are represented by these themes.

7.2.2. Sexual Harassment is an Issue and Topic of Discussion on Clubhouse

The most prominent type of harassment mentioned in the harassment Twitter data set was sexual harassment. There were reports of negative incidents related to sexual harassment on the app. One tweet author described an incident in which they heard another Clubhouse user use damaging and problematic rhetoric about sexual harassment:

I just heard a guy on Clubhouse comparing sexual harassment to vanity.

However, we observed that tweets in this data set were not just made up of people reporting instances of sexual harassment that they experienced while using Clubhouse. Many of these tweets were about active rooms where the topic of discussion was sexual assault or sexual harassment. These rooms functioned as productive and supportive spaces where victims of sexual abuse and sexual harassment could share experiences and receive support from their peers.

I was just in this room where survivors of sexual harassment were sharing their stories and crying together.

7.3. Discussion of Findings and Updating the MIC Diagram

We now discuss our findings from the analyses of the misinformation and harassment datasets. We use these findings to update the Clubhouse MIC diagram. The updated MIC diagram can be found in Figure 8.

7.3.1. Identifiability has consequences, sometimes negative

We found that the identifiability of users plays a large role in antisocial behavior on Clubhouse (AN). One reason is that enforcing identifiability does not prevent users from misrepresenting themselves and spreading misinformation. Another is that prohibiting anonymity allows power dynamics to form between users, which can lead to forms of harassment such as dogpiling and swarming (jhaver18).

Figure 8. Updated MIC-diagram for Clubhouse. The new arrows and marked affordances represent our findings from Section 6 and Section 7

7.3.2. Need Better Rules to Combat Misinformation

Clubhouse’s lack of a proper definition for misinformation appears to enable the spread of misinformation without consequence. A few tweets from our data set expressed that the lack of such a definition stems from the difficulty of defining misinformation. Regardless of the difficulty, it is clear that Clubhouse’s rules and guidelines should be extended and made more precise about this type of consequential bad behavior.

In Section 4.2, we also described how some clubs had club rules asking speakers to provide references when citing numerical statistics; this may be that club’s mechanism for combating misinformation. If more clubs employed effective rules, misinformation could be curbed. As such, it is clear that Clubhouse’s platform-level and club-level rules and guidelines could be edited or further developed as a means of addressing moderation challenges (RG).

7.3.3. Inter-platform Dependence Amplifies Harassment Experiences

Clubhouse’s dependence on other platforms was also found to play a large role in antisocial behavior on the app (IPD), since targeted users end up being harassed on other platforms as well as on Clubhouse. This can significantly amplify the amount of harassment these individuals receive due to incidents that occur on Clubhouse.

7.3.4. Consequences of Free Range Structure

The tweets we analyzed mentioned rooms where users spoke about sexual harassment, and several examples highlighted how Clubhouse’s structure could contribute to moderation challenges (ST). Rooms that address sensitive subjects such as sexual assault are often public and allow all types of users to find and join them. This is not inherently a bad thing, but the open nature of these rooms could let in antisocial or unproductive behavior from bad actors.

8. Discussion

We discussed the findings from our qualitative analyses of Clubhouse-related tweets in Sections 6.4 and 7.3. In this section, we discuss the implications of MIC and detail potential extensions to our framework, providing concrete examples of both.

8.1. Implications for Theory and Design

For CMC and CSCW theory, our framework provides a new analytic lens to identify and understand the various affordances of ABSPs and the relationships between these affordances. Using this framework, we can explore how these affordances and relationships shape ABSPs at large, how they contribute to moderation challenges, and how they can be leveraged to develop moderation mechanisms.

For platform design, MIC allows different stakeholders, like platform designers, moderation practitioners and researchers, to identify affordances and relationships to represent audio-based social platforms. MIC diagrams will allow stakeholders to introspect on affordances that contribute to moderation challenges, and also determine which affordances can be utilized (and fostered) to overcome these challenges. By comparing MIC diagrams across platforms, platform admins and moderators can adopt successful moderation techniques and best practices from other platforms with comparable affordances (and relationships). Next, we demonstrate how to do this by providing examples for Clubhouse.

8.1.1. Adopting Club or Room Discovery Strategies from Spotify

In Section 7.3.4 we discussed how the free range structure of Clubhouse may allow interlopers to access rooms that are not exactly meant for them (such as rooms discussing sensitive topics like sexual harassment). Besides its impact on moderation, showing users rooms or clubs that they are not interested in, or rooms that are public but not necessarily for them, makes finding and forming communities on Clubhouse difficult.

In Section 4.3.2 we pointed out that Clubhouse and Spotify have similar structures. One of Spotify’s major services is its recommendation system for music discovery. This service aims not only to show users music they would be inclined to listen to, but also to help artists discover new listeners (Spotify for Artists provides tools for musicians to understand and expand their audiences). One way Spotify does this is by curating playlists. These playlists can be broadly defined, containing music from a genre or from a specific artist. They can also be incredibly niche and designed to encapsulate certain moods (examples of these niche “Editorial” playlists curated by Spotify include “Life Sucks,” “my life is a movie,” and astrology-themed playlists like “Libra” or “Leo”). Many of these playlists are manually curated, and artists can submit music for consideration to be added to them.

The topic categories on Clubhouse’s explore page (shown in Figure 6) are reminiscent of genres of music on Spotify. It would therefore be reasonable for Clubhouse to adopt a similar recommendation-via-curation mechanism and manually curate endorsed playlist-type hubs of clubs or rooms hosted by trusted or experienced users. This could help clubs and rooms find relevant audiences, and could also help users find and build communities in a more strategic way. This kind of curation will likely be more difficult to accomplish on Clubhouse, but it appears to be feasible.

8.2. Implications for Moderation on Clubhouse and other ABSPs

We found that moderators on Clubhouse apparently do not engage in moderation as defined by Grimmelmann (grimm), and that the moderator role on Clubhouse is analogous to that of a host-type figure. Future work should explore how moderators on Clubhouse define the moderator role and moderation itself, and how those definitions differ from those of moderators on other ABSPs. In a similar vein, further analysis is required to understand the types of tools that human moderators on Discord and Clubhouse have access to, and how these tools (or the lack thereof) impact how moderators operate on these platforms.

8.2.1. Developing Better Rules and Guidelines to Shape Community Norms

In Section 7 we discussed how Clubhouse’s platform-wide rules appear to fall short. Moreover, individual clubs on Clubhouse either did not have a published set of rules, or used the “rules and guidelines” section to post advertisements instead of rules. Some of the rules we did encounter in our participatory analysis did not seem to establish community norms, either because they were too vague (like Clubhouse’s platform-level guidelines) or because they could not be deciphered by users who were not members of the club (such as the first and second authors). In summary, we found that Clubhouse’s rules and guidelines appear to be incomplete or unclear. At the very least, the RG affordance on Clubhouse differs from the RG affordance of a platform like Discord, which has a clear set of guidelines for moderators that defines their roles as caretakers of their servers. It would therefore be reasonable for Clubhouse to adopt a similar strategy and clearly define rules prohibiting antisocial behavior and guidelines for shaping community norms within clubs. A future direction for research could be to understand how users interpret Clubhouse’s guidelines, and to determine whether disagreement among users’ interpretations causes friction on the platform.

8.2.2. Examining the Effects of Combining Specific Affordances

Another potential avenue for exploration would be understanding how the combination of anonymity (AN) and audio-based content affects how users behave on ABSPs. This could provide a contrast to work on the use of audio in online game settings, which found that users disliked voice because it compromised their anonymity. It could also help us understand the pros and cons of anonymity even further, since we found that the lack of anonymity on Clubhouse reinforced hierarchies among users but still did not prevent users from misrepresenting themselves for nefarious reasons.

8.2.3. Understanding inter-platform dependencies

Finally, we believe that there are many open questions pertaining to the Inter-platform dependencies (IPD) affordance from MIC. This echoes some of the findings of Kiene et al. (kiene19), who described how online communities will begin to use a variety of platforms and tools as a part of their infrastructure, as opposed to just one. In this way, it feels pertinent for moderation researchers to engage more deeply with inter-platform dependencies, and explore how platforms interact with one another to host communities.

8.3. The Benefits of Using MIC to Represent ABSPs

8.3.1. Accessibility and Efficiency

Clubhouse is a novel platform that was difficult to approach and navigate. However, MIC allowed us to create a concise representation of Clubhouse containing the information relevant to moderation. We were also able to efficiently construct MIC diagrams for Discord and Spotify using observations and prior work, which allowed us to concisely describe the differences among these three platforms. The accessibility of MIC will make navigating ABSPs in general easier, and will provide a concise, standardized way to represent them. We believe this will make moderation research for ABSPs easier to approach and to unify.

8.3.2. Connecting to Affordance- or Platform- Research From Other Domains

A substantial body of research on audio-specific platforms falls outside the scope of traditional SNS moderation research. For instance, Spotify uses music recommendation (tang2017evaluating), discovery (aguiar2018platforms), and curation mechanisms (morris2015control; brovig2021remix) as implicit forms of moderation. Analysis via MIC will direct us to such affordance-specific, non-moderation work, even when we are not specifically investigating Spotify; this could happen by identifying similarities between Spotify and another ABSP, or simply by exploring audio-specific research.

8.4. Future Extensions and Generalizability of MIC

8.4.1. Extending MIC for ABSPs

MIC’s base set of affordances and relationships is likely non-exhaustive, so we want to be able to append to and update the framework with ease. Fortunately, the graphical nature of MIC makes doing so straightforward.

We can add new affordances to our original set when new types of affordances that affect moderation are uncovered or developed. We would also like to add more content-related moderation affordances. For instance, we may eventually find it useful to distinguish between types of audio content, such as interactions, podcasts, or music. To do so, we could add a Content Type affordance to the subset of content-related affordances in MIC.

Relatedly, it might make sense to add granularity to individual affordances by splitting them up. For example, if the moderator role becomes an integral or distinguished part of an ABSP, we could split the UR affordance into two: one that is moderator-specific, and one that describes non-moderator user roles. Similarly, the MP affordance could be split in two to consider the purviews of human moderators and algorithmic moderators separately.

We can also extend our set of relationships by defining new types of relationships, and there is no real restriction on how to do so. We could even forgo the condition that relationships occur between only two affordances, and describe relationships that are analogous to hyper-edges. (In graph theory, a hyper-edge represents an “edge” that connects more than two vertices; it is usually represented by a subset of vertices.)
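To make the graph-theoretic idea concrete, the following is a minimal, hypothetical sketch of how a MIC diagram could be encoded as a labeled graph whose relationships are sets of affordances, so that a set of size greater than two naturally behaves as a hyper-edge. The class name, method names, and affordance labels are illustrative assumptions, not part of the MIC framework itself.

```python
from dataclasses import dataclass, field

@dataclass
class MICDiagram:
    """Hypothetical encoding of a MIC diagram as a labeled (hyper)graph."""
    affordances: set = field(default_factory=set)
    # Each relationship is a (label, frozenset-of-affordances) pair;
    # a frozenset with more than two members acts as a hyper-edge.
    relationships: list = field(default_factory=list)

    def add_affordance(self, name: str) -> None:
        self.affordances.add(name)

    def relate(self, label: str, *affordances: str) -> None:
        # Reject relationships over affordances not in the diagram.
        missing = set(affordances) - self.affordances
        if missing:
            raise ValueError(f"unknown affordances: {missing}")
        self.relationships.append((label, frozenset(affordances)))

# Illustrative three-way relationship among anonymity (AN), user
# roles (UR), and moderator purview (MP); the label is made up.
diagram = MICDiagram()
for name in ("AN", "UR", "MP"):
    diagram.add_affordance(name)
diagram.relate("shapes", "AN", "UR", "MP")  # hyper-edge over three affordances
```

Representing relationships as frozensets rather than ordered pairs means nothing in the data structure needs to change when a new relationship happens to span three or more affordances.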

8.4.2. Connecting MIC Diagrams

Another potentially useful, albeit more involved, extension of MIC (and the MIC diagram in particular) would be to replace an inter-platform dependency with the MIC diagram of the other platform or service. This would be useful when there is an (almost) symbiotic relationship between two separate ABSPs, or when we wish to consider the affordances of distinct services on the same platform separately. For instance, we may wish to consider the Spotify for Artists and Anchor services separately from Spotify’s streaming service: we could analyze each of these three services on its own, and then build an extended MIC diagram to understand moderation on Spotify in more detail. Discord recently announced a new feature, Discord Stages (https://discord.com/stages), that we did not include when using Discord as an example; however, one could use MIC to analyze Discord Stages as though it were a separate platform, and then combine the two MIC diagrams to get a more granular understanding of the Discord platform as a whole.
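The substitution described above can be sketched as a graph operation: remove the IPD node from the host diagram, drop the relationships it participated in, and splice in the (namespaced) affordances and relationships of the dependency’s own diagram. The sketch below uses plain dictionaries; every name, affordance label, and relationship label is an illustrative assumption, not an API from the paper.

```python
# Hypothetical sketch: "expanding" an inter-platform dependency (IPD)
# node in one MIC diagram into the full MIC diagram of another service.
def expand_ipd(host: dict, ipd_name: str, sub: dict, prefix: str) -> dict:
    """Return a combined diagram where `ipd_name` in `host` is replaced
    by the prefixed affordances and relationships of `sub`."""
    return {
        "affordances": (host["affordances"] - {ipd_name})
        | {f"{prefix}:{a}" for a in sub["affordances"]},
        # Keep host relationships untouched by the IPD node, then add
        # the sub-diagram's relationships under the new namespace.
        "relationships": [
            rel for rel in host["relationships"] if ipd_name not in rel[1]
        ]
        + [(label, {f"{prefix}:{a}" for a in members})
           for label, members in sub["relationships"]],
    }

# Made-up affordance sets for a streaming service and a creator-tools
# service it depends on (names and labels are illustrative).
spotify = {"affordances": {"MP", "IPD:Anchor"},
           "relationships": [("depends-on", {"MP", "IPD:Anchor"})]}
anchor = {"affordances": {"UR", "MP"},
          "relationships": [("governs", {"MP", "UR"})]}
combined = expand_ipd(spotify, "IPD:Anchor", anchor, "Anchor")
```

Prefixing the sub-diagram’s affordances keeps, say, the host’s MP affordance distinct from the dependency’s MP affordance in the combined diagram.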

8.4.3. Extending MIC to non-ABSPs

The extensions discussed in the previous sections are a gateway for adapting MIC to represent audio-based services or features of SNSs that are not ABSPs. We could even create a more generic version of MIC for SNSs that are not audio-based at all. This is feasible because the structure of MIC consists of groups of member-, infrastructure-, and content-related affordances, none of which is particularly audio-specific. In fact, none of the affordances or relationships assume the platform is audio-based (even synchronicity and ephemerality, since these are attributes of text-based content as well). For these reasons, we believe our framework can be readily generalized and adapted to represent and examine non-ABSPs as well.

9. Conclusion

In this paper we proposed MIC, a new affordance-aware framework for navigating moderation on audio-based social platforms. MIC fills major gaps in existing analytical tools for moderation research: it is the first analytical framework for moderation on audio-based platforms, and the only framework for moderation research that explicitly considers the platform itself. MIC provides a standardized way to represent an ABSP using its affordances and the relationships among those affordances; the resulting representation is called an MIC diagram. Our proposed methodology for using MIC to learn about moderation involves dynamically learning and annotating the structure of an ABSP’s MIC diagram. We demonstrated this methodology by studying the Clubhouse app; in particular, we conducted a qualitative analysis of Clubhouse-related tweets to understand how users perceive moderation on Clubhouse and what types of antisocial behavior are prevalent on the platform.