
ACES

Context

Problem: The rise of disinformation in online spaces around the world, from organized misinformation campaigns to the growth of anti-information groups such as anti-vaxxers.

Goal: Design and explore the feasibility of an Accountability & Content Evaluation System (ACES) with incentives for users to engage in community-based content moderation.

  • Role: User Researcher
  • Team Size: 4
  • Duration: 4 weeks
  • Tools & Methods: Figma, usability testing
Download Full Report

Process Overview

Literature Review:

My team and I reviewed several HCI papers on social media moderation and conducted a competitive analysis of current moderation methods, noting their strengths and weaknesses. We also scoped the design away from a fact-checking framework and toward establishing epistemological media literacy: providing context rather than flagging posts as false.

Design:

To address the findings from our research, we split the design space into two systems: post evaluation and gamification. For post evaluation we created a four-tier system, with each tier building on the previous one so that no single user holds too much power or carries too much of the work. However, limiting each potential moderator to a quarter of the work of moderating a post steeply increases the number of community moderators needed. To offset this, we designed a currency that can be exchanged for ad removal, post visibility, or gifts, along with badges, achievements, and stat summaries that can be displayed on a user's profile for further incentivization.
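As a rough illustration of how the two systems could fit together, here is a minimal TypeScript sketch of the tiered post-evaluation flow and the currency reward. The tier names, payout value, and data shapes are hypothetical assumptions for this sketch, not taken from the report.

```typescript
// Hypothetical sketch of the ACES post-evaluation tiers and currency reward.
// Tier names and the payout per tier are illustrative placeholders.

type EvaluationTier = "context_gathering" | "source_review" | "community_vote" | "final_summary";

interface TierContribution {
  tier: EvaluationTier;
  moderatorId: string;
  completedAt: Date;
}

interface PostEvaluation {
  postId: string;
  contributions: TierContribution[];
}

// Each tier builds on the previous one, so no single moderator handles a whole post.
const TIER_ORDER: EvaluationTier[] = [
  "context_gathering",
  "source_review",
  "community_vote",
  "final_summary",
];

// Illustrative currency payout per completed tier.
const CURRENCY_PER_TIER = 10;

function nextTier(evaluation: PostEvaluation): EvaluationTier | null {
  const completed = new Set(evaluation.contributions.map((c) => c.tier));
  return TIER_ORDER.find((t) => !completed.has(t)) ?? null;
}

function recordContribution(
  evaluation: PostEvaluation,
  moderatorId: string,
  balances: Map<string, number>
): void {
  const tier = nextTier(evaluation);
  if (tier === null) return; // evaluation already complete

  // A moderator may only contribute one tier per post, spreading the work out.
  if (evaluation.contributions.some((c) => c.moderatorId === moderatorId)) return;

  evaluation.contributions.push({ tier, moderatorId, completedAt: new Date() });
  balances.set(moderatorId, (balances.get(moderatorId) ?? 0) + CURRENCY_PER_TIER);
}

// Example: four different moderators complete the four tiers of one post.
const balances = new Map<string, number>();
const evaluation: PostEvaluation = { postId: "post-123", contributions: [] };
["alice", "bob", "cara", "dev"].forEach((id) => recordContribution(evaluation, id, balances));
console.log(evaluation.contributions.map((c) => c.tier)); // all four tiers, in order
console.log(balances); // each moderator earned 10 currency
```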

Evaluation:

After finalizing designs for each part of the system, we built low-fidelity prototypes, scripted them into clickable flows, and ran usability tests to evaluate them. The main takeaway from this testing led us to shift the design so that the warning banner for contentious posts appears halfway through a video: in theory, after the viewer is invested but before they have made up their mind.
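To illustrate the revised timing, here is a minimal sketch of how a warning banner could be triggered at a video's midpoint. The function name, element handles, and banner markup are assumptions for this example, not code from our prototype.

```typescript
// Minimal sketch: reveal the contentious-content warning once the viewer has
// watched half of the video, rather than showing it up front.

function attachMidpointWarning(video: HTMLVideoElement, banner: HTMLElement): void {
  let shown = false;
  banner.hidden = true;

  video.addEventListener("timeupdate", () => {
    // Trigger once playback passes the halfway mark (duration is 0/NaN until metadata loads).
    if (!shown && video.duration > 0 && video.currentTime >= video.duration / 2) {
      banner.hidden = false;
      shown = true;
    }
  });
}

// Usage (assuming these elements exist on the page):
// attachMidpointWarning(
//   document.querySelector<HTMLVideoElement>("#post-video")!,
//   document.querySelector<HTMLElement>("#context-banner")!,
// );
```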

Conclusion:

As social media companies grow, so does the diversity of internet cultures they host. This makes traditional moderation increasingly difficult, since it must adapt to the specific subgroup it moderates. Incentivising a community to perform its own moderation, with tools that help it foster perspective, addresses this concern.