
How Fandom Strengthens Community Safety and Unlocks Advertiser Growth with Coactive’s Multimodal AI

By the numbers:

  • 2.4M+ images uploaded monthly
  • 125K+ videos moderated monthly, with second-by-second detection
  • 74% fewer manual hours on image moderation
  • 95% fewer manual hours on video moderation
  • 50% cost savings across trust & safety operations
  • Expanded brand-suitable ad inventory across all formats

At a glance:

  • Founded: 2004
  • Started using Coactive: 2023
  • What they deliver: Community Experiences
  • Use cases: Content Moderation, Metadata Generation, Brand Suitability
  • Website: http://www.fandom.com

About Fandom

The World’s Largest Fan Platform

Fandom reaches nearly 350 million unique visitors per month across 250,000+ wikis. From Game of Thrones to My Little Pony, fans turn to Fandom for knowledge, connection, and community. The platform also powers advertising for leading brands across entertainment and gaming.

The Scale of the Challenge

Millions of Uploads, Thousands of Risks

Every month, Fandom users upload more than 2.4 million images and 125,000 videos. Even if only 0.1% were harmful or unsuitable, that would still mean roughly 2,400 problem images and 125 problem videos every month. At that volume, the platform faced:

  • Safety risks for users exposed to disturbing content
  • Complexity for advertisers, who need confidence that their campaigns appear only next to appropriate material
  • High operational cost, with 500+ contractor hours per week spent on manual moderation

Florent Blachot, VP of Data Science & Engineering at Fandom, put it simply:

Partnering with Coactive has transformed moderation from a bottleneck into a competitive advantage. We can now manage millions of uploads with confidence, protect our fans, and deliver the brand safety advertisers expect.

From Manual Burden to Automated Precision with Coactive

Coactive integrated directly into Fandom’s moderation stack, applying multimodal AI to images and video:

  • Automated moderation: 100% of images and videos reviewed instantly, with only edge cases surfaced for human review (see the triage sketch after this list)
  • Brand suitability filtering: Advertisers gain confidence their campaigns run only next to content aligned with their values
  • Quality assurance: Blurry, low-quality, or irrelevant uploads are flagged for re-upload
  • Enterprise-grade deployment: Coactive runs in a self-hosted environment, meeting Fandom’s security and scale requirements
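
The "edge cases only" pattern above is a confidence-based triage: uploads with confident model scores are handled automatically, and only ambiguous scores reach a human queue. The sketch below illustrates the idea in Python; the names, scores, and thresholds are assumptions for illustration, not Coactive's actual API.

```python
# Minimal sketch of a confidence-based moderation triage flow.
# `ModerationResult`, `triage`, and the thresholds are illustrative
# assumptions, not Coactive's actual API.
from dataclasses import dataclass

APPROVE_THRESHOLD = 0.95  # confident "safe" scores are auto-approved
REJECT_THRESHOLD = 0.95   # confident "harmful" scores are auto-removed

@dataclass
class ModerationResult:
    upload_id: str
    harmful_score: float  # model's probability that the upload is harmful

def triage(result: ModerationResult) -> str:
    """Route an upload: automate confident decisions, queue edge cases."""
    if result.harmful_score >= REJECT_THRESHOLD:
        return "auto_remove"
    if 1.0 - result.harmful_score >= APPROVE_THRESHOLD:
        return "auto_approve"
    return "human_review"  # only ambiguous scores reach human moderators

# A borderline score lands in the human-review queue.
print(triage(ModerationResult("img_123", harmful_score=0.40)))  # human_review
```

Tuning these thresholds is the lever that trades automation volume against human-review workload.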

Building Advertiser Confidence with Proactive Brand Suitability

Fandom worked with Coactive to identify gray-area content, material that may be appropriate for fans but not for certain brands, and to surface video content that was not yet monetized. This allowed Fandom to:

  • Expand brand-safe inventory across images and video, ensuring advertisers never run adjacent to inappropriate content
  • Maintain uninterrupted access to demand partners and DSPs (demand-side platforms advertisers use to buy inventory programmatically) by integrating Coactive into their brand safety and suitability framework
  • Decide which contextually appropriate content to monetize (see the tiering sketch after this list)
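
One common way to implement gray-area handling is to map moderation labels to suitability tiers and let each advertiser choose its tolerance. The sketch below is a hypothetical illustration; the labels, tiers, and the `eligible_for_campaign` helper are assumptions, not Fandom's or Coactive's actual taxonomy.

```python
# Minimal sketch of label-based brand-suitability tiering.
# The labels, tiers, and `eligible_for_campaign` are illustrative
# assumptions, not Fandom's or Coactive's actual taxonomy.

SUITABILITY_TIERS = {
    "violence_graphic": "unsafe",    # never monetized
    "violence_fictional": "gray",    # fine for fans, brand-dependent
    "horror_imagery": "gray",
    "general_fandom": "safe",
}

def eligible_for_campaign(content_labels: set[str], allow_gray: bool) -> bool:
    """Return True if content may carry this advertiser's campaign.

    `allow_gray` models an advertiser's tolerance for gray-area content,
    e.g. fictional violence on a fantasy wiki. Unknown labels default
    to "gray" so new content types fail safe into brand-dependent review.
    """
    tiers = {SUITABILITY_TIERS.get(label, "gray") for label in content_labels}
    if "unsafe" in tiers:
        return False
    if "gray" in tiers:
        return allow_gray
    return True

# Horror-adjacent content stays monetizable only for tolerant advertisers.
print(eligible_for_campaign({"horror_imagery"}, allow_gray=True))   # True
print(eligible_for_campaign({"horror_imagery"}, allow_gray=False))  # False
```

In this framing, gray-tier content stays monetizable for advertisers who opt in, which is how surfacing gray-area content expands inventory rather than shrinking it.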

Timothy Quievryn, Director of Community Safety, explained:

Community safety is always our first responsibility. Coactive strengthens that foundation and, at the same time, reassures advertisers that their campaigns are appearing in trusted, brand-appropriate contexts.

Looking Ahead

Expanding Brand Suitability and Modalities

Fandom and Coactive are exploring several next steps to deepen impact: scanning the historical backlog of 500M+ images to improve safety and unlock remonetization, moderating new types of content, and boosting the relevance and performance of contextual advertising.


About Coactive

Coactive AI helps data teams extract insights from unstructured image and video data. It integrates visual data with familiar SQL and big data tools, using pre-trained models for trend analysis, content moderation, search, and mapping.
