
Academic Consortium: Citation-Grade Disinformation Data at Scale

A consortium of five research universities needed cross-platform social data with methodology documentation meeting peer-review standards — without building their own pipeline. Rolli's API delivered normalized engagement data, authenticity scores, and reproducible exports across 8 platforms, enabling a 14-month longitudinal study that produced three peer-reviewed publications.

Published: Multi-University Research Consortium · January 2026


3
peer-reviewed publications produced using Rolli API data
8
platforms covered under a single normalized schema
14 mo
longitudinal study enabled without rebuilding collection infrastructure
94.2%
authenticity scoring precision on known CIB validation datasets

The Challenge

Academic disinformation research has a persistent infrastructure problem: the data requirements for rigorous cross-platform studies exceed what any single institution can maintain with its own crawling and collection infrastructure. Platform API access is inconsistent, rate-limited, and changes without notice. Normalizing data across platforms with different engagement metrics, content formats, and moderation regimes is a months-long engineering project before any research begins.
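The normalization problem described above can be sketched in a few lines. This is an illustrative example only: the field names, platform identifiers, and engagement weighting are assumptions, not Rolli's actual schema.

```python
from dataclasses import dataclass

# Hypothetical unified record; field names are illustrative,
# not Rolli's actual schema.
@dataclass
class NormalizedPost:
    platform: str
    post_id: str
    text: str
    engagement: float  # single comparable engagement figure

# Platform-specific payloads expose different metric names and scales.
RAW = [
    {"platform": "platform_a", "id": "a1", "body": "...", "likes": 120, "reposts": 30},
    {"platform": "platform_b", "id": "b9", "content": "...", "reactions": 45, "shares": 5},
]

def normalize(raw: dict) -> NormalizedPost:
    """Map each platform's raw fields onto the shared schema.

    The 2x weight on amplification (reposts/shares) is an arbitrary
    illustrative choice, not a documented Rolli weighting.
    """
    if raw["platform"] == "platform_a":
        return NormalizedPost(raw["platform"], raw["id"], raw["body"],
                              raw["likes"] + 2 * raw["reposts"])
    if raw["platform"] == "platform_b":
        return NormalizedPost(raw["platform"], raw["id"], raw["content"],
                              raw["reactions"] + 2 * raw["shares"])
    raise ValueError(f"unknown platform: {raw['platform']}")

posts = [normalize(r) for r in RAW]
```

Doing this mapping once, centrally, is what spares each institution the months-long engineering project the paragraph above describes.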

The consortium's specific challenge was a 14-month study tracking coordinated inauthentic behavior across three distinct policy events — requiring consistent data collection methodology, cross-platform normalization, and authenticity scoring that could survive peer-reviewer scrutiny. They needed a data source they could cite, not a one-off pipeline they'd built themselves.

The Approach

Rolli's API was integrated into the consortium's research infrastructure with a dedicated workspace providing normalized access to all 8 monitored platforms under a single schema. Data freshness guarantees, collection methodology documentation, and authenticity score transparency reports were provided as part of the research agreement — meeting the documentation requirements for methods sections in academic publications.
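As an illustration of what workspace-scoped, schema-normalized access might look like from the research side, the sketch below builds and issues a query against a hypothetical REST endpoint. The base URL, paths, and parameter names are assumptions for illustration, not Rolli's documented API.

```python
import json
import urllib.request

# Hypothetical endpoint; BASE, paths, and parameter names are
# illustrative placeholders, not Rolli's actual API surface.
BASE = "https://api.example.com/v1"

def build_posts_url(workspace: str, platform: str, since: str) -> str:
    """Construct a workspace-scoped query for one platform's records."""
    return f"{BASE}/workspaces/{workspace}/posts?platform={platform}&since={since}"

def fetch_posts(workspace: str, platform: str, since: str, token: str) -> list[dict]:
    """Fetch records; under a single schema, every platform's response
    parses the same way."""
    req = urllib.request.Request(
        build_posts_url(workspace, platform, since),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["items"]
```

The point of the single schema is that the consuming code above never branches on platform: one parser, eight sources.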

The consortium tracked three trigger events over 14 months: a federal legislative vote, a public health announcement, and a foreign policy decision. For each event, Rolli API provided real-time cross-platform data from the moment the event broke, with consistent scoring methodology across all three events enabling direct longitudinal comparison.

Authenticity score calibration data — documenting how the scoring model was constructed, its precision and recall rates on known CIB datasets, and its limitations — was provided to the team for disclosure in publications. This transparency is what made the data source citeable under peer-review standards.
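A calibration report of this kind rests on standard precision and recall measured against a labeled validation set. The sketch below shows the computation on toy data; the threshold, scores, and labels are invented for illustration and have no connection to Rolli's model.

```python
def precision_recall(scores, labels, threshold=0.5):
    """Score each account against a labeled CIB validation set.

    labels: 1 = known coordinated inauthentic account, 0 = authentic.
    An account is flagged when its authenticity-risk score crosses
    the (illustrative) threshold.
    """
    flagged = [s >= threshold for s in scores]
    tp = sum(1 for f, y in zip(flagged, labels) if f and y == 1)
    fp = sum(1 for f, y in zip(flagged, labels) if f and y == 0)
    fn = sum(1 for f, y in zip(flagged, labels) if not f and y == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Toy validation data, not real CIB measurements.
scores = [0.92, 0.81, 0.40, 0.77, 0.10, 0.65]
labels = [1,    1,    1,    0,    0,    1]
p, r = precision_recall(scores, labels)
```

Disclosing both numbers matters: a high precision figure like the 94.2% cited above says little on its own unless the accompanying recall and the composition of the validation set are reported alongside it, which is what the calibration documentation provides.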

The Findings

  • 3 peer-reviewed publications produced using Rolli API data
  • 8 platforms covered under a single normalized schema
  • 14 months of longitudinal study enabled without rebuilding collection infrastructure
  • 94.2% authenticity scoring precision on known CIB validation datasets

"The methodology documentation Rolli provided was the difference between data we could cite and data we couldn't. Three journals accepted work built on this infrastructure."

— Principal Investigator, Multi-University Research Consortium

See Rolli IQ applied to your issues

Request a demo and see a live intelligence demonstration on a topic relevant to your team.

Joined this week by 47 communications, security, and research teams

Request a Demo · Start Free Trial
Join 400+ organizations protecting their reputation. Average setup: 8 minutes.

Monitoring live in under 2 minutes  ·  No credit card  ·  Cancel anytime  ·  SOC 2–aligned