The experiment operated dynamically as participants used the platform normally during the 2024 presidential election. Over 1,000 users scrolled through their feeds believing they were seeing organic content while, behind the scenes, AI systems analyzed posts in real time and adjusted what appeared based on the research objectives, demonstrating how invisibly algorithmic manipulation can occur.
This real-time capability distinguishes the research from retrospective analyses. The system didn’t examine historical data after the fact but operated live as users engaged with the platform. Posts appeared, were instantly analyzed for divisive content markers, and were either amplified or suppressed accordingly. This happened continuously throughout the experiment week as participants scrolled.
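The pipeline described above can be sketched in miniature. Everything here is hypothetical: the study's actual model, markers, and thresholds are not given in this text, so a toy keyword heuristic stands in for the real classifier.

```python
# Hypothetical sketch of a live classify-and-rerank step. The keyword
# heuristic and threshold are illustrative stand-ins, not the study's model.

def divisiveness_score(post_text: str) -> float:
    """Toy scorer: fraction of words matching divisive-content markers."""
    markers = {"outrage", "enemy", "corrupt", "disgrace"}
    words = post_text.lower().split()
    return sum(w.strip(".,!?") in markers for w in words) / max(len(words), 1)

def rerank(feed: list[str], amplify: bool, threshold: float = 0.05) -> list[str]:
    """Move flagged posts to the top (amplify) or bottom (suppress),
    applied live as the feed is assembled, preserving order otherwise."""
    flagged = [p for p in feed if divisiveness_score(p) >= threshold]
    neutral = [p for p in feed if divisiveness_score(p) < threshold]
    return flagged + neutral if amplify else neutral + flagged

feed = ["Lovely weather today", "They are corrupt and a disgrace!"]
print(rerank(feed, amplify=True)[0])  # the divisive post surfaces first
```

The point of the sketch is the timing: scoring and reordering happen in the same pass that builds the feed, not in a later batch analysis.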
The invisibility of this manipulation mirrors everyday platform reality. Every social media user experiences constant algorithmic curation without awareness. Systems analyze behavior, content characteristics, and countless other variables to determine what appears in feeds. These decisions happen instantaneously and invisibly, shaping information environments without triggering conscious recognition.
What makes real-time manipulation particularly powerful is its adaptiveness. Rather than providing users with static content selections determined in advance, algorithms can respond dynamically to user behavior and contextual factors. If a particular post type generates strong engagement from specific users, algorithms can immediately show more similar content, creating feedback loops that intensify effects.
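That feedback loop can be made concrete with a small toy model. The multiplicative-boost rule below is an assumption for illustration, not the platform's actual ranking update.

```python
# Hypothetical engagement feedback loop: each time a user engages with a
# content type, its sampling weight is boosted and the weights renormalised,
# so similar content becomes more likely on the next scroll.

def update_weights(weights: dict[str, float], engaged_type: str,
                   boost: float = 1.5) -> dict[str, float]:
    """Multiply the engaged type's weight by `boost`, then renormalise."""
    new = dict(weights)
    new[engaged_type] *= boost
    total = sum(new.values())
    return {k: v / total for k, v in new.items()}

weights = {"divisive": 0.2, "neutral": 0.8}
for _ in range(5):                       # five engagements with divisive posts
    weights = update_weights(weights, "divisive")
print(round(weights["divisive"], 2))     # → 0.65: a minority share now dominates
```

Even a modest per-engagement boost compounds quickly, which is why the effect is described as a feedback loop rather than a one-off adjustment.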
Understanding this real-time dynamic has implications for potential interventions. Simple content moderation that removes problematic posts after they appear may prove insufficient if algorithms are actively amplifying divisive content in real time as it emerges. Effective interventions may need to address the amplification dynamics themselves rather than just removing the most extreme content after it has already reached massive audiences.