When the Levee Breaks: Without Bots, What Happens to Wikipedia’s Quality Control Processes?

I’ve written a number of papers about the role that automated software agents (or bots) play in Wikipedia, claiming that they are critical to its continued operation. This paper tests that hypothesis and introduces a metric for visualizing the speed at which fully-automated bots, tool-assisted cyborgs, and unassisted humans review edits in Wikipedia. In the first half of 2011, ClueBot NG, one of the most prolific counter-vandalism bots in the English-language Wikipedia, went down four separate times, with each period of downtime lasting from days to weeks. Aaron Halfaker and I used these periods of breakdown as naturalistic experiments to study Wikipedia’s quality control network. Our analysis showed that the overall time-to-revert for damaging edits almost doubled while this software agent was down. Yet although a significantly smaller proportion of edits made during the bot’s downtime were reverted immediately, most of those edits were eventually reverted. This suggests that human agents in Wikipedia took over this quality control work, but performed it at a far slower rate.
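
At its core, a time-to-revert metric is just the elapsed time between a damaging edit and the edit that undoes it. As a rough illustration (not the paper's actual implementation; the data and function names here are hypothetical), a minimal Python sketch might compute a median time-to-revert from timestamped edit/revert pairs like this:

```python
from datetime import datetime
from statistics import median

# Hypothetical (edit_time, revert_time) pairs for damaging edits.
# In practice, these would be extracted from Wikipedia's revision history.
reverted_edits = [
    (datetime(2011, 3, 1, 12, 0), datetime(2011, 3, 1, 12, 1)),    # bot-speed revert: ~1 minute
    (datetime(2011, 3, 1, 14, 30), datetime(2011, 3, 1, 14, 33)),  # bot-speed revert: ~3 minutes
    (datetime(2011, 4, 10, 9, 0), datetime(2011, 4, 10, 11, 45)),  # human-speed revert: hours
]

def median_time_to_revert(pairs):
    """Return the median time-to-revert, in seconds, for a list of
    (edit_time, revert_time) pairs."""
    return median((revert - edit).total_seconds() for edit, revert in pairs)

print(f"Median time-to-revert: {median_time_to_revert(reverted_edits):.0f} seconds")
```

Comparing this statistic across periods when the bot was up versus down is one simple way to see the slowdown the paper describes.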
