Here are the slides from a paper I presented at the Science and Technology in Society Conference, hosted by the AAAS this past weekend. I won an award for top paper in my section for it – so I’m pretty happy about it. The full paper is not up because it is a Frankenstein assemblage from my thesis, which I’ll be finishing up in less than a month.
We throw around the words “collective intelligence” and “wisdom of the crowds” quite a bit to describe “Web 2.0” sites like Wikipedia, but we hardly ever define what we mean by those terms, which is why they largely remain scare-quoted. Because of this, the door has been left wide open for scientists and journalistic defenders of science to critique Wikipedia and other social media sites as relativist, collectivist mobs that can do no more than aggregate the baseline opinion of what the masses perceive to be Truth. While Wikipedia does have epistemic standards, the open question is how such an epistemology can be operationalized and enforced. To answer this question, I examine Wikipedia in light of a distinction between an infrastructure of knowing (everything required to evaluate a statement as true or false) and an infrastructure of knowledge production (everything required to bring forth new statements with claims to truth or falsity). While the Wikipedian epistemology at the encyclopedic level is purely evaluative, refusing to publish original research and instead relying on reliable sources, this process is made possible by a non-encyclopedic form of knowledge production.
In short, for there to exist an infrastructure of knowing such that the evaluation of encyclopedia articles becomes possible, there must exist an infrastructure of knowledge production to generate and evaluate claims regarding the acts of editing. These include statements like “this edit is vandalism and needs to be reverted” or “this user is disruptive and needs to be blocked” – statements which require their own epistemic order for evaluation. Taking a cue from laboratory studies of scientific practice, I detail the way in which epistemic standards are “black boxed” into material technologies. In the same way that a mass spectrometer is the reification of dozens of now-unproblematic theories from physics, chemistry, and mathematics, so do the various software tools used by self-described “vandal fighters” reify Wikipedia’s epistemic standards. Similarly, just as various technologies had to be developed before experimental science could trump philosophical reasoning (like laboratory reports, which made experimental findings circulatable), so have various technologies been developed that make Wikipedia’s mechanisms of epistemic verification and enforcement possible.
By detailing all the human and non-human actors at work in the banning of a vandal, I show how a group of seemingly disconnected editors contributed to a process of knowledge production necessary for the enforcement of epistemic standards. In this way, collective intelligence was made possible in Wikipedia, but not because of a mystical or anarchistic wisdom of crowds. Instead, these encyclopedic epistemic standards could be enforced because various human and non-human actors were constantly working to hold together an infrastructure of non-encyclopedic knowledge production.
Link: Working Within Wikipedia: Infrastructures of Knowing and Knowledge Production (PDF, 901 KB)