In the Wikipedia research community — that is, the group of academics and Wikipedians who are interested in studying Wikipedia — there has been a substantial and longstanding problem with how research is published. Academics, from graduate students to tenured faculty, are deeply invested and entrenched in a system that rewards the publication of research. Publish or perish, as we’ve all heard. The problem is that the overwhelming majority of publications recognized as ‘academic’ require us to assign copyright to the publisher, so that the publisher can then charge for access to the article. This is in direct contradiction with the goals of Wikipedia, as well as many other open source and open content creation communities — communities which are the subject of a substantial amount of academic research.
With the help of my advisor, Dr. David Ribes, I recently got a chapter of my master’s thesis accepted to the ACM conference on Computer Supported Cooperative Work, to be held in February 2010 in Savannah, Georgia. It is titled “The Work of Sustaining Order in Wikipedia: The Banning of a Vandal” and focuses on the roles of automated ‘bots’ and assisted editing tools in Wikipedia’s ‘vandal fighting’ network.
Abstract: In this paper, we examine the social roles of software tools in the English-language Wikipedia, specifically focusing on autonomous editing programs and assisted editing tools. This qualitative research builds on recent research in which we quantitatively demonstrate the growing prevalence of such software in recent years. Using trace ethnography, we show how these often-unofficial technologies have fundamentally transformed the nature of editing and administration in Wikipedia. Specifically, we analyze ‘vandal fighting’ as an epistemic process of distributed cognition, highlighting the role of non-human actors in enabling a decentralized activity of collective intelligence. In all, this case shows that software programs are used for more than enforcing policies and standards. These tools enable coordinated yet decentralized action, independent of the specific norms currently in force.
This week, I’m presenting a poster at WikiSym 2009 on “The Social Roles of Bots and Assisted Editing Tools.” Most of the work is distilled from my thesis.
Abstract: This project investigates various software programs as non-human social actors in Wikipedia, arguing that their influence must not be overlooked in research on the on-line encyclopedia project. Using statistical and archival methods, the roles of assisted editing programs and bots are examined. First, the proportion of edits made by these non-human actors is significantly higher than earlier research has described. Second, these actors have moved into new spaces, changing not just the practice of article writing and reviewing, but also administrative work.
And if you are interested in this topic, check out the full paper, The Work of Sustaining Order in Wikipedia: The Banning of a Vandal.
A few months ago, I had the pleasure of presenting at the first (hopefully annual) WikiConference New York, sponsored by the Wikimedia New York City chapter with assistance from Free Culture @ NYU and the Information Law Institute at NYU’s law school. I know that I am atrociously late in writing this post, but I’m not really writing it for the Wikipedians out there. Rather, the WikiConference was an interesting experiment that seemed to apply Wikipedia’s philosophy towards editing to a conference, resulting in what the organizers called a “modified unconference.”
Here are the slides from a paper I presented at the Science and Technology in Society Conference, hosted by the AAAS this past weekend. I won an award for top paper in my section for it – so I’m pretty happy about it. The full paper is not up because it is a Frankenstein assemblage from my thesis, which I’ll be finishing up in less than a month.