Closed-source papers on open source communities: a problem and a partial solution

In the Wikipedia research community (that is, the group of academics and Wikipedians interested in studying Wikipedia) there is a substantial and longstanding problem with how research is published. Academics, from graduate students to tenured faculty, are deeply invested in a system that rewards the publication of research. Publish or perish, as we've all heard. The problem is that the overwhelming majority of publications recognized as 'academic' require authors to assign copyright to the publisher, which can then charge for access to the article. This is in direct contradiction with the goals of Wikipedia, as well as those of many other open source and open content creation communities, communities that are themselves the subject of a substantial amount of academic research.


The Lives of Bots

I’m part of a Wikipedia research group called “Critical Point of View,” centered around the Institute of Network Cultures in Amsterdam and the Centre for Internet and Society in Bangalore.  (A disclaimer: ‘critical’ here is meant in the sense of critical theory, not Wikipedia-bashing for its own sake.)  We’ve had some great conferences and are putting out an edited book on Wikipedia quite soon.  My chapter is on bots; the abstract and a link to the full PDF are below:

I describe the complex social and technical environment in which bots exist in Wikipedia, emphasizing not only how bots produce order and enforce rules, but also how humans produce bots and negotiate rules around their operation.  After giving a brief overview of how previous research into Wikipedia has tended to misconceptualize bots, I present a case study tracing the life of one such automated software agent and how it came to be integrated into the Wikipedian community.

The Lives of Bots [PDF, 910KB]

The Work of Sustaining Order in Wikipedia: The Banning of a Vandal

With the help of my advisor, Dr. David Ribes, I recently got a chapter of my master’s thesis accepted to the ACM conference on Computer Supported Cooperative Work, to be held in February 2010 in Savannah, Georgia. It is titled “The Work of Sustaining Order in Wikipedia: The Banning of a Vandal” and focuses on the roles of automated ‘bots’ and assisted editing tools in Wikipedia’s ‘vandal fighting’ network.

Abstract: In this paper, we examine the social roles of software tools in the English-language Wikipedia, specifically focusing on autonomous editing programs and assisted editing tools. This qualitative research builds on recent research in which we quantitatively demonstrate the growing prevalence of such software in recent years. Using trace ethnography, we show how these often-unofficial technologies have fundamentally transformed the nature of editing and administration in Wikipedia. Specifically, we analyze ‘vandal fighting’ as an epistemic process of distributed cognition, highlighting the role of non-human actors in enabling a decentralized activity of collective intelligence. In all, this case shows that software programs are used for more than enforcing policies and standards. These tools enable coordinated yet decentralized action, independent of the specific norms currently in force.

Download the full paper (PDF)

Wikisym Poster: The Social Roles of Bots and Assisted Editing Tools

This week, I’m presenting a poster at WikiSym 2009 on “The Social Roles of Bots and Assisted Editing Tools.”  Most of the work is distilled from my thesis.

Abstract: This project investigates various software programs as non-human social actors in Wikipedia, arguing that their influence must not be overlooked in research on the on-line encyclopedia project. Using statistical and archival methods, the roles of assisted editing programs and bots are examined. First, the proportion of edits made by these non-human actors is significantly higher than described in earlier research. Second, these actors have moved into new spaces, changing not just the practice of article writing and reviewing, but also administrative work.

Download the Poster (PDF)

Download the Extended Abstract (PDF)

And if you are interested in this topic, check out the full paper, The Work of Sustaining Order in Wikipedia: The Banning of a Vandal.

WikiConference New York: An Open Unconference

Jimmy Wales speaking at the conference keynote, by Laurence Parry (GreenReaper), CC BY-SA 3.0

A few months ago, I had the pleasure of presenting at the first (hopefully annual) WikiConference New York, sponsored by the Wikimedia New York City chapter with assistance from Free Culture @ NYU and the Information Law Institute at NYU’s law school. I know I am atrociously late in writing this post, but I’m not really writing it for the Wikipedians out there. Rather, the WikiConference was an interesting experiment that applied Wikipedia’s editing philosophy to the conference format, resulting in what the organizers called a “modified unconference.”

Working Within Wikipedia: Infrastructures of Knowing and Knowledge Production

Here are the slides from a paper I presented at the Science and Technology in Society Conference, hosted by the AAAS this past weekend. I won the award for top paper in my section, so I’m pretty happy about it. The full paper is not posted because it is a Frankenstein assemblage from my thesis, which I’ll be finishing in less than a month.


Do you support Wikipedia? News from the Trenches of the Science Wars 2.0

This is a paper I wrote for a class on “Technology and Critique” – a class that blended critical theory with Science and Technology Studies.  Taking its title from Bruno Latour’s “Do You Believe in Reality? News from the Trenches of the Science Wars,” this work is a critical examination of the way in which the on-line encyclopedia Wikipedia has been implicitly cast as a continuation of the Science Wars.  Instead of debating the efficacy and authority of science, academics are now debating the efficacy and authority of Wikipedia. Using Martin Heidegger’s work on ontology and technology, I argue that this particular academic mindset is a way of being-in-the-world that works to either affirm or negate the integration of Wikipedia into its particular projects – namely, the production of academic knowledge.  However, I show that asking whether Wikipedia is a reliable academic source enframes Wikipedia into an objectless standing-reserve of potential citations, foreclosing many other possibilities for its use.  Instead of following Stephen Colbert and countless academics in asking what Wikipedia has done to reality, I ask: what have we done to Wikipedia in the name of reality?


Researching Wikipedia Holistically: A Tentative Approach

This is a tentative article-length introduction to my thesis on Wikipedia. It is an attempt to analyze Wikipedia from an interdisciplinary perspective, one that problematizes various assumptions, concepts, and relations that function quite well in the “real world” but are not well-suited to studying Wikipedia. I begin by discussing the nature of academic disciplines, then proceed to a selective but detailed review of prior research on Wikipedia. By examining the problems in previous research within the context of disciplines, I establish a tentative methodology for a holistic study of Wikipedia.