Thursday, May 15, 2008

Verifiability and Control Policy Guidelines: Evaluating Quality for Wikipedia

The immense volume of information available online makes it increasingly hard for internet users to single out data that is verifiable and trustworthy. The ability to evaluate the quality of online resources is fast becoming essential to discerning internet literacy, and understanding how quality is evaluated on wiki-based communication portals is now more vital than ever. One of the best-known wikis in use worldwide is Wikipedia – a collaborative website that relies on the active participation of willing produsers to update and upload information. There are currently 2,371,994 articles in the English Wikipedia, and the upward trend lines in the graphs below (see link for larger image) demonstrate its popularity through the annual increase in the site’s article count and growth rate.

[Graphs omitted: English Wikipedia article count and annual growth rate over time]
The future struggle for online users can therefore be pinpointed not as a lack of information, but as the need to sort and aggregate available information by developing avenues that “genuinely enhances our understanding, and to screen out the rest” (Blood 2002, 12).

One of my previous posts asked whether digital technologies aid or hinder productivity, and the issue of evaluating quality is intrinsic to this emerging problem. One could argue that if we – as internet publics – were able to sort through information and evaluate the quality of data more efficiently, the ‘overloaded’ cyberspace we now deal with could become more manageable. The same concept applies to Wikipedia, whose evolution to date has brought with it a mixture of community-based strengths and controversy-driven weaknesses.

As a specific case study, wiki publications can be examined as a genre under heightened scrutiny, as readers constantly battle with uncertainty about the source of information. These vulnerabilities to flawed information can, however, actually decrease as the number of participants actively contributing through proactive correction grows. In her lecture Wikipedia: Adding Your Knowledge, Collis (2008) argues that the old stigma surrounding Wikipedia is gradually being eroded by its growing credibility as a medium. In the education context, it is interesting to note that many media academics now accept, and even encourage, students using and referencing Wikipedia in assignments, given the high-vetting mechanisms data is assumed to withstand in order to remain published (Collis 2008). This shift shows changing attitudes towards the perceived value of the ‘human store of knowledge’, but it still poses problems regarding the quality of articles of a more sensitive nature.

The question of how quality is evaluated is at the core of this discussion, and it depends on the types of answers one is looking for. Jenkins (quoted in Bruns 2008, 108) states that “the Wikipedia community, at its best, functions as a self-correcting adhocracy. Any knowledge that gets posted can and most likely will be revised and corrected by other readers.” As mentioned earlier, many academics share this view, which explains one way in which wikis benefit from being subjected to the scrutiny of millions.

Problems have arisen with controversial topics – for example 9/11 conspiracy theories, calorie restriction, homosexuality and Peter Falconio – and Wikipedia administrators have the power to freeze such pages to prevent any further editing. The constant battle between opinion and hearsay, fact and verifiability is vividly demonstrated by Wikipedia, and Laugher (2008), a Wikipedia editor, identifies below some of the policies and guidelines established to maintain quality control on the site:

  • Neutral point of view should be used
  • Tone should be consistent with the article
  • No sock puppetry – the development of multiple profiles to publish similar content
  • No edit warring – the constant overruling of another’s posts
  • Verifiability should be upheld through justification, not by whether an author believes the published material to be true

The quality of articles on Wikipedia is entrenched in the policy guidelines, which stipulate that a neutral point of view and verifiability be properly applied to any published work. Alongside these sit the rigorous monitoring carried out by Wikipedia staff and tools such as WikiScanner, which traces anonymous edits back to the IP addresses of the organisations they came from. It has identified multiple scandals, including one in which the Australian Prime Minister’s Office and Department of Defence spent hours of tax-payer time editing their own Wikipedia pages (Computerworld 2008).
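The attribution performed by WikiScanner-style tools boils down to matching the IP address logged against an anonymous edit to an organisation’s publicly registered IP range. A minimal sketch of that lookup in Python, using a hypothetical ORG_RANGES table with made-up allocations (the real tool cross-referenced edits against public WHOIS records):

```python
import ipaddress

# Hypothetical sample data: organisations and their registered IP ranges.
# These ranges are documentation addresses, not real allocations.
ORG_RANGES = {
    "Example Government Department": ipaddress.ip_network("203.0.113.0/24"),
    "Example University": ipaddress.ip_network("198.51.100.0/25"),
}

def attribute_edit(edit_ip: str) -> str:
    """Return the organisation whose registered range contains edit_ip,
    or 'unknown' if no known range matches."""
    addr = ipaddress.ip_address(edit_ip)
    for org, network in ORG_RANGES.items():
        if addr in network:
            return org
    return "unknown"

# Anonymous Wikipedia edits are recorded under the editor's IP address,
# so matching that address against allocation records reveals which
# organisation's network the edit came from.
print(attribute_edit("203.0.113.42"))  # inside the department's range
print(attribute_edit("192.0.2.7"))     # outside every known range
```

The lookup itself is trivial; the power of the real tool came from the scale of the WHOIS dataset it matched against, which is why edits from government networks could be surfaced automatically.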

The abovementioned examples bring to light the vulnerabilities of the wiki format, as examined through the well-established Wikipedia site. As improvements in technology slowly eradicate these problems, the quality of articles is becoming less compromised and more valued in the knowledge-based economy.

References

Blood, R. 2002. The Weblog Handbook. United States of America: Perseus Publishing.

Bruns, A. 2008. Wikipedia: Representations of knowledge. In Blogs, Wikipedia, Second Life, and beyond: From production to produsage, 101-146. New York: Peter Lang. https://cmd.qut.edu.au/cmd/KCB201/KCB201_BK_163521.pdf (accessed May 15, 2008).

Collis, C. 2008. KCB201 virtual cultures: Week 11 lecture notes. Wikipedia: expanding your knowledge. QUT: Kelvin Grove.

Computerworld. 2008. Who’s behind Wikipedia? http://www.computerworld.com.au/index.php/id;1866322157;fp;4;fpid;1968336438;pf;1 (accessed May 15, 2008).
