Thursday, May 15, 2008

Debate in the 'Wiki World': Edit Wars

The controversies surrounding the differing opinions of online participants in Wikis have given rise to a number of 'edit wars', in which individuals or groups of editors of a particular Wiki become embroiled in an online argument over whose slant on an article is deemed 'correct'. These battles can sometimes escalate into fierce exchanges, however "most participants in these arguments recognise that it is (mostly) tongue-in-cheek" (Wikipedia).

Bruns (2008, 120) refers to these occurrences as 'revert wars', emerging "between opposing factions which in turn remove one another's page edits", noting that "the deliberate defacement of oppositional pages, do[es] occur".

Below is a short list of edit wars as identified by Computerworld (2008).
  • Labelling of 'petrol' or 'gasoline'.

  • Where Nikola Tesla, Freddie Mercury, Copernicus and Jennifer Aniston were really born.

  • Whether the symbol for C# programming language should be written with a hash or with the musical sharp symbol.

  • Whether the planet Pluto should be referred to as 134340 Pluto, or just plain Pluto.

  • Whether a Queen dead for over a century should still be referred to as 'Her Majesty'.

  • Bill & Ted's Excellent Adventure: Was it released in 1988, or 1989?

  • The real height of Andre the Giant.

  • The Death Star: is it 120km or 160km in diameter, or even 900km? Is the hyperdrive class 3 or 4?

  • Are potato chips flavored or flavoured - as a compromise they become seasoned.

  • Periods vs full stops, and

  • What really goes into an Irish breakfast?


For further reading see the full list of 'lamest edit wars ever'.

As this list demonstrates, even trivial points of social and cultural relevance become open to debate in the 'Wiki World', often with humorous results.



References

Bruns, A. 2008. Wikipedia: Representations of knowledge. In Blogs, wikipedia, second life, and beyond: From production to produsage. 101-146. New York: Peter Lang. https://cmd.qut.edu.au/cmd/KCB201/KCB201_BK_163521.pdf (accessed May 15, 2008).


Computerworld. 2008. Who’s behind Wikipedia?
http://www.computerworld.com.au/index.php/id;1866322157;fp;4;fpid;1968336438;pf;1 (accessed May 15, 2008).

Verifiability and Control Policy Guidelines: Evaluating Quality for Wikipedia

The immense volume of information available online means it is becoming increasingly hard for internet users to single out data that is verifiable and trustworthy. The ability to evaluate the quality of online resources is fast becoming essential to discerning internet literacy, and the ways in which quality is evaluated on wiki-based communication portals are now more important than ever. One of the best-known wikis in use worldwide is Wikipedia – a collaborative website which relies on the active participation of willing produsers to update and upload information. There are currently 2,371,994 articles in the English Wikipedia, and the upward trend lines in the graphs below (see link for larger image) demonstrate its popularity through the annual increase in the site's article count and growth rate.




The future struggle for online users can therefore be pinpointed not as a lack of information, but as the need to sort and aggregate available information, developing avenues to find the material that "genuinely enhances our understanding, and to screen out the rest" (Blood 2002, 12).

One of my previous posts asked whether digital technologies aid or hinder productivity, and the issue of evaluating quality is intrinsic to this emerging problem. One could argue that if we, as internet publics, were able to sort through information and evaluate the quality of data more efficiently, the 'overloaded' cyberspace we now deal with could become more manageable. This concept also applies to Wikipedia, which over the course of its evolution has accumulated a variety of community-based positives and controversy-based negatives.

As a specific case study, wiki publications can be examined as a genre currently under heightened scrutiny, as readers constantly battle with uncertainty about the sources of information. These vulnerabilities to flawed information can, however, actually decrease as the number of participants actively contributing through proactive correction grows. In her lecture Wikipedia: Adding Your Knowledge, Collis (2008) argues that the old stigma around Wikipedia is gradually being eroded by its growing credibility as a medium. In the education context, it is interesting to note that many media academics now consider it acceptable, and even encourage, students to use and reference Wikipedia in assignments, given the vetting that data must withstand to remain published (Collis 2008). This shift shows the change in attitudes towards the perceived value of the 'human store of knowledge', yet still poses problems regarding the quality of articles of a more sensitive nature.

The question of how quality is evaluated is at the core of this discussion, and the answer depends on the types of answers one is looking for. Jenkins (quoted in Bruns 2008, 108) states that "the Wikipedia community, at its best, functions as a self-correcting adhocracy. Any knowledge that gets posted can and most likely will be revised and corrected by other readers." As mentioned earlier, many academics share this view, which explains one way in which wikis benefit from being subjected to the scrutiny of millions.

Problems have arisen with controversial topics, for example 9/11 conspiracy theories, calorie restriction, homosexuality and Peter Falconio, and Wikipedia administrators have the power to freeze such pages to stop any further editing. The constant battle between opinion and hearsay, fact and verifiability, is clearly demonstrated by Wikipedia, and Laugher (2008), a Wikipedia editor, identifies below some of the policies and guidelines established to maintain quality control on the site:

  • A neutral point of view should be used
  • Tone should be consistent with the article
  • No sock puppetry – the development of multiple profiles to publish similar content
  • No edit warring – the constant overruling of another's posts (a sketch of how such behaviour might be flagged follows this list)
  • Verifiability should be upheld through justification, not through an author's belief that the published material is true
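
As a rough illustration of how the 'no edit warring' rule can be enforced, here is a minimal Python sketch (my own illustration, not Wikipedia's actual tooling) that flags editors who revert an article more than three times within 24 hours, the threshold of Wikipedia's 'three-revert rule':

    from collections import defaultdict
    from datetime import datetime, timedelta

    THREE_REVERT_LIMIT = 3         # Wikipedia's 'three-revert rule' threshold
    WINDOW = timedelta(hours=24)   # reverts are counted over a rolling 24-hour period

    def flag_edit_warriors(revert_log):
        """revert_log: list of (editor, timestamp) pairs, one per revert on a single article."""
        reverts = defaultdict(list)
        for editor, when in revert_log:
            reverts[editor].append(when)

        flagged = []
        for editor, times in reverts.items():
            times.sort()
            # slide a 24-hour window across each editor's reverts
            for start in times:
                in_window = [t for t in times if start <= t < start + WINDOW]
                if len(in_window) > THREE_REVERT_LIMIT:
                    flagged.append(editor)
                    break
        return flagged

    # Example: an editor reverting four times in one afternoon would be flagged.
    log = [
        ("EditorA", datetime(2008, 5, 15, 13, 0)),
        ("EditorA", datetime(2008, 5, 15, 13, 30)),
        ("EditorA", datetime(2008, 5, 15, 14, 0)),
        ("EditorA", datetime(2008, 5, 15, 15, 0)),
        ("EditorB", datetime(2008, 5, 15, 13, 10)),
    ]
    print(flag_edit_warriors(log))  # ['EditorA']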

The quality of articles on Wikipedia is entrenched in policy guidelines which stipulate that a neutral point of view and verifiability be properly applied to any published work. Alongside these sit the monitoring efforts of Wikipedia's administrators and of tools such as WikiScanner, which traces anonymous edits back to the IP addresses they came from and has exposed multiple scandals, including one involving staff in the Australian Prime Minister's Office and Department of Defence spending hours of taxpayer time editing their organisations' Wikipedia pages.
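
To give a sense of the kind of matching such a tool performs, below is a hypothetical Python sketch that checks whether an anonymous edit's IP address falls inside a known organisation's address range; the organisation names and address blocks here are invented placeholders, not real data:

    import ipaddress

    # Hypothetical organisational address blocks (a real tool would draw on large registries of them).
    ORG_RANGES = {
        "Example Government Department": ipaddress.ip_network("203.0.113.0/24"),
        "Example Corporation": ipaddress.ip_network("198.51.100.0/24"),
    }

    def attribute_edit(edit_ip):
        """Return the organisation, if any, whose registered range contains an anonymous edit's IP."""
        ip = ipaddress.ip_address(edit_ip)
        for org, network in ORG_RANGES.items():
            if ip in network:
                return org
        return None

    print(attribute_edit("203.0.113.42"))  # Example Government Department
    print(attribute_edit("192.0.2.1"))     # None - no matching range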

The abovementioned examples highlight the vulnerabilities of the Wiki format, as examined through the well-established Wikipedia site. As improvements in technology slowly begin to eradicate these problems, the quality of wiki articles is becoming less compromised and more valued in the knowledge-based economy.

References

Blood, R. 2002. The Weblog Handbook. United States of America: Perseus Publishing.

Bruns, A. 2008. Wikipedia: Representations of knowledge. In Blogs, wikipedia, second life, and beyond: From production to produsage. 101-146. New York: Peter Lang. https://cmd.qut.edu.au/cmd/KCB201/KCB201_BK_163521.pdf (accessed May 15, 2008).

Collis, C. 2008. KCB201 virtual cultures: Week 11 lecture notes. Wikipedia: expanding your knowledge. QUT: Kelvin Grove.

Computerworld. 2008. Who’s behind Wikipedia?
http://www.computerworld.com.au/index.php/id;1866322157;fp;4;fpid;1968336438;pf;1 (accessed May 15, 2008).

Wednesday, May 14, 2008

Social Touch Points: Where Marketing and Media Converge

There are many ways in which social networking is becoming increasingly intrinsic to marketing campaigns aimed at a young target market. By diversifying advertising and communications through social sites, a more direct and targeted exposure is presumably achieved.

I just came across another example of this in the new campaign for Pump water, a Coca-Cola-owned brand. Together with Sound Alliance, the brand will create a fortnightly dance show called inthemix.tv, available through inthemix.com.au to be streamed online or downloaded to your PC, Mac, iPod or mobile (Inthemix Local News).

Interesting placement opportunities arise for online networks and communities with this form of 'episodic' digital marketing, as the content can be "embedded into personal blogs, Facebook pages...(and) forum posts" whilst also being tagged through social bookmarking sites such as del.icio.us, Digg and StumbleUpon.

The multi-faceted benefits of such flexible content can be seen fully in the crossover of the media and digital industries, which converge in new and exciting touch points for interested audiences and result in heightened levels of engagement.

Wednesday, May 7, 2008

Do Digital Technologies Aid or Hinder Productivity?

As individuals are exposed to increasing amounts of web-based content, the question lingers as to whether digital technologies aid or hinder a person's productivity. There are many arguments on either side of this debate, mostly hinging on the effective management of information and the ways in which we can harness online tools to better organise the flow of data between the web and us.

The Radicati Group (2007) predicts that by 2009 we'll spend 41% of our time managing emails. This startling statistic brings to light the reality that we, as online participants in multiple communities and subscribers to a plethora of information sources, have the potential to be overwhelmed by our desire to be 'connected'.

To manage the immense volumes of data which users seek, a variety of online tools have been developed to minimise the amount of time wasted trawling through this data on a daily basis. Between reading industry newsletters, blogs, breaking current affairs and personal emails, time usually reserved for productive work is now largely being consumed. Of course, there is the argument that to be ahead of one's field, one has a duty to be informed – and this is where both sides of the debate converge.

Merlin Mann has articulated a range of ways in which our attitudes, use of tools, and development of skills can be applied to empty an email inbox in the quickest way possible. His personal blog, Inbox Zero, lists the benefits of strategies such as:


  • Managing email filters - to separate information (a sketch of this idea follows the list)
  • The importance of deletion - to clear out useless information
  • The scheduling of 'email dashes' - to set aside blocks of time to deal with email
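
To make the 'email filters' idea concrete, here is a minimal Python sketch using the standard imaplib module; the server name, credentials, sender addresses and the 'Newsletters' folder are placeholder assumptions rather than a real setup:

    import imaplib

    HOST = "imap.example.com"            # placeholder mail server
    USER, PASSWORD = "me", "secret"      # placeholder credentials
    NEWSLETTER_SENDERS = ["news@example.org", "digest@example.net"]

    mail = imaplib.IMAP4_SSL(HOST)
    mail.login(USER, PASSWORD)
    mail.select("INBOX")

    for sender in NEWSLETTER_SENDERS:
        # find every inbox message from this sender
        status, data = mail.search(None, '(FROM "{0}")'.format(sender))
        for num in data[0].split():
            msg_id = num.decode()
            mail.copy(msg_id, "Newsletters")           # assumes a 'Newsletters' mailbox already exists
            mail.store(msg_id, "+FLAGS", "\\Deleted")  # clear it from the inbox for a later 'email dash'

    mail.expunge()
    mail.logout()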

Although seen by some as extreme behaviour, these tactics employed and explained by Mann are being adopted worldwide to deal with the mass overload of online information. This is particularly pertinent in the workplace.

Another basic example of managing the flow of information is the Google Reader facility, which acts as a collector of all the information a user subscribes to through RSS feeds. This saves users time by letting them search for specific terms within their own personal database of information providers.
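
As a rough sketch of what an aggregator does behind the scenes (this is not Google Reader's actual mechanism), the following Python example uses the third-party feedparser library to pull a couple of hypothetical subscriptions and search them for a term:

    import feedparser  # third-party RSS/Atom parsing library

    # Hypothetical subscription list; a reader service maintains this for you.
    SUBSCRIPTIONS = [
        "http://example.com/industry-news/rss",
        "http://example.org/blog/atom.xml",
    ]

    def search_subscriptions(term):
        """Return (feed title, entry title, link) for every subscribed entry mentioning the term."""
        hits = []
        for url in SUBSCRIPTIONS:
            feed = feedparser.parse(url)
            for entry in feed.entries:
                text = entry.get("title", "") + " " + entry.get("summary", "")
                if term.lower() in text.lower():
                    hits.append((feed.feed.get("title", url), entry.get("title", ""), entry.get("link", "")))
        return hits

    for feed_title, entry_title, link in search_subscriptions("produsage"):
        print(feed_title + ": " + entry_title + " -> " + link)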

Bookmarking is also a way of tagging articles and web pages of interest, and it is becoming increasingly popular due to the ease with which users can tag relevant articles without a large interruption to their web surfing. Sites such as del.icio.us and Instapaper can be used to tag online content so it can be categorised and referred to at a later time.
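
Underneath these services sits a simple idea: content keyed by tags. The following toy Python sketch (a local illustration, not the del.icio.us API) shows how tagged bookmarks can be stored and later retrieved by tag:

    from collections import defaultdict

    # A toy model of social bookmarking: each saved page carries a set of tags,
    # and the index lets everything filed under a tag be pulled back later.
    tag_index = defaultdict(list)

    def bookmark(url, title, tags):
        for tag in tags:
            tag_index[tag].append((title, url))

    bookmark("http://produsage.org/", "Produsage.org", ["produsage", "web2.0"])
    bookmark("http://en.wikipedia.org/wiki/Wiki", "Wiki - Wikipedia", ["wiki", "web2.0"])

    # Everything tagged 'web2.0', ready to be revisited later.
    for title, url in tag_index["web2.0"]:
        print(title + ": " + url)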

Creators of these online tools understand the innate human desire to draw on multiple sources of information, and they recognise that purpose-built tools can lead to the effective management of personal information flow. The rise of these tools suggests that, with time, digital technologies can aid productivity, provided they are easily integrated into daily life.

References

The Radicati Group, Inc. 2007. Addressing information overload in corporate email: the economics of user attention. http://www.seriosity.com/downloads/Seriosity%20White%20Paper%20-%20Information%20Overload.pdf (accessed May 7, 2008).

Thursday, May 1, 2008

How is Open Source Work (as an example of community produsage) different from Commercial Production?


The key characteristics of open source programming liberate consumers by providing them with access to the source code and giving them the power to collaborate and build on the work of others.

The Four Freedoms, as defined by Stallman, provide a framework for understanding how open source software can be aligned with community produsage, as opposed to commercial production created purely for economic gain. In summary, the Freedoms describe the free software movement and the benefits that communities can gain from software architecture which encourages shared ownership. They are outlined below:

Freedom 0 - Personal motives
Giving publics the right to run software as they wish.

Freedom 1 - Helping yourself
Adapting the software for your needs.

Freedom 2 - Helping your neighbour
Allowing for the copying and distribution of the software.

Freedom 3 - Helping your community
Freedom to improve the software and redistribute this for the overall benefit of the community.

The key assumptions underpinning the open source framework are that everyone has a contribution to make, experimentation is encouraged, people will contribute if it is easy and beneficial to do so, and, most importantly, that shared ownership is established.

Because ownership is shared, the commercial goals traditionally pursued by proprietary software developers cannot be achieved in the same way, and open source software instead provides a free alternative for informed and proactive online communities.

Sunday, April 27, 2008

What are the differences between Commercial Production and Community Produsage?

The tug-of-war between commercial production and community produsage hinges on identifying the conflicting motivations underlying each of these concepts. When addressing this question one must first define the terms involved.

Commercial production will be discussed here as activity whose foremost intention is to create products with the potential to drive future economic growth. The basic premise of 'making money' is what fuels commercial behaviour when choices are made about production and consumption methods, and deconstructing commercial motivations provides a framework against which community produsage can be compared.

Community produsage sits in juxtaposition to this: as conceptualised by Axel Bruns, "participants are engaged in a collaborative and continuous building and extending of existing content in pursuit of further improvement" (The Future Is User-Led: The Path towards Widespread Produsage). It should be noted that the motivations behind community produsage are far more altruistic than those of commercial production, with the sharing of knowledge, ideas and concepts a common occurrence between communal network peers.

Some key ideas to be analysed when discussing the relationship between these two frameworks are:

  • Issues surrounding intellectual property
  • The commercial vs. the produser 'value chain'
  • Value creation and commercialisation and
  • The effects of open source software development

CurrentTV and V-CAM (Viewer Created Ad Messages)

An example which harnesses the intersection of commercial production and community produsage is the Emmy Award-winning, user-generated content station Current TV (http://www.current.tv/), premised on 'citizen journalists' uploading their own news stories via the web to be shared. In a two-sided challenge to commercial media producers, a notable feature of Current TV is that it pays users to create advertisements, called V-CAMs (Viewer Created Ad Messages). Current TV has also allowed companies to be used as subjects for advertising competitions, with the winning execution subsequently screened on the network. A press release from Current TV details the story of a 19-year-old creator whose winning V-CAM advertisement for Sony Bravia was titled "Transformation", and on a commercial note it should be mentioned that each ad screened on Current TV earns its creator $1,000. The potential earnings rise to as much as $50,000, however, if the company involved decides to use the advertisement through other media channels.

By taking advantage of community produsage, advertisers are finding ways to capitalise on the benefits of having "viable alternative(s) to commercial products" (Bruns 2008, 62), only truly revealed through the flexibility of digital media.

The negative aspects of these power shifts are felt directly in the media and advertising industries, as their services come under threat from emerging, empowered and proactive produsers. This New York Times article, titled An Agency's Worst Nightmare: Ads Created by Users, details the uncertainty within the industry provoked by this particular Current TV advertisement and by the big brands following Sony's lead, such as Toyota, American Express and L'Oreal.

The differences between commercial production and community produsage will be discussed further in the coming weeks; however, this introduction has outlined how commercial motivations are focused on economic gain, exposing them to threats from the growing power of widespread community produsage models.

References

Bosman, J. 2006. An agency’s worst nightmare: Ads created by users. New York Times. May 11. http://www.nytimes.com/2006/05/11/business/media/11adco.html?ex=1305000000&en=0488b91d695a5873&ei=5090&partner=rssuserland&emc=rss (accessed May 15, 2008).

Bruns, A. n.d. The Future Is User-Led: The Path towards Widespread Produsage. http://produsage.org/files/The%20Future%20Is%20User-Led%20%28PerthDAC%202007%29.pdf (accessed April 23, 2008).

Thursday, April 17, 2008

How is Web 2.0 different from Web 1.0?

Ross Mayfield, the CEO of Socialtext, simplifies the differences between Web 1.0 and Web 2.0 by stating:


"Web 1.0 was commerce. Web 2.0 is people."

Expanding on this simple definition, one can observe that the sheer proliferation of Web 2.0-based sites is the key indicator of the global shift and preference towards this platform of delivery.

The concept of Web 2.0 was first put forward by Dale Dougherty, a vice-president at the media and communications company O'Reilly, during a brainstorming session with MediaLive International (http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html).

The following image found at http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html is a graphical representation of the main concepts which underpin the Web 2.0 interface.
