In light of the preceding post, I think it's also worth mentioning Instant Commons: another great hack on the MediaWiki software that allows any MediaWiki installation to access and use any media file uploaded to the Wikimedia Commons. Instant Commons-enabled wikis cache Commons content, so a file is only downloaded once; subsequent pageviews load the locally stored copy rather than hitting the Wikimedia Foundation's servers. If this could be combined with some localized search interface ... perhaps using Grub to index Commons content ... now we're talking.
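For the curious, on a current MediaWiki installation enabling this comes down to a single line in LocalSettings.php (a sketch of the present-day setting; the exact configuration option has varied across MediaWiki versions):

```php
# In LocalSettings.php -- registers the Wikimedia Commons as a foreign
# file repository, so [[File:...]] references resolve against Commons.
# Fetched files are cached locally, so each is only downloaded once.
$wgUseInstantCommons = true;
```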
There's a test of Instant Commons available here.
Inevitably, one of the first questions that arises in conversations about authoring content with wikis is how to ensure quality. Generally the response wanders around the nature of the wiki-way and collaborative development models - i.e., the community will create quality in articles over time, vandalism will be removed and, for Wikipedia at least, Neutral Point of View (NPOV) will be adhered to by authors and editors.
The scale and increased importance of a source like Wikipedia, though, means that while this model of creating quality actually works quite well, there is little indication of it within articles themselves - beyond perhaps looking at an article's history over time and inferring from that at least a level of interest in, and support for, creating a quality resource. The Wikimedia Foundation has decided it's time to tackle this growing issue head on, and to enhance the MediaWiki platform and Wikipedia with software to assist in designating and evaluating articles for quality. Wikimedia Quality has been created as a portal to discuss current work in this area and to solicit thoughts and feedback from interested parties.
Currently two projects are investigating this area. The first is an extension to the MediaWiki software called FlaggedRevs. The goal of this revision-tagging tool is to allow a subset of editors to identify the most recent version of an article that has been checked for vandalism, or has even gone through an in-depth review process. The second, called Article Trust, is being undertaken by Luca de Alfaro, an Associate Professor of Computer Engineering at the University of California, Santa Cruz. De Alfaro's research studies patterns in Wikipedia article histories, and his team has created software that colorizes Wikipedia articles according to a trust value computed from the reputation of the authors who contributed and edited the text.
I think what's also important to note about this initiative is that the Wikipedia project is now at a point where resources and time are being applied to the future scalability of this massively significant project. Tackling the quality issue is a great place to start, and I will be following it with keen interest. Photo: http://www.flickr.com/photos/criminalintent/339344589/ (CC-By-SA)