The core issue of collaborative editing, accuracy and trust, has reached a point in the debate where research is needed to advance the practice of content use and development. Hiawatha Bray of the Boston Globe criticized Wikipedia in July, in a piece titled "One great source — if you can trust it":
For it lacks one vital feature of the traditional encyclopedia: accountability. Old-school reference books hire expert scholars to write their articles, and employ skilled editors to check and double-check their work. Wikipedia’s articles are written by anyone who fancies himself an expert…
“I think it’s exactly the right price,” said Michael Ross, senior vice president of corporate development at Encyclopaedia Britannica Inc. in Chicago. Major articles in Britannica are signed by the author; all articles are vetted by an experienced team of editors and scholars. The libraries that pay $1,500 for a set of bound volumes or the family that pays $60 a year for an Internet subscription are buying confidence as well as information. … Ross admits to reading and enjoying Wikipedia, and has even gotten ideas there for future Britannica articles. But the absence of traditional editorial controls makes Wikipedia unsuited to serious research. “How do they know it’s accurate?” Ross asks. “People can put down anything.” …
In 2002, Wikipedia was criticized on the grounds that it couldn’t scale or sustain in-depth articles. It turns out that more was put down than expected: Wikipedia has since surpassed the Britannica.
Bray raised a key issue, that of quality and reputation, and his piece highlighted Wikipedia’s ambition to publish a first print version. Coupling emergent content development with a formal editorial process is a very competitive business model for print. But if the public learns to use and trust the content that emerges in Wikipedia as an authority, it is even more disruptive.
This week Al Fasoldt, a Post-Standard columnist in Syracuse, NY, claimed Wikipedia is untrustworthy, based on an interview with a high school librarian:
“As a high school librarian, part of my job is to help my students develop critical thinking skills,” Stagnitta wrote. “One of these skills is to evaluate the authority of any information source. The Wikipedia is not an authoritative source. It even states this in their disclaimer on their Web site.”
Wikipedia, she explains, takes the idea of open source one step too far for most of us.
Mike from Techdirt takes the columnist to task for misunderstanding Wikipedia:
What’s most amusing about this fear mongering piece concerning Wikipedia is that the librarian in question claims that she uses Wikipedia as an example of an “untrustworthy” site in trying to teach students to develop critical thinking skills. If that’s true, she’s doing a dreadful job. If they really wanted critical thinking skills, shouldn’t they do more than trust this uninformed librarian, but do a little research about Wikipedia itself, how it works, and how the power of Wikipedia is the fact that it is edited — but by anyone else using Wikipedia? There’s just something that seems to freak people out about Wikipedia, when they can’t fathom the idea that “the masses” could produce something of value by simply being able to correct each other, allowing them to build something much more beneficial and much more useful than an expensive encyclopedia edited by just a few people.
Mike took the further step of contacting the reporter, and the exchange led him to ask: whom do you trust, the wiki or the reporter?
The quality of Wikipedia articles at any given moment is, at the very least, better than it was before, and it will improve over time. Mike offered a Techdirt Challenge: “I pointed to the Wikipedia page on Syracuse, NY where he apparently lives, and suggested he change something on the page, to make it provably, factually incorrect — and see how long it lasted.” Alex Halavais, for one, is taking the challenge. While the results of the challenge (update: 13/13) will provide some valuable insight, it lacks an untampered collection methodology and introduces unfair costs to the system itself.
Joi Ito rightly condemns Mr. Fasoldt’s assertion and views this issue as traditional vs. collective authority:
In fact, on very heated topics, you can see the back and forth negotiation of wordings by people with different views on a topic until, in many cases, a neutral and mutually agreeable wording is put in place and all parties are satisfied. Traditional authority is gained through a combination of talent, hard work and politics. Wikipedia and many open source projects gain their authority through the collective scrutiny of thousands of people. Although it depends a bit on the field, the question is whether something is more likely to be true coming from a source whose resume sounds authoritative or a source that has been viewed by hundreds of thousands of people (with the ability to comment) and has survived.
Shelley Powers delves into the issue of truth and authority:
The reason, according to those with more modern views, is though the authors could be considered ‘authorities’ on the topic, they don’t have the ‘truth’ because the truth, in this instance, is held by those who have new, and fresh insight into the existing material–they have reached an epiphany the others, weighed down by the mass of research material and outdated ideas, can’t hope to achieve.
According to these blessed with such insight, they have truth without authority, while the historians have authority, but can’t possibly understand the truth. Who you trust then, depends less on authority or even truth than it does on who you want to believe–literally whose interpretation rings your bell the most.
The Manifesto for the Reputation Society describes Wikipedia as building reputation for the community as a whole by helping to create a public good, one with more flexibility because reputation and other motivations substitute for direct reciprocity. As the Manifesto hints, Wikipedia is considering codification:
An item of debate within the Wikipedia community is the degree to which contributors should acquire some form of reputation, which might then be used to make their contributions to the encyclopedia harder to modify. Letting reputation of contributors emerge in a transparent manner will reward higher–quality contributions, and may provide a partial answer to coordination problems if those who make good contributions receive some proportionate ability to decide conflicts. However, the contrary point of view argues that it is the very openness of Wikipedia that made it a success. One suggestion that balances both points of view is to keep the full Wikipedia open, but to use a reputation system to highlight entries that will be periodically copied into an unmodifiable backup; more ideas can be found in the online discussion of a Wikipedia approval mechanism (WikiApproval).
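The balancing mechanism the Manifesto suggests can be sketched in a few lines of Python. To be clear, this is a hypothetical illustration of the idea, not any actual Wikipedia or WikiApproval code: the names (`Article`, `endorse`) and the `APPROVAL_THRESHOLD` value are my own assumptions.

```python
# Hypothetical sketch of the "WikiApproval" idea: the live article stays
# open to anyone, but once endorsers carrying enough collective reputation
# vouch for a version, that version is copied into an unmodifiable archive.

from dataclasses import dataclass, field

APPROVAL_THRESHOLD = 10.0  # assumed: total endorser reputation needed to archive


@dataclass
class Article:
    title: str
    text: str
    endorsements: dict = field(default_factory=dict)  # contributor -> reputation
    archive: list = field(default_factory=list)       # frozen, approved versions

    def edit(self, new_text: str) -> None:
        # Anyone may edit; a new version starts with no endorsements.
        self.text = new_text
        self.endorsements.clear()

    def endorse(self, contributor: str, reputation: float) -> None:
        self.endorsements[contributor] = reputation
        if sum(self.endorsements.values()) >= APPROVAL_THRESHOLD:
            self.archive.append(self.text)  # snapshot becomes unmodifiable
            self.endorsements.clear()


a = Article("Syracuse, NY", "Syracuse is a city in central New York.")
a.endorse("alice", 6.0)
a.endorse("bob", 5.0)  # combined reputation crosses the threshold; version archived
```

The design choice worth noting is that editing never requires reputation, only archiving does, which preserves the openness the contrary point of view defends.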
Which brings me to a lingering thought: explicitly codifying reputation introduces a cost that can constrain commons-based peer production. Wikipedia was never supposed to work; somehow it does, through good club theory and low transaction costs, and it has gained a reputation as a resource. Introducing reputation for contributors or articles is the greatest risk to the Wikipedia community. A baseline study of factual accuracy can help inform this decision, as well as educate the public on how to use and participate in this commons resource.
I’ve been quietly forming a group of journalism schools, media centers, and experts to engage in the Wemedia Project, which begins with a formal Wikipedia article fact-checking exercise and the publication of findings. The USC Annenberg Center has already announced its support, and next month we will begin the collaborative research process within a Socialtext Workspace. Without getting into defining truth, you can separate issues of fact, value, and policy. The approach is to apply a formal fact-checking process to a sample of articles to gain a baseline measure of factual accuracy and explore issues of reputation.
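The sampling-and-scoring step of such a baseline could look like the following. This is only a minimal sketch under my own assumptions, not the project's actual protocol: the function names and the per-article tallies are illustrative.

```python
# Hypothetical sketch of a baseline factual-accuracy measurement:
# draw a uniform random sample of article titles, have fact checkers
# tally checked vs. confirmed claims per article, then aggregate.

import random


def sample_articles(titles, n, seed=0):
    # A seeded uniform sample keeps the baseline reproducible and free
    # of selection bias (unlike picking one hometown page to tamper with).
    rng = random.Random(seed)
    return rng.sample(titles, n)


def accuracy_rate(results):
    # results: list of (claims_checked, claims_confirmed) per article
    checked = sum(c for c, _ in results)
    confirmed = sum(k for _, k in results)
    return confirmed / checked if checked else 0.0


titles = ["Syracuse, NY", "Encyclopaedia Britannica", "Open source", "Boston"]
sampled = sample_articles(titles, 2)
baseline = accuracy_rate([(10, 9), (8, 8)])  # two articles' fact-check tallies
```

Separating "claims checked" from "claims confirmed" is what lets the exercise stay on questions of fact rather than value or policy.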
More to come, suggestions appreciated.