Wikipedia Attempting to Quantify Truthiness

Color-coding will reflect cabal's consensus

By Matt Baume
|  Monday, Aug 31, 2009  |  Updated 12:09 PM PDT

"Quartemane" via Flickr

A T-shirt design for the Wikipedia campaign.

Coming this fall, Wikipedia will take a highlighter to its text, drawing orange warning boxes around edits it considers untrustworthy.

It's a big, important problem, since anyone can edit an article on Wikipedia, and the site's millions of users, rightly or wrongly, view its articles as authoritative. So how does the volunteer-written online encyclopedia distinguish trust from anti-trust?

The site's backers at the San Francisco-based Wikimedia Foundation say it will use a fancy algorithm called "WikiTrust" that assumes that content from experienced writers is more likely to be accurate. Wikipedia itself defines "trust" as "a relationship of reliance," and that's what WikiTrust accomplishes, without going much further: It delineates "trustworthiness" without making any warranty of "factuality."
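The Foundation's description stops at that high level, but the gist is easy to sketch. Below is a minimal, hypothetical Python model of reputation-weighted trust; the Author and Word classes, the reputation ceiling, the update rule, and the 0.5 highlight threshold are all invented for illustration here and are not WikiTrust's actual code.

    # Hypothetical sketch of reputation-weighted text trust, loosely
    # inspired by the WikiTrust idea described above. Every name and
    # number below is an illustrative assumption.
    from dataclasses import dataclass

    @dataclass
    class Author:
        name: str
        reputation: float  # assumed to grow as the author's edits survive

    @dataclass
    class Word:
        text: str
        trust: float  # 0.0 = untrusted (shown in orange), 1.0 = trusted

    MAX_REP = 10.0  # assumed reputation ceiling, for normalization

    def initial_trust(author: Author) -> float:
        """New text inherits trust from its author's reputation."""
        return min(author.reputation / MAX_REP, 1.0)

    def review(words: list[Word], reviewer: Author, lift: float = 0.3) -> None:
        """Text a later editor leaves untouched gains trust, but never more
        than the reviewer's own normalized reputation would justify."""
        cap = min(reviewer.reputation / MAX_REP, 1.0)
        for word in words:
            if word.trust < cap:
                word.trust += lift * (cap - word.trust)

    def needs_orange_highlight(word: Word, threshold: float = 0.5) -> bool:
        """Low-trust words get the orange background."""
        return word.trust < threshold

    # Example: a newcomer's edit starts untrusted, then gains trust after
    # a veteran editor reviews the page without removing it.
    newbie = Author("newcomer", reputation=1.0)
    veteran = Author("longtime_editor", reputation=9.0)

    words = [Word(w, initial_trust(newbie)) for w in "the moon is rocky".split()]
    assert all(needs_orange_highlight(w) for w in words)

    review(words, veteran)
    review(words, veteran)
    assert not all(needs_orange_highlight(w) for w in words)

The key property the sketch preserves is that trust is capped by the reviewer's own standing: a crowd of brand-new accounts can't launder an edit into trustworthiness, because their low reputation caps how much trust they can confer.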

Wikipedia has long struggled to cast off its reputation as a "wild west" of facts. Sometimes there's too much information, sometimes it's misunderstood, and sometimes sources are eliminated altogether.

Recently, site leadership proposed a lockdown on biographical entries -- an arrangement that might've spared the site some post-Ted Kennedy vandalism.

WikiTrust also raises questions of transparency, after Wikipedia editors took the unprecedented step of repeatedly deleting factual information from the entry on a kidnapped reporter so as not to endanger him. With that Pandora's Box now opened, can we trust WikiTrust?
