Wiki Work of the People


If liturgy is the work of the people, I can think of no greater 21st-century collaborative work than Wikipedia, the online encyclopedia that anyone can edit. At well over 5 million articles, and growing by an average of 20,000 articles per month, it is the largest collection of encyclopedic knowledge ever assembled by human beings (it claims 26 million contributors, not counting anonymous contributions).

Wikipedia has its detractors, of course. Since *anyone* can edit it, how accurate could it possibly be? In its mere 14 years of existence, many studies have been undertaken to answer that question (you can find them readily enough with a Google search for "Wikipedia reliability"). They generally conclude that the encyclopedia is about as accurate as traditional printed encyclopedias. Wikipedia, however, can correct inaccuracies quickly, while printed encyclopedias must live with their errors until the next printing. Professors, teachers, and librarians are among the most vociferous critics of Wikipedia, but my own experience as a former high school English teacher suggests that these same academics are also among its most frequent users and contributors.

My interest here, however, is not so much with Wikipedia's accuracy or the virtues of citing it in research papers; rather, it is with the structure and ethos that enable Wikipedia to be such a monumental "work of the people." In an era when 9 of the 10 most visited websites in the world are owned by large corporations and authored by professional public relations teams, Wikipedia stands out as the only top-10 website controlled by a non-profit foundation and authored by a gigantic cadre of enthusiastic volunteers. I think the Church, especially in its worship, could stand to learn a few things from Wikipedia's approach.

We'll start with the issue of security, since it's one of our biggest cultural phobias at the moment. Traditional security revolves around controlling access in order to prevent harm. It's why we put locks on our doors and walls on our borders, why our websites are password-protected, and why only the pastor and the worship committee decide what happens in the worship service. The underlying assumption is that while most people are well-intentioned, all it takes is one bad apple, one "intruder," to mess things up.

Wikipedia turns this notion upside down: anyone can edit an article; there's no lock on the door. There are, however, hundreds of eyeballs on the door. Whenever I edit a Wikipedia article, notifications go out to the users who watch that page, most of them past contributors, inviting them to see what I've done and verify that it is a good-faith edit and not simple vandalism. If it is indeed vandalism or patently false information, Wikipedia makes it easy to undo the spurious edit with a single click.
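The eyeballs themselves are open to inspection. For the technically curious, here is a minimal sketch using Wikipedia's public MediaWiki API (the endpoint and parameters are the real API's; the article title is just a hypothetical example) that lists the most recent edits to an article, showing who changed it, when, and why:

```python
import requests

# Wikipedia's public MediaWiki Action API endpoint.
API = "https://en.wikipedia.org/w/api.php"

params = {
    "action": "query",
    "prop": "revisions",          # ask for an article's edit history
    "titles": "Liturgy",          # hypothetical example article
    "rvprop": "timestamp|user|comment",
    "rvlimit": 5,                 # the five most recent revisions
    "format": "json",
    "formatversion": "2",
}

data = requests.get(API, params=params).json()

# Each revision records who made the edit, when, and the edit summary.
for rev in data["query"]["pages"][0]["revisions"]:
    print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```

The same transparency applies to the undo mechanism: a revert is itself just another revision in this public log, visible to everyone watching the page.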

But what if my edit isn't spurious? What if it's just controversial? Wikipedia policy requires that when two editors have conflicting but well-supported viewpoints, both are included in the article, allowing readers to judge for themselves. If the disagreement persists, it is taken off the main article to a back-channel (but still public) page for further discussion and comment. Finally, if no agreement can be reached, the dispute is settled by neutral third-party arbitration.

All of these things contribute to Wikipedia's reputation as a pretty open environment. But the online encyclopedia has its share of closed aspects worth considering as well: while I can edit the content of any article, I can't edit the Wikipedia logo or mission statement. Nor can I edit the underlying software that protects the right of any user to edit articles. Wikipedia may have millions of editors, but ultimately the non-profit is governed by a 10-member board of trustees, elected and selected in various ways by its constituent communities.