Wiki Work of the People

If liturgy is the work of the people, I can think of no greater 21st-century collaborative work than Wikipedia, the online encyclopedia that anyone can edit. With well over 5 million articles and an average increase of 20,000 articles per month, it is the largest collection of encyclopedic knowledge ever assembled by human beings (it claims 26 million contributors, not counting anonymous contributions).

Wikipedia has its detractors, of course. Since *anyone* can edit it, how accurate could it possibly be? In its mere 14 years of existence, many studies have been undertaken to answer that question (you can find them readily enough with a Google search for "Wikipedia reliability"). They generally conclude that the encyclopedia is about as accurate as traditional printed encyclopedias. However, Wikipedia is able to correct inaccuracies quickly, while printed encyclopedias must live with their errors until the next printing. Professors, teachers, and librarians are among Wikipedia's most vociferous critics, but my own experience as a former high school English teacher suggests that these same academics are also among its most frequent users and contributors.

My interest here, however, is not so much in Wikipedia's accuracy or the virtues of using its content in research papers as in the structure and ethos that enable it to be such a monumental "work of the people." In an era when 9 of the 10 most visited websites in the world are owned by large corporations and authored by professional public relations teams, Wikipedia stands out as the only top-10 website controlled by a non-profit foundation and authored by a gigantic cadre of enthusiastic volunteers. I think the Church, especially in its worship, could stand to learn a few things from Wikipedia's approach.

We'll start with the issue of security, since it's one of our biggest cultural phobias at the moment. Traditional security revolves around controlling access in order to prevent harm. It's why we have locks on our doors and walls on our borders, why our websites are password-protected, and why only the pastor and the worship committee decide what happens in the worship service. The underlying assumption is that while most people are well-intentioned, all it takes is one bad apple, one "intruder," to mess things up. And so we lock out all but a select few, and in the process keep out a lot of good people with good contributions to make.

Wikipedia turns this notion upside down: Anyone can edit an article; there's no lock on the door. There are, however, hundreds of eyeballs inside the door. Whenever I edit a Wikipedia article, notifications go out instantly to everyone watching that article, typically the people who have contributed to it in the past, inviting them to see what I've done and verify that it is a good-faith edit and not vandalism. If it is indeed vandalism or patently false information, Wikipedia makes it easy to undo the spurious edit with a single click of the mouse. This is a subtle shift of resources: Rather than spending great amounts of energy policing the front end (doors, walls, passwords), Wikipedia uses its community to police the back end, the actual contributions. Positive contributions are encouraged, while negative contributions are quickly minimized or redirected.

But what if my edit isn't spurious? What if it's just controversial? Wikipedia policy requires that when two editors have conflicting but well-supported viewpoints, both be included in the article, allowing readers to judge for themselves. If the disagreement persists, it is taken off the main article to a back-channel (but still public) page for further discussion and public comment. Finally, if no agreement can be reached, the dispute is settled by neutral third-party arbitration.

All of these things contribute to Wikipedia's reputation as a pretty open environment. But the online encyclopedia has its share of closed aspects worth considering as well: While I can edit the content of any article, I can't edit the Wikipedia logo or mission statement. Nor can I edit the underlying software that protects the right of any user to edit articles. Wikipedia may have millions of editors, but ultimately the non-profit is governed by a 10-member board of trustees, elected and selected in various ways by its constituent communities.

So what would a Wikipedia approach look like in our liturgy, our own "work of the people"?

It might begin with the assumption (on the part of the pastor and worship committee) that everyone in a congregation has something to contribute to the worship service: most contributions will be positive, and the few that aren't can be quickly redirected with help from the community. I was once part of a congregation where the wireless microphone was passed around for prayer requests as well as for announcements. I've also participated in worship services where hymns were requested and selected on the spot by congregation members. And I have led at least one or two worship services where the sermon was replaced with a panel discussion around an issue or scripture passage. I know pastors who reserve time at the end of the sermon for questions and discussion.

This, of course, might sound frightening to some. Will there be any focus? Will the service run three hours long? What if people start arguing? Here again, Wikipedia provides guidance.