
Type any topic into a search engine. A Wikipedia entry is likely to show up in the results, written by people with an interest in the topic. Wikipedia is the most ubiquitous example of “peer production” websites: sites that depend on a community of unpaid content contributors.
Built through the mass aggregation of small contributions, such sites have tremendous potential. But they face daunting challenges as they become more successful.
UW professor Benjamin Mako Hill studies peer production sites, looking at why some succeed while most fail, and how even successful sites struggle to maintain one of their central tenets — being open to new contributors. Hill is a professor of communication in the College of Arts & Sciences.
With grant support from the National Science Foundation, Hill’s research team has examined the lifecycle of peer-production sites, from various Wikipedia editions (there are Wikipedia editions in more than 300 languages) to fandom sites to sites dedicated to rock climbing, open-source software, and many other interests. They specifically looked at the lifecycle of content contributions, the lifeblood of any peer-production site. And what they found was puzzling.
A Pattern for Peer Production
Despite their varied sizes and topics, successful peer-production sites tend to have surprisingly similar lifecycles. They begin with a steep rise in content, and then — after five to seven years — experience a gradual decline in new content even as viewership continues to rise.
One might hypothesize that as sites become robust, potential contributors find the easy tasks have already been handled, leading to the decline. Hill has dismissed that theory. In fact, he says, attempts to submit or edit content continue to increase. What changes is the difficulty of the submission process.
“These peer production communities start with the goal of getting people in the door, getting them to contribute content, and their doors are wide open,” Hill says. “Most of these communities never achieve that goal in a meaningful way, but on the off chance that they are successful, over time more people want to contribute to the site. That’s great, more facts are coming in, but now there’s also a problem, because some people want to screw with the content in various ways. They want to mess things up.”
Hill offers a few examples. There are contributors who simply find it funny to mess with published content — the web version of adding devil’s horns to a poster at a bus stop. Others are more manipulative, trying to scrub any negative information about themselves or their company from a site. Even retailers have found clever ways to game the system. With Wikipedia at the top of most search engine results, The North Face, an outdoor recreation products company, replaced existing photos of recreation destinations on the site with similar photos of those destinations that included someone wearing North Face clothing. The retailer’s PR firm even made a video extolling the swap as a clever marketing tactic.
Political movements have also attempted to infiltrate wiki sites. The most disturbing example was the takeover of the Croatian Wikipedia by a group of far-right nationalists. “They completely took control of it, had all the administrative positions, and banned people who disagreed with them,” Hill says. “It took the Wikipedia community ten years to fix that situation.”
Most bad-faith contributors are less extreme, but they can still damage a website’s reputation for reliability and accuracy — a reputation that likely took years to build. So it’s understandable that Wikipedia and other successful sites would start making it harder to contribute.
“Raising the barrier to contribution is a legitimate and rational way of keeping out the bad stuff,” says Hill. “But the effect is that the communities become closed. The thing that caused these communities to become so effective in the first place — their openness — becomes threatened.”
Losing the Good with the Bad
Many peer-production sites use filters to block unwanted content. Some sites require would-be contributors to create an account and have it approved before they can submit content. But these approaches have consequences.
“Changes that are really effective at blocking the bad stuff also block the good stuff,” says Hill. “And it’s hard to see the good things that are not happening. The relative invisibility of all the people who are not showing up is a major challenge.”
Hill recalls talking with a project manager of a fandom site who was proud of changes the site had made to address unwanted content. A subsequent analysis by Hill’s team found that while the changes blocked more than 70 percent of bad content, they also caused a 20 to 40 percent decrease in good content, losses that were much harder to see. No one in the community had been aware of that.
When Hill started this research, he was skeptical of any attempts to limit content contributions. He saw that as antithetical to the tenets of peer production and likely to hurt projects. He thinks about it differently now.

“I’m not arguing for complete openness,” he says. “My position is that there’s a set of tradeoffs we need to make, so how do we make the smartest tradeoffs? How do we close off the bad stuff while minimizing the impact on our ability to grow and sustain these really valuable information goods?”
On his research team’s peer-produced site, Community Data Science Collective, Hill has been developing more nuanced filters to avoid rejecting good content along with the voluminous bad content that floods the site. He also sees potential for machine learning systems to detect obvious vandalism and automatically undo it. He continues to study approaches used in a range of communities, to learn from those that most effectively balance openness with protection of the site’s value and reputation.
Hill is also noticing new challenges on the horizon, most notably the trend of AI content being listed first in web search results, ahead of Wikipedia — a particularly galling development given that generative AI is built using sites like Wikipedia that provide freely available content.
“What this has meant is that for the first time in history, Wikipedia is beginning to see decreased viewership,” Hill says. “In some ways, these AI companies are just kicking the legs out from underneath the table they’re eating off of. Everyone is hoping we can come up with a solution, but I think this is a big challenge — and another area of research to be explored.”