Wikimedia Launches New Program To Fight Online Harassment

Did you know that 40% of internet users, and as many as 70% of younger users, have personally experienced harassment online? More than half of the people who reported experiencing harassment also reported decreasing their participation in the Wikimedia community. That matters – Wikimedia's the real deal, and we need as many voices at the table as possible.

Online harassment's a big problem that needs to be addressed. To ensure Wikipedia’s vitality, people of good will need to work together to prevent trolling, harassment, and cyber-bullying from interfering with the common good.

To that end, I'm supporting the work of the Wikimedia Foundation towards the prevention of harassment by donating $500,000, in part from the Craig Newmark Foundation, to advocate for a healthier and more inclusive Wikimedia community.

The money donated will support the launch of a community health initiative to address harassment and toxic behavior on Wikipedia through the development of tools for volunteer editors and staff to reduce harassment and block harassers.

Volunteer editors on Wikipedia are often the first line of response for finding and addressing harassment. "Trolling," "doxxing," and other harmful behaviors are burdens to Wikipedia's contributors, impeding their ability to do the writing and editing that makes Wikipedia so comprehensive.

The goal is to fund the initial phases of a program to strengthen existing tools and develop additional tools to more quickly identify potentially harassing behavior, and help volunteer admins evaluate harassment reports and respond quickly and effectively. These improvements will be made in close collaboration with the Wikimedia community to evaluate, test, and give feedback on the tools as they're developed.

This initiative addresses the major forms of harassment reported in the Wikimedia Foundation's 2015 Harassment Survey, which covered a wide range of behaviors, including:

– content vandalism,

– stalking,

– name-calling,

– trolling,

– doxxing,

– discrimination,

…and really, anything that targets individuals for unfair and harmful attention.

From research and community feedback, four areas have been identified where new tools could be beneficial in addressing and responding to harassment:

– Detection and prevention – making it easier and faster for editors to identify and flag harassing behavior.

– Reporting – giving people who experience or witness harassment clearer, more streamlined ways to report incidents.

– Evaluating – supporting tools that help volunteers better evaluate harassing behavior and inform the best way to respond.

– Blocking – making it more difficult for someone who is blocked from the site to return.

This is really serious, and you can read more on the initiative here. How have you contributed to Wikipedia? And, will you be more inclined now that there will be more measures to prevent and address online harassment?



Charles Sanchez

Craig this is excellent news! Thank you so much for doing a very human, very sincere thing: sitting down to work on an actual solution to something, not just "throwing money at it." This is a real issue that is arguably not well represented by example in the current political landscape, setting exactly the wrong tone for our youth and the values and culture we try to espouse as a democracy and country.

This is truly awesome. Thank you!
Chuck S.

Sarah Hope

Online harassment is becoming a great concern, and I'd say this is of the utmost urgency. Organizations such as Wikipedia can perhaps help pave a path on a very real concern that, quite personally, has left me without many words to say. People need a safe haven. I cannot even begin to explain my journey, but I am on risk mitigation control and had no idea it would lead to this. I need support, and there is absolutely nowhere to go.


This is truly great news. More effort is always better than less. I'm concerned that using the volunteer editors for anti-harassment efforts will not be the most productive strategy, however.

The VEs aren't typically the ones harassing people, but they are likely to be a mostly monolithic group that does not necessarily experience, or even see, the kinds of harassment that others face. If they're the ones evaluating harassment reports, it will likely yield the result we see in many corporations: lots of ink is dedicated to programs, but harassment is still judged by the standards of those who experience the least of it (and its least severe forms), and is too often dismissed or dealt with so inadequately that the harassment continues.

Here is another time I hope I'm wrong. I'll be happy to be proved wrong as the new tools roll out and we can see hard data showing how effective they are.

Ellen Spertus

Thanks, Craig. This makes me more likely to encourage my (mostly female) students to contribute to Wikipedia.
