Engaging students and the public at large with new technologies applied to human rights investigations holds enormous potential, one that is only starting to be tapped. This is one of the many takeaways from the panel “Amnesty International’s Digital Verification Corps” held at the International Journalism Festival on 6 April.

How Amnesty International adapted to the ‘new world’ to engage mass audiences and close collaborators

Milena Marin, Senior Innovations Campaigner at Amnesty International, kicked off the session by looking at two key challenges that Amnesty is facing these days. Firstly, the public is exposed to an overload of information, with a staggering 46,000 videos currently available on YouTube alone, as Marin reported. Secondly, and linked to the first challenge, engaging the audience is becoming more and more difficult. To tackle these challenges Amnesty has developed two kinds of projects: a first stream that takes a mass approach, trying to engage a wide audience; and a second, more targeted one, that works closely with a limited number of volunteers.

Within the former approach, Amnesty launched Amnesty Decoders, and the panel focused in particular on Decoder Darfur. Simply put, Decoder Darfur asks the audience to map destroyed villages in Darfur in a highly engaging manner: Amnesty provides a satellite map of Darfur divided into squares, covering an area roughly as big as Spain, and invites the audience to navigate the map, spotting inhabited areas (by simply clicking on squares where there are buildings). The audience is then asked to compare the state of those villages at different points in time and flag whether a village has remained untouched, or whether it was damaged or destroyed by the ongoing conflict in Darfur. The project turned out to be extremely effective: the overall time users spent mapping Darfur villages corresponded to one person working full-time for four years. This is something that Amnesty apparently could not have achieved by its own means in such a short amount of time.
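The mechanics described above, with many volunteers independently labelling the same map squares, boil down to a crowd-sourced classification task. A minimal sketch of how such labels might be aggregated by majority vote is shown below; the function name, labels, and vote threshold are illustrative assumptions, not Decoder Darfur's actual pipeline.

```python
from collections import Counter

# Illustrative sketch: each volunteer labels a map tile as "untouched",
# "damaged", or "destroyed"; labels are aggregated per tile by majority
# vote. (Hypothetical data model -- not Decoder Darfur's real system.)

def aggregate_tile_labels(votes_by_tile, min_votes=3):
    """Return a consensus label per tile, or None if too few votes yet."""
    consensus = {}
    for tile, votes in votes_by_tile.items():
        if len(votes) < min_votes:
            consensus[tile] = None  # not enough volunteers yet
            continue
        label, _count = Counter(votes).most_common(1)[0]
        consensus[tile] = label
    return consensus

votes = {
    "A1": ["destroyed", "destroyed", "damaged"],
    "A2": ["untouched", "untouched", "untouched", "damaged"],
    "B1": ["damaged"],  # only one volunteer so far
}
print(aggregate_tile_labels(votes))
# {'A1': 'destroyed', 'A2': 'untouched', 'B1': None}
```

Requiring a minimum number of votes per tile is one way such projects trade coverage for reliability: a single click is never treated as a finding.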

The Digital Verification Corps, which was the main focus of the panel, is instead a prime example of more in-depth engagement with the audience. The project was introduced briefly by Sam Dubberley of Amnesty. The Digital Verification Corps has university students search through social media to identify and verify claims of human rights abuses across the world.

In the past, as the panellists explained, finding out about human rights abuses required, in addition to research, activists possibly travelling to crisis areas and physically digging in the ground to find potential signs of mass graves or other evidence of abuse. Nowadays, plenty of human rights abuses are documented online with videos and other reports. This allows for long-distance research and immediate digging through video databases uploaded directly from crisis areas. This is exactly what groups of students at UC Berkeley have been doing for the last couple of years.

From search to verification: Check, a new tool to verify online content  

Nowadays the challenge lies less in the search than in the verification of the material available ‘out there’. On the panel, Ed Bice, CEO of Meedan, presented Check, a project the journalism tool builders launched in 2011 as a toolkit for verifying online news, for instance during elections. How does it work in practice?

Taking a mainstream example, Check allows a student, or a journalist, to pull in a tweet and assign it a task or note, e.g. saying that it is not 100% reliable and needs to be double-checked. Check then provides a checklist to use as a sort of blueprint for verification, which looks inter alia at the provenance of the investigated content (original or copied) and at the issues involved (human rights abuse, violation, etc.). By using standard checklists, Check ensures consistency, which not only helps different users in the short-term investigation, but also facilitates data aggregation and classification in the long term. Once verified, a colour status can be assigned to the post (similar to a traffic-light system) and the content [the tweet in this example] is then labelled, for instance as ‘fakeie’, and displayed or reported as such.
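The workflow just described, with an imported post, a standard checklist, and a traffic-light status, can be sketched as a small data model. The class, field names, and statuses below are hypothetical illustrations, not Check's actual data model or API.

```python
from dataclasses import dataclass, field

# Illustrative sketch of a Check-style verification workflow.
# (Hypothetical names and fields -- not Check's real implementation.)

STATUSES = {"unverified": "grey", "in_progress": "yellow",
            "verified": "green", "fake": "red"}

@dataclass
class VerificationItem:
    url: str                      # e.g. a link to the tweet under review
    notes: list = field(default_factory=list)
    checklist: dict = field(default_factory=lambda: {
        "provenance checked": False,   # original vs. copied content
        "location confirmed": False,
        "date confirmed": False,
    })
    status: str = "unverified"

    def complete(self, task):
        self.checklist[task] = True

    def set_status(self, status):
        if status not in STATUSES:
            raise ValueError(f"unknown status: {status}")
        self.status = status

item = VerificationItem("https://example.com/tweet/123")
item.notes.append("Not 100% reliable, needs double-checking")
item.complete("provenance checked")
item.set_status("in_progress")
print(item.status, STATUSES[item.status])  # in_progress yellow
```

Keeping the checklist identical across items is what makes the approach scale: any reviewer can pick up another reviewer's half-finished item, and completed items can later be aggregated and compared.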

A win-win situation to be replicated: adding value for students, investigations, and human rights  

UC Berkeley was the first university to partner with Amnesty for the Digital Verification Corps. To Alexa Koenig, Director of the Human Rights Center at UC Berkeley, the value added by the university is the variety of skills, and languages, that students of different faculties can contribute to NGOs for a set period of time (otherwise NGOs would have to resort to short-term contracts).

When the project started, Koenig expected it would attract only a handful of students: the project now involves around 60, with 18 different languages spoken among them, whose work adds up to 6,000 hours of investigations.

Moreover, the project has had the merit of bringing students from very different academic backgrounds, from computer scientists to lawyers, closer to human rights. As a next step for Berkeley, Koenig has already started cooperating with two human rights law firms to try and develop international standard procedures for investigations along the lines of Amnesty’s Digital Verification.

Dubberley confirmed that the Digital Verification Corps exceeded even Amnesty’s expectations by far, and that the organisation is open to spreading the project further. Other universities, such as those in Pretoria, Toronto, and Essex, are lining up to either join or replicate the project.

The possible uses are also expanding: Dubberley mentioned that Amnesty is currently using the Corps to check reports of the recent chemical weapons attacks in Syria. Bice joined the other panellists in concluding that these are truly win-win stories: tool developers (such as Meedan) gain usability feedback for free, NGOs see their activities progress significantly thanks to skilled volunteers, and students acquire hands-on experience of new technologies, human rights knowledge and, last but not least, human emotions.