On Feb. 17, an unknown source smuggled a “battlefield video” out of the Democratic Republic of Congo and published it across social media. The video appears to depict Congolese soldiers opening fire on a group of unarmed civilians singing in a roadway and then shooting the wounded in the head. News organizations have stated that the authenticity of the video—even the time and place it was recorded—could not be independently verified.
Congolese Communications Minister Lambert Mende initially suggested the video is a fake, calling it a “clumsy and ridiculous” montage of images compiled by anti-government expats in Belgium, Congo’s former colonizer. Meanwhile, Congolese Human Rights Minister Marie-Ange Mushobekwa said that the nation’s human rights, interior, justice, and defense ministries are working together “to try to authenticate these images.” The U.N. and Human Rights Watch are also reported to be investigating the video.
That’s good. But for these investigations to succeed, forensic scientists working on all sides must ensure that their methods are open, transparent, and reproducible.
We’ve been here before. A similar video depicting Sri Lankan government soldiers shooting bound, naked prisoners-of-war was smuggled to Europe and broadcast by the U.K.’s Channel 4 News in 2009. In that case, U.N. scientists said they believed that the video was authentic evidence of a war crime.
But Sri Lankan government scientists claimed that it was a fake produced by the Tamil Tigers to discredit the Sinhalese ruling regime. Sri Lanka has a long history of Tamil-Sinhala conflict, and many inside the country chose to believe the scientists who shared their ethnic and religious identities, instead of the U.N. researchers. Science became a tool to foment distrust, rather than a platform to support consensus and reconciliation.
Done right, science is open, reproducible, and has the potential to transcend political chasms. When our methods for reaching a particular conclusion are open to scrutiny, we have a chance to convince people who are not already primed to trust us.
Shortly after the 2009 video was leaked, I arrived in Sri Lanka as a Fulbright senior research scholar. Along with an independent digital forensic scientist, I audited the investigative reports on both sides of the dispute. We found that both the Sri Lankan and the U.N. reports contained obstacles to transparency and reproducibility. U.N. researchers used a proprietary software program to conduct unnecessarily complicated procedures when they could have reached the same results with simpler, more easily replicated methods. They also failed to run a cryptographic hash—a kind of digital fingerprint—that would have enabled subsequent researchers to verify that they had an identical copy of the video file. Sri Lankan government researchers, on the other hand, actually implied that their own experimental procedures, techniques, and results were a kind of intellectual property that required “appropriate permission” to redeploy.
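For readers unfamiliar with the technique, publishing a cryptographic hash is straightforward. The sketch below, a hypothetical illustration using Python's standard library rather than anything drawn from the U.N. or Sri Lankan reports, shows how a researcher could compute a SHA-256 digest of a video file so that others can confirm they are analyzing a bit-for-bit identical copy:

```python
import hashlib

def sha256_of_file(path, chunk_size=1 << 20):
    """Compute the SHA-256 hex digest of a file, reading it in chunks
    so even large video files never have to fit in memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()
```

Anyone with a copy of the file can rerun the same computation; if even a single bit differs, the published digest and the recomputed one will not match.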
When forensic scientists refuse to reveal details about how their experimental methods work, they erode trust in the ideal of scientific objectivity, and in the legitimacy of their results. There is already a dearth of trust surrounding forensic sciences. Just last fall, President Obama’s Council of Advisors on Science and Technology reported that even some long-practiced forensic disciplines, like bite-mark analysis and some methods for analyzing complex mixtures of DNA, are not foundationally valid.
Forensic video analysis is a relatively new field with unique challenges—chief among them, the need to protect the anonymity of sources who document atrocities on video. As with the Sri Lankan video, the initial source of the Congolese footage will likely remain unknown due to safety concerns and fear of reprisal. As a result, the scientific analysis of the footage itself must carry greater weight.
Groups like WITNESS and the Guardian Project have sought to develop technological solutions to this problem by using metadata to authenticate video evidence while protecting the identity of its source. Amnesty International’s Digital Verification Corps and Berkeley’s Human Rights Center are developing techniques to verify social media videos, both for journalistic needs and for legal evidence, by cross-referencing elements of the image itself with other known information. These are crucial developments.
The solution for the Congo video, though, is not technical but procedural. Investigations into this video will unfold amidst extended conflicts between religious, rebel, and government actors. Nor are international organizations neutral here—last spring, Mende called U.N. peacekeeping missions in the country a form of “neo-colonization.” To build trust, investigators should rely on the simplest possible forensic methods, publish a cryptographic hash, and invite other researchers to replicate their methods and reproduce their results.