Tuesday, April 26, 2016

Leaked Information and Technology Needed to Process It

The leaking of information to the public and press over the past six years – specifically in the cases revolving around Chelsea Manning and WikiLeaks, Edward Snowden and The Guardian, and the Panama Papers, leaked from the law firm Mossack Fonseca – has resulted in a number of changes to the ways in which investigative journalism functions, especially within a democratic system. At their most basic level, such leaks have served as a reminder that journalists are not part of the state or its governing bodies. For example, in Journalism After Snowden: The Future of Free Press in the Surveillance State, Alan Rusbridger, the former Editor-in-Chief of The Guardian, states that a journalist’s job is, quite simply, to disclose: “You stand aside from power in order to scrutinize it. Your job is to be fully sensitive to all the public interests raised by the story – and to publish what you judge to be significant as responsibly as you know how. Only then is informed debate possible.” But when leaks arrive as vast troves of data that may be obscure, jumbled, and difficult to decipher, how might journalists adapt in order to accommodate the clandestine operations of whistleblowers arriving on their doorsteps (or, rather, in their inboxes)?

Leaking the information to other parties to sift through, instead of revealing everything to the public at large, in an attempt to eliminate personal bias is certainly admirable. This, of course, is the well-known rationale in the case of Edward Snowden. But sifting through that data – especially something as large as the 2.6 terabytes of material comprising the Panama Papers – introduces a step of the reporting process that journalists have never really been accustomed to before. The Panama Papers present a unique situation in another way as well: the hidden agendas and secretive maneuvers of officials and leaders are being revealed by a hidden sort of agency in turn, since the source remains anonymous. The “if you have nothing to hide then there’s nothing to fear” governmental adage can easily be spun to put pressure on state-sponsored spying and monitoring in this regard: if governments have nothing to hide – national security matters aside, however tricky this qualification may be – then why covertly engage in these practices? The obvious assumption by governing powers is that they can have their secrets and the public cannot. And this is exactly the sort of scrutiny Rusbridger is calling for among his fellow journalists.

But in order to accurately and delicately report on such sensitive matters, journalistic strategies and tools must adapt to the circumstances of the digital age. Technological innovations are being explored to help journalists sift through such vast amounts of data. For example, a program like Tabula is being explored for its ability to extract data locked inside PDF documents into more manageable formats that can be organized and searched far more efficiently, essentially allowing large collections of documents to be cross-referenced during the research process without risking the disclosure of privileged data or sources. When the information is so vast – compare, for instance, the 1.7 gigabytes of data in Manning’s leak, or the estimated 60 gigabytes of Snowden’s, to the 2.6 terabytes of the Panama Papers – such strategies truly become invaluable in processing large amounts of information in order to coherently disclose it to the public.
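To make the workflow concrete: once a tool like Tabula has extracted PDF content into plain text or CSV, the downstream step of searching a trove offline (so the documents never leave the newsroom's machines) can be as simple as a local inverted index. The sketch below uses only Python's standard library; the filenames and document text are purely illustrative, and none of this reflects Tabula's actual interface.

```python
# A minimal sketch of offline searching over extracted documents.
# Assumes text has already been pulled out of the PDFs by a tool like Tabula;
# all document names and contents below are hypothetical examples.
import re
from collections import defaultdict

def build_index(documents):
    """Map each lowercase word to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            index[word].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word in the query."""
    words = re.findall(r"[a-z0-9]+", query.lower())
    if not words:
        return set()
    results = set(index.get(words[0], set()))
    for word in words[1:]:
        results &= index.get(word, set())
    return results

# Hypothetical extracted documents standing in for converted PDFs.
docs = {
    "memo-001.txt": "Offshore account opened for a shell company in Panama",
    "memo-002.txt": "Quarterly invoice for consulting services",
    "memo-003.txt": "Shell company transferred funds to an offshore trust",
}
index = build_index(docs)
print(sorted(search(index, "offshore shell")))
```

Nothing here touches the network, which is the point: a reporter can query terabytes of material without exposing what they are looking for.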

1 comment:

  1. Yes, this twist on secrecy is interesting. And the new practices that are unfolding in reporting on leaks are really still under-explored (and probably under-articulated). Nice points.
