An imam dies in a U.S. drone strike days after preaching against Al-Qaeda. American missiles wipe out a carload of shoppers, including a mother and her 10-year-old daughter. A wedding is ruined when a drone attack kills a dozen guests.
These are some of the most well-documented cases of civilian deaths from President Barack Obama’s secret war in Yemen. In each instance, multiple witnesses with broadly consistent accounts have told several independent parties what they saw. When media attention to one of these stories reaches a certain pitch — as the December 2013 wedding convoy strike recently did — the administration typically issues a spare, unattributed denial without further explanation or any contradictory evidence.
Why does a supposedly evidence-based administration keep making such deadly mistakes? The short answer, it appears, is that an overweening faith in surveillance and Big Data has come to exercise a dangerously strong grip on this administration.
Evidence-based power
Obama’s presidency, we are repeatedly told, is evidence-based. Pundits generally credit computerized political targeting with winning him the 2012 election. The apparent neutrality and objectivity of data have obvious appeal to anyone scarred from the George W. Bush years of faulty intelligence and ill-planned wars. As one Washington Post commentator remarked, “In some ways, this faith in data over ideology defines what it means to be part of Team Obama.”
Is it so hard to imagine, then, that having attained the presidency by outcomputing his competitors and subsequently inheriting the world’s largest computerized surveillance and intelligence apparatus, Obama would reason that he could outcompute shadowy terrorist networks?
The pieces explaining how this works have only started to come together. Until recently, the Obama administration’s two most controversial national security policies — blanket surveillance and so-called targeted killing — tended to be treated separately. But one of the underlying insights of the reporting from the leaks by National Security Agency whistle-blower Edward Snowden is that these are two sides of the same coin.
The inaugural report from the Intercept — a site from the journalism startup First Look, launched partly by the journalists who broke the Snowden story — starts to make this link explicit. It contains a telling insight from a former Joint Special Operations Command drone operator about the dominance of magical thinking about data in the drone program. The NSA “will develop a pattern,” he says, “where they understand that this is what this person’s voice sounds like, this is who his friends are, this is who his commander is, this is who his subordinates are. And they put them into a matrix. But it’s not always correct. There’s a lot of human error in that.”
The drone, then, is merely the sharp end of a system of unprecedented scope and power that the Obama administration uses to gather and assess target information. This is the method we now face: faith in data has seeped well beyond polling or health care to control judgments about who lives and dies. Given the tragic mistakes that Reprieve and others have exposed, something about this system is evidently broken.
Lost in translation
While details on the administration’s target-selection program remain patchy, the political imperatives that make it tick are clear enough. The same reasons the Obama administration prefers drone strikes in undeclared war zones to controversial special-forces operations — or to the slow and uncertain progress of aid or police work — also lead it to rely very heavily on signals intelligence, or sigint.
Sigint must appear to the administration to be national security on the cheap — if not financially, then in every other sense. The surveillance web’s perceived human and political cost (until the system was exposed) was minimal. Placing agents in terrorist organizations takes time and is fraught with peril — as the devastating attack at the Khost CIA station in 2009 would have taught the administration. (At Khost, a man the CIA believed to be their agent blew himself up at a meeting with his handlers, causing the greatest loss of CIA lives in over 25 years.) One can imagine Obama musing in the wake of such an attack, “Why not just listen to the terrorists when they speak among themselves?”
Well, for one, humans tend to misinterpret data. The former drone operator presses this point in the Intercept: “There is a saying at the NSA, ‘Sigint never lies.’ It may be true that sigint never lies, but it’s subject to human error.”
For a host of reasons, a glossy computer report often masks a very different reality. Anyone who has ever opened a disappointing online purchase — or gone on a disappointing online date — has experienced this in a trivial way. Nor is this problem limited to sigint. Hundreds of hours that I’ve spent poring over human intelligence (humint) reporting in litigation involving Guantanamo detainees taught me the disturbing tendency of computerized intelligence dissemination to mask bias and error — and for mistakes, once recorded, to spread like viruses through any intelligence database. Former intelligence professionals have underscored this problem, saying that it grew worse after 9/11. A piece of misinformation would be excerpted again and again in reports, often stripped of crucial context that would paint the information in an entirely different light.
Meanwhile, the United States’ humint networks in, say, Yemen and Somalia are thin; we apparently have Saudi and U.K. spy work to thank for the seizure of the 2012 underwear-bomb prototype, for example. Misreading of activity-based intelligence is the likeliest explanation for most civilian drone deaths. Again, this is not to say other forms of intelligence are faultless. What happened at Khost is a gruesome testament to that, and what’s more, profit-seeking sources have offered false humint that sent many prisoners I represent to Guantanamo Bay. Unreliable sources in the Yemeni security establishment may well have caused innocent drone deaths too.
In the drone war, the seductive pull of Big Data has harmed the president’s foreign policy objectives and is likely to tarnish his legacy.
But this sigint system seems to be different. Its structure operates differently on the human mind. While watching repeated scenes of carnage has apparently caused post-traumatic stress disorder in a number of drone operators, analyzing sigint is comparatively sterile. For the drone analyst tasked with selecting targets, the sheer scale and neutral format of surveillance data will reassure her. She sits at a computer; she runs her queries; the computer spits out connections. Any bias is hidden. She makes her decision.
This process will tend not to provoke critical questioning as a human source might. At such moments, Big Data functions as the equivalent of the white lab coat and the expert’s stern tone in the Milgram experiment, in which subjects followed instructions from authority figures to electrically shock people they could hear but could not see. It enables amorality and airbrushes error.
Nor, it seems, does the Obama administration’s overreliance on data end with missile strikes. Responding to queries from a CBS journalist about this December’s attack on a wedding convoy, National Security Council spokeswoman Caitlin Hayden pledged that the White House would investigate.
Yet several weeks on, the administration has spoken to no witnesses. We sent our field investigator to the community, and he took witness statements and encouraged people from the convoy to come forward. We have offered to connect U.S. officials to witnesses, but to date, no one has taken up the offer.
Faith in sigint
The NSC claims it investigates all reports of civilian casualties. So what, in a White House where Big Data rules, does an investigation even mean? That the administration has failed in any case of civilian deaths to talk to eyewitnesses or to collect witness evidence (or to request it from human rights organizations) suggests that its investigations are cursory at best.
In most cases, the tacit message from the White House is that they will review the data and decide for themselves what happened. This mentality manifests itself clearly in the latest unattributed responses about the wedding convoy strike, as reported by the Associated Press: “Lt. Gen. Joseph Votel, commander of Joint Special Operations Command, ordered an independent investigation by an Air Force general, and the White House requested another by the National Counterterrorism Center. Both concluded no civilians were killed. Votel’s staff also showed lawmakers video of the operation. Two U.S. officials who watched the video and were briefed on the investigations said it showed three trucks in the convoy were hit, all carrying armed men.”
Consider the logic: The video shows armed men in a car (in a highly armed society); therefore, there were no civilian casualties.
And what happens after investigations conclude? A tweak in the algorithms? To date, there has been no acknowledgement or public compensation for any innocents killed. Nor, it seems, is there any reassessment of using computerized targeting to make life-or-death decisions over people in societies the U.S. only dimly understands.
Figures suggest that as a counterinsurgency strategy, the drone policy is failing. Yemen researchers estimate that at the time the drone war began in Yemen, Al-Qaeda in the Arabian Peninsula counted some 300 men in its ranks. They reckon the number today is over 1,000.
The Obama team has become famous for using computers to defeat its opponents. But in the drone war, the seductive pull of Big Data has harmed the president’s foreign policy objectives and is likely to tarnish his legacy. Day and night, Obama’s intelligence staffers scour surveillance data. In the end, they might learn more by looking at their dead.
Cori Crider is a lawyer and the strategic director of Reprieve’s Abuses in Counter-Terrorism Team.
The views expressed in this article are the author's own and do not necessarily reflect Al Jazeera America's editorial policy.