Debating Big Data for Intelligence

I’m always afraid of engaging in a “battle of wits” only half-armed.  So I usually choose my debate opponents judiciously.

Unfortunately, I recently had a contest thrust upon me with a superior foe: my friend Mark Lowenthal, Ph.D. from Harvard, an intelligence community graybeard (literally!) and former Assistant Director of Central Intelligence (ADCI) for Analysis and Production, Vice Chairman of the National Intelligence Council – and as if that weren’t enough, a past national Jeopardy! “Tournament of Champions” winner.

As we both sit on the AFCEA Intelligence Committee and have also collaborated on a few small projects, Mark and I have had occasion to explore one another’s biases and beliefs about the role of technology in the business of intelligence. We’ve had several voluble but collegial debates about that topic, in long-winded email threads and over grubby lunches. Now, the debate has spilled onto the pages of SIGNAL Magazine, which serves as something of a house journal for the defense and intelligence extended communities.

SIGNAL Editor Bob Ackerman suggested a “Point/Counterpoint” short debate on the topic: “Is Big Data the Way Ahead for Intelligence?” Our pieces are side-by-side in the new October issue, and are available here on the magazine’s site.

Mark did an excellent job of marshalling the skeptic’s view on Big Data, under the not-so-equivocal title “Another Overhyped Fad.” Below you will find an early draft of my own piece, an edited version of which is published under the title “A Longtime Tool of the Community”:

Visit the National Cryptologic Museum in Ft. Meade, Maryland, and you’ll see three large-machine displays, labeled HARVEST and TRACTOR, TELLMAN and RISSMAN, and the mighty Cray XMP-24. They’re credited with helping win the Cold War, from the 1950s through the end of the 1980s. In fact, they are pioneering big-data computers.

Here’s a secret: the Intelligence Community has necessarily been a pioneer in “big data” since its inception – both our modern IC and the science of big data were conceived during the decade after the Second World War. The IC and big-data science have always intertwined because of their shared goal: producing and refining information describing the world around us, for important and utilitarian purposes.

What do modern intelligence agencies run on? They are internal combustion engines burning pipelines of data, and the more fuel they burn the better their mileage. Analysts and decisionmakers are the drivers of these vast engines, but to keep them from hoofing it, we need big data.

Let’s stipulate that today’s big-data mantra is overhyped. Too many technology vendors are busily rebranding storage or analytics as “big data systems” under the gun from their marketing departments. That caricature is, rightly, derided by both IT cognoscenti and non-techie analysts.

I personally get the disdain for machines, as I had the archetypal humanities background and was once a leather-elbow-patched tweed-jacketed Kremlinologist, reading newspapers and HUMINT for my data. I stared into space a lot, pondering the Chernenko-Gorbachev transition. Yet as Silicon Valley’s information revolution transformed modern business, media, and social behavior across the globe, I learned to keep up – and so has the IC. 

Twitter may be new, but the IC is no Johnny-come-lately in big data on foreign targets. US Government funding of computing research in the 1940s and ’50s stretched from World War II’s radar/countermeasures battles to the elemental ELINT and SIGINT research at Stanford and MIT, leading to the U-2 and OXCART (ELINT/IMINT platforms) and the Sunnyvale roots of NRO.

In all this effort to analyze massive observational traces and electronic signatures, big data was the goal and the bounty.

War planning and peacetime collection were built on collection of ever-more-massive amounts of foreign data from technical platforms – telling the US what the Soviets could and couldn’t do, and therefore where we should and shouldn’t fly, or aim, or collect. And all along, the development of analog and then digital computers to answer those questions, from Vannevar Bush through George Bush, was fortified by massive government investment in big-data technology for military and intelligence applications.

In today’s parlance, big data typically encompasses just three linked computerized tasks: storing collected foreign data (think Amazon’s cloud), finding and retrieving relevant foreign data (Bing or Google), and analyzing connections or patterns among the relevant foreign data (powerful web-analytic tools).
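To make those three tasks concrete, here is a deliberately toy Python sketch. Everything in it – the miniature “document store,” the query terms, the scoring – is invented for illustration; real systems distribute each stage across clusters, not a single process, and this is not any agency’s design.

```python
from collections import defaultdict
from itertools import combinations

# Task 1: storage -- a toy "document store" (real systems use distributed clouds).
store = {
    1: "trawler transits strait at night",
    2: "freighter transits strait carrying ore",
    3: "trawler loiters near cable at night",
}

# Task 2: search/retrieval -- an inverted index mapping each term to document IDs.
index = defaultdict(set)
for doc_id, text in store.items():
    for term in text.split():
        index[term].add(doc_id)

def search(*terms):
    """Return IDs of documents containing every query term."""
    results = [index[t] for t in terms]
    return set.intersection(*results) if results else set()

# Task 3: analysis -- count term co-occurrence to surface connections.
cooccur = defaultdict(int)
for text in store.values():
    for a, b in combinations(sorted(set(text.split())), 2):
        cooccur[(a, b)] += 1

if __name__ == "__main__":
    print(search("trawler", "night"))   # -> {1, 3}
    top = max(cooccur, key=cooccur.get)
    print(top, cooccur[top])            # a most-frequent term pair
```

Swap the toy dictionary for a petabyte store and the loops for a cluster, and you have the skeleton that vendors are busily rebranding.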

Those three Ft. Meade museum displays demonstrate how NSA and the IC pioneered those “modern” big-data tasks. Storage is represented by TELLMAN/RISSMAN, running from the 1960s throughout the Cold War using innovation from Intel. Search/retrieval was the hallmark of HARVEST/TRACTOR, built by IBM and StorageTek in the late 1950s. Repetitive what-if analytic runs boomed in 1983 when Cray delivered a supercomputer to a customer site for the first time ever.

The benefits of the IC’s early adoption of big data weren’t confined to cryptology – although decrypting enemy secrets would be impossible without it. More broadly, computational big-data horsepower was in constant use during the Cold War and after, producing intelligence that guided US defense policy and treaty negotiations or verification. Individual analysts formulated requirements for tasked big-data collection with the same intent as when they tasked HUMINT collection: to fill gaps in our knowledge of hidden or emerging patterns of adversary activities.

That’s the sense-making pattern that leads from data to information, to intelligence and knowledge. Humans are good at it, one by one. Murray Feshbach, a little-known Census Bureau demographic researcher, made astonishing contributions to the IC’s understanding of the crumbling Soviet economy and its sociopolitical implications by studying reams of infant-mortality statistics, and noticing patterns of missing data. Humans can provide that insight, brilliantly, but at the speed of hand-eye coordination.

Machines make a passable rote attempt, but at blistering speed, and they don’t balk at repetitive, mind-numbing data volume. Amid the data, patterns emerge. Today’s Feshbachs want an Excel spreadsheet or Hadoop table at hand, so they’re not limited to the data they can reasonably carry in their mind’s eye.
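As a toy illustration of that point (the figures and column names below are invented, not Feshbach’s actual data), a few lines of Python with pandas can flag where a reporting series goes silent, across far more rows than any analyst could hold in mind:

```python
import pandas as pd

# Illustrative only: invented figures standing in for published
# infant-mortality statistics; None marks an unreported value.
data = pd.DataFrame(
    {
        "region": ["A", "A", "A", "B", "B", "B"],
        "year":   [1978, 1979, 1980, 1978, 1979, 1980],
        "rate":   [24.1, 25.3, None, 22.8, None, None],
    }
)

# The Feshbach-style question: where does the reporting go silent?
gaps = (
    data.assign(missing=data["rate"].isna())
        .groupby("region")["missing"]
        .sum()
        .sort_values(ascending=False)
)
print(gaps)  # region B stopped reporting first -- a pattern worth a hunch
```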

To cite a recent joint research paper from Microsoft Research and MIT, “Big Data is notable not because of its size, but because of its relationality to other data.  Due to efforts to mine and aggregate data, Big Data is fundamentally networked.  Its value comes from the patterns that can be derived by making connections between pieces of data, about an individual, about individuals in relation to others, about groups of people, or simply about the structure of information itself.” That reads like a subset of core requirements for IC analysis, whether social or military, tactical or strategic.

The synergy of human and machine for knowledge work is much like modern agricultural advances – why would a farmer today want to trudge behind an ox-pulled plow? There’s no zero-sum choice to be made between technology and analysts, and the relationship between CIOs and managers of analysts needs to be nurtured, not cleaved apart.

What’s the return for big-data spending? Outside the IC, I challenge humanities researchers to go a day without a search engine. The IC’s record is just as clear. ISR, targeting, and warning are better because of big data; data-enabled machine translation of foreign sources opens the world; correlation of anomalies amid large-scale financial data pinpoints otherwise unseen hands behind global events. Why, in retrospect, the Iraq WMD conclusion was the result of remarkably small-data manipulation.

Humans will never lose their edge in analyses requiring creativity, smart hunches, and understanding of unique individuals or groups. If that’s all we need to understand the 21st century, then put down your smartphone. But as long as humans learn by observation, and by counting or categorizing those observations, I say crank the machines for all their robotic worth.

Make sure to read both sides, and feel free to argue your own perspective in a comment on the SIGNAL site.

Increasing Jointness and Reducing Duplication in DoD Intelligence

Today I’m publishing an important guest essay, with a brief introduction. Last month the Wall Street Journal published a 12-part online series about college graduates and their paths to success, featuring surveys and input from job recruiters. One thing caught my eye, at least when blogged by an acquaintance, Prof. Kristan Wheaton of the Mercyhurst College Institute of Intelligence Studies. The WSJ’s study included a look at recent graduates’ job satisfaction in their new careers, and as Prof. Wheaton strikingly put it in his own blogpost:

“Intelligence Analysts are Insanely Happy.”

I’m pretty sure that’s not really true by and large; Prof. Wheaton seems slightly dubious as well. Many readers of this blog are intelligence analysts themselves, so I’d love to hear from you (in comments or email) about your degree of giddiness….

We all know that the intelligence-analysis field as currently practiced in U.S. agencies bears many burdens that weigh heavily on job satisfaction and, unfortunately, on successful performance. Our youngest and our most experienced intelligence analysts have been battling those burdens.

One analyst has now put constructive thoughts on paper, most immediately in response to a call by Defense Secretary Bob Gates asking DoD military and civilian employees to submit their ideas to save money, avoid cost, reduce cycle time and increase the agility of the department (see more about the challenge here).  


Contributing to Intelligence Innovation

Below are two ways to contribute to innovation in government, and specifically in intelligence matters. One is for you to consider, the other is a fun new path for me.


Bing vs Google, the quiet semantic war

On Wednesday night I had dinner at a burger joint with four old friends; two work in the intelligence community today on top-secret programs, and two others are technologists in the private sector who have done IC work for years. The five of us share a particular interest besides good burgers: semantic technology.

Oh, we talked about mobile phones (iPhones were whipped out, as was my Windows Phone, and apps were debated) and cloud storage (they were stunned that Microsoft gives 25 gigabytes of free cloud storage with SkyDrive accounts, compared to the puny 2 gigabytes they’d been using on Dropbox).

But we kept returning to semantic web discussions, semantic approaches, semantic software. One of these guys goes back to the DAML days of DARPA fame, the guys on the government side are using semantic software operationally, and we all are firm believers in Our Glorious Semantic Future.


To fix intelligence analysis you have to decide what’s broken

“More and more, Xmas Day failure looks to be wheat v. chaff issue, not info sharing issue.” – Marc Ambinder, politics editor for The Atlantic, on Twitter last night.

Marc Ambinder, a casual friend and solid reporter, has boiled down two likely avenues of intelligence “failure” relevant to the case of Umar Farouk Abdulmutallab and his attempted Christmas Day bombing aboard Northwest Airlines Flight 253. In his telling they’re apparently binary: one is true and the other is not, at least for this case.

The two areas were originally signaled by President Obama in his remarks on Tuesday, when he discussed the preliminary findings of “a review of our terrorist watch list system … so we can find out what went wrong, fix it and prevent future attacks.”

Let’s examine these two areas of failure briefly – and what can and should be done to address them.


The Purple History of Intelink

When I first began talking with DIA CIO Mike Pflueger and Deputy CIO Mark Greer in the fall of 2003 about the work I’d be doing with them inside government, most of the ideas were big ones: let’s re-architect the DoDIIS enterprise, let’s find and deploy revolutionary new analytical software. One of our thoughts was a little one, but for me personally it turned out to be a most valuable project. They let me pull together a panel for the upcoming 2004 DoDIIS Conference called “Geeks and Geezers,” featuring some of the grand old names of intelligence technology. The panel was a success, and in organizing it, I spent quite a bit of time talking to those giants, or should I say listening to them. I learned an enormous amount about “the early days.” This post describes the important work of one of those fellows. 

Undercover Grrl Band Techno Rave

Friday I had an interesting meeting with Dawn Meyerriecks, who has just begun her new role as the Deputy Director of National Intelligence for Acquisition and Technology. (Read the DNI’s statement on her appointment here in pdf, her bio here, and some reaction – all positive – here and here.)

Never mind what we were actually talking about; she asked me in, so it isn’t appropriate to write about that. But to be honest, I spent my drive home thinking about the atmospherics and significance of her holding that post in any case. In a companion post later (“The Purple History of Intelink”) I’ll comment on the significance of her prior background in the Defense Department.

But more striking, right off the bat, is the fact that DNI Dennis Blair has an impressive number of women in high-ranking senior leadership positions. And it’s not just the number, but the particular positions they hold that I like: Dawn Meyerriecks is DDNI/A&T, Priscilla Guthrie is Assistant DNI and Chief Information Officer, and Marilyn Vacca is Assistant DNI and Chief Financial Officer. Lisa Porter leads the Intelligence Advanced Research Projects Agency (IARPA); I’ve written about her before.

A-Space Past and Future

This week marks the second anniversary of the first live internal demo of the intelligence community’s A-Space project, groundbreaking for the IC in its goal of collaborative use of social media across agency lines. Somewhere in Maryland, a remarkable government employee and friend named Mike Wertheimer should pause and quietly celebrate the fruition of his early evangelism for it.

I was still a government employee then, but wrote about the effort at the time here on Shepherd’s Pi (“A-Space: Top-secret social networking”). It makes me chuckle to remember back to those days when it was still mostly unheard-of for IC employees to blog openly on the public web about current technology projects. Now you can’t shut ’em up! 🙂

It made sense, I thought, to set down a few notes at the time, for several reasons.

Cyber Deterrence Symposium webcast

As I type this, I’m sitting in a seventh-floor conference area at George Washington University’s Elliott School of International Affairs, listening to the keynote speaker for the second of five panels today in the “Cyber Deterrence Symposium,” a joint production of INSA (the Intelligence and National Security Alliance), and the Homeland Security Policy Institute.

If you’re reading this on the day of the symposium (Monday, November 2, 2009), you can tune in to the live webcast of the speakers and panels. It is a stellar line-up; see the roster below.


Seeking Semantics in Government

Anyone who uses Twitter and has to cram thoughts into 140 characters knows that technology doesn’t always mix well with “semantic meaning.” That reminds me of an old Hollywood story (here’s a version from Wikipedia):

Cary Grant is said to have been reluctant to reveal his age to the public, having played the youthful lover for more years than would have been appropriate. One day, while he was sorting out some business with his agent, a telegram arrived from a journalist who was desperate to learn how old the actor was. It read: HOW OLD CARY GRANT?

Grant, who happened to open it himself, immediately cabled back: OLD CARY GRANT FINE. HOW YOU?

Washington Technology magazine has a long (overly long) feature today about semantic computing, entitled “Open Government Looks for New Technologies.” It has nothing to do with Cary Grant, but I have a few minor quibbles with the article (written by a freelancer from New York).

The premise is in the subhead: “Web 3.0 could help make Obama’s dream of government transparency a reality.”  The article goes on to give a basic – very basic – primer on semantic tagging and its potential application in government uses. Underline that word, “potential.”

Aside from the new Data.gov website’s use of minimal Dublin Core metadata, there’s no actual government use cited. In fact, despite the premise, the article actually contains more evidence that government agencies are actively shying away from adopting semantic approaches. A spokesperson for GSA is typical, saying only that “We are monitoring the situation as the technology matures; it is not factoring into our business requirements at this point.” And a spokesperson for the site at www.Recovery.gov, now controversial for the manner in which it was contracted out, says they are “focusing on other priorities.”
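For readers wondering what “minimal Dublin Core metadata” amounts to in practice, here is a hypothetical Python sketch; the field names follow the Dublin Core convention, but the values are invented for illustration, not copied from Data.gov, and a real catalog page would simply carry tags like these in its HTML head:

```python
# Hypothetical sketch: rendering minimal Dublin Core metadata as the
# kind of <meta> tags a dataset catalog page might carry. Values invented.
DC_FIELDS = {
    "DC.title":      "Agency Spending Dataset",
    "DC.creator":    "Example Agency",
    "DC.date":       "2009-10-01",
    "DC.format":     "text/csv",
    "DC.identifier": "http://www.data.gov/example-dataset",
}

def dublin_core_meta(fields):
    """Render Dublin Core name/value pairs as HTML <meta> elements."""
    return "\n".join(
        f'<meta name="{name}" content="{value}" />'
        for name, value in fields.items()
    )

print(dublin_core_meta(DC_FIELDS))
```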

