About the Other Intelligence Leadership Opening

Yes, the news is abuzz with leadership turnover in the Intelligence Community. Wait – not the news about outgoing DNI Dan Coats; the important news: it’s time for applications to the prestigious AFCEA International Intelligence Committee, the premier outside body of experts and insiders, government officials and business/academic leaders, who oversee the intelligence-related activities of the 32,000+ member organization.

Private-sector applicants for four-year terms can apply at this link. (Hurry! You have only a week till the deadline, midnight August 5, 2019.) The application takes under an hour and isn’t burdensome – no college transcripts required 🙂

Where else in DC can you engage with Elon Musk, Jeff Bezos, hackathons on national security, and the leaders of the national-security community? My own tenure on the Committee has been fantastically fun and enlightening – and, I hope, helpful to AFCEA and the government agencies we advise and support. When I joined, our Committee’s chair was the IC legend Maureen “Mo” Baginski, career NSA and FBI senior leader; and I’ve been honored to serve as vice-chair to her two successors in that seat: former DIA Director VADM Jake Jacoby, and the current chair, former head of Army Intelligence LTG Bob Noonan. Leaders from the world’s biggest defense/intelligence contractors are members, as are successful startup leaders in the nat-sec space. We also benefit from emeritus members who continue to participate actively, like CIA’s legendary Charlie Allen and the world’s preeminent scholar of intelligence, Mark “Jeopardy Champion” Lowenthal.


You must have an active TS/SCI clearance, as many of our activities and conferences are classified. The application weighs heavily on your career track record. We especially encourage applicants diverse in origin, gender, race, background, skills, and outlook – to reflect the nation as a whole and the diversity of the intelligence mission itself, and to break chains of old thinking and spark new ideas on tomorrow’s most significant topics.

These are the primary precepts we keep in mind when we consider applicants during the selection process:
(1) Would the candidate further Committee efforts to build bridges between industry and the government/military? And have the contacts to do so?
(2) Would the candidate enhance AFCEA’s reputation with the Intelligence Community and the Department of Defense?
(3) Would the candidate likely be someone willing to actively engage with the Committee and help advance its goals?
(4) Would the candidate further thought leadership and innovation within AFCEA, to include involvement with other AFCEA Committee efforts?

A key element of the Committee’s success is its close ties with the intelligence and cybersecurity organizations of the Intelligence Community and the military services: the ability to reach into those organizations through current contacts, and the relationships permitting direct interface with leadership. These are key to high-content industry days, quiet advisory engagement at the government’s request, speakers for meetings, and the regular engagements that keep AFCEA Intelligence at the fore in their planning. Several Committee members who have been at the forefront of these efforts are rotating off the Committee or have done so in recent years, so we also encourage recently retired senior civilian leaders and military members to apply.

When I was elected, not long after leaving government service and going back to the tech industry, I wrote here about the Committee’s history and prominence, and that I was “honored to be elected” to this “prestigious collection of some of the smartest minds in that field.” I was tempted to respond then with William F. Buckley’s great line from his quixotic and unsuccessful mayoral campaign in New York City in 1965, when he was asked what his first act would be if elected: “Demand a recount!”

Once again, here’s that link to the application site. Good luck!


Meet the Future-Makers

Question: Why did Elon Musk just change his Twitter profile photo? I notice he now seems to be evoking James Bond or Dr. Evil:

[Image: Elon Musk’s Twitter profile photos, old vs. new]

I’m not certain, but I think I know the answer. Read on…

Debating Big Data for Intelligence

I’m always afraid of engaging in a “battle of wits” only half-armed.  So I usually choose my debate opponents judiciously.

Unfortunately, I recently had a contest thrust upon me with a superior foe: my friend Mark Lowenthal, Ph.D. from Harvard, an intelligence community graybeard (literally!) and former Assistant Director of Central Intelligence (ADCI) for Analysis and Production, Vice Chairman of the National Intelligence Council – and as if that weren’t enough, a past national Jeopardy! “Tournament of Champions” winner.

As we both sit on the AFCEA Intelligence Committee and have also collaborated on a few small projects, Mark and I have had occasion to explore one another’s biases and beliefs about the role of technology in the business of intelligence. We’ve had several voluble but collegial debates about that topic, in long-winded email threads and over grubby lunches. Now, the debate has spilled onto the pages of SIGNAL Magazine, which serves as something of a house journal for the defense and intelligence extended communities.

SIGNAL Editor Bob Ackerman suggested a “Point/Counterpoint” short debate on the topic: “Is Big Data the Way Ahead for Intelligence?” Our pieces are side-by-side in the new October issue, and are available here on the magazine’s site.

Mark did an excellent job of marshalling the skeptic’s view on Big Data, under the not-so-equivocal title “Another Overhyped Fad.” Below you will find an early draft of my own piece, an edited version of which is published under the title “A Longtime Tool of the Community”:

Visit the National Cryptologic Museum in Ft. Meade, Maryland, and you’ll see three large-machine displays, labeled HARVEST and TRACTOR, TELLMAN and RISSMAN, and the mighty Cray XMP-24. They’re credited with helping win the Cold War, from the 1950s through the end of the 1980s. In fact, they are pioneering big-data computers.

Here’s a secret: the Intelligence Community has necessarily been a pioneer in “big data” since its inception – both our modern IC and the science of big data were conceived during the decade after the Second World War. The IC and big-data science have always intertwined because of their shared goal: producing and refining information describing the world around us, for important and utilitarian purposes.

What do modern intelligence agencies run on? They are internal combustion engines burning pipelines of data, and the more fuel they burn the better their mileage. Analysts and decisionmakers are the drivers of these vast engines, but to keep them from hoofing it, we need big data.

Let’s stipulate that today’s big-data mantra is overhyped. Too many technology vendors are busily rebranding storage or analytics as “big data systems” under the gun from their marketing departments. That caricature is, rightly, derided by both IT cognoscenti and non-techie analysts.

I personally get the disdain for machines, as I had the archetypal humanities background and was once a leather-elbow-patched tweed-jacketed Kremlinologist, reading newspapers and HUMINT for my data. I stared into space a lot, pondering the Chernenko-Gorbachev transition. Yet as Silicon Valley’s information revolution transformed modern business, media, and social behavior across the globe, I learned to keep up – and so has the IC. 

Twitter may be new, but the IC is no Johnny-come-lately in big data on foreign targets.  US Government funding of computing research in the 1940s and ‘50s stretched from World War II’s radar/countermeasures battles to the elemental ELINT and SIGINT research at Stanford and MIT, leading to the U-2 and OXCART (ELINT/IMINT platforms) and the Sunnyvale roots of NRO.

In all this effort to analyze massive observational traces and electronic signatures, big data was the goal and the bounty.

War planning and peacetime collection were built on collection of ever-more-massive amounts of foreign data from technical platforms – telling the US what the Soviets could and couldn’t do, and therefore where we should and shouldn’t fly, or aim, or collect. And all along, the development of analog and then digital computers to answer those questions, from Vannevar Bush through George Bush, was fortified by massive government investment in big-data technology for military and intelligence applications.

In today’s parlance, big data typically encompasses just three linked computerized tasks: storing collected foreign data (think Amazon’s cloud), finding and retrieving relevant foreign data (Bing or Google), and analyzing connections or patterns among the relevant foreign data (powerful web-analytic tools).
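Those three tasks chain together naturally. As a minimal sketch – with entirely invented documents and entity names, standing in for any real collection system – the store/retrieve/analyze pipeline looks like this:

```python
from collections import defaultdict
from itertools import combinations

# Toy three-task "big data" pipeline. All documents and entity
# names below are invented for illustration.

# 1. Storage: collected reports, keyed by document id.
store = {
    "doc1": {"entities": {"ship_A", "port_X"}},
    "doc2": {"entities": {"ship_A", "port_Y"}},
    "doc3": {"entities": {"ship_B", "port_X"}},
}

# 2. Retrieval: an inverted index mapping each entity to the
# documents that mention it.
index = defaultdict(set)
for doc_id, doc in store.items():
    for entity in doc["entities"]:
        index[entity].add(doc_id)

# 3. Analysis: count how often pairs of entities co-occur in the
# same document -- the simplest kind of connection-finding.
cooccur = defaultdict(int)
for doc in store.values():
    for a, b in combinations(sorted(doc["entities"]), 2):
        cooccur[(a, b)] += 1

print(index["ship_A"])                 # documents mentioning ship_A
print(cooccur[("port_X", "ship_A")])   # co-occurrence count
```

Real systems swap the dictionaries for distributed storage and the loops for cluster jobs, but the three tasks are the same.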

[Image: word cloud, “Big Data for Intelligence”]

Those three Ft. Meade museum displays demonstrate how NSA and the IC pioneered those “modern” big-data tasks. Storage is represented by TELLMAN/RISSMAN, running from the 1960s throughout the Cold War using innovation from Intel. Search and retrieval were the hallmark of HARVEST/TRACTOR, built by IBM and StorageTek in the late 1950s. Repetitive what-if analytic runs boomed in 1983, when Cray delivered a supercomputer to a customer site for the first time ever.

The benefit of IC early adoption of big data wasn’t only to cryptology – although decrypting enemy secrets would be impossible without it. More broadly, computational big-data horsepower was in use constantly during the Cold War and after, producing intelligence that guided US defense policy and treaty negotiations or verification. Individual analysts formulated requirements for tasked big-data collection with the same intent as when they tasked HUMINT collection: to fill gaps in our knowledge of hidden or emerging patterns of adversary activities.

That’s the sense-making pattern that leads from data to information, to intelligence and knowledge. Humans are good at it, one by one. Murray Feshbach, a little-known Census Bureau demographic researcher, made astonishing contributions to the IC’s understanding of the crumbling Soviet economy and its sociopolitical implications by studying reams of infant-mortality statistics, and noticing patterns of missing data. Humans can provide that insight, brilliantly, but at the speed of hand-eye coordination.

Machines make a passable rote attempt, but at blistering speed, and they don’t balk at repetitive, mind-numbing data volume. Amid the data, patterns emerge. Today’s Feshbachs want an Excel spreadsheet or Hadoop table at hand, so they’re not limited to the data they can reasonably carry in their mind’s eye.
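A machine version of Feshbach’s trick – flagging where expected data has conspicuously gone missing – takes only a few lines. The regions, years, and reporting grid here are invented for illustration:

```python
# Hypothetical reporting grid: which regions reported a statistic
# (say, infant mortality) in which years. All data is invented.
expected_years = range(1975, 1981)
reports = {
    "region_A": {1975, 1976, 1977, 1978, 1979, 1980},
    "region_B": {1975, 1976, 1977},       # reporting stops after 1977
    "region_C": {1975, 1977, 1979},       # intermittent gaps
}

def missing_years(seen):
    """Years in the expected range with no report."""
    return sorted(set(expected_years) - seen)

# Flag regions whose missing years form an unbroken run up to the
# present -- the "data went dark" pattern a human analyst notices.
alerts = []
for region, seen in reports.items():
    gaps = missing_years(seen)
    if (gaps and gaps[-1] == max(expected_years)
            and gaps == list(range(gaps[0], gaps[-1] + 1))):
        alerts.append((region, gaps[0]))

print(alerts)  # → [('region_B', 1978)]
```

The intermittent gaps in region_C don’t trip the alert; only the sustained silence from region_B does – absence of data as signal.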

To cite a recent joint research paper from Microsoft Research and MIT, “Big Data is notable not because of its size, but because of its relationality to other data.  Due to efforts to mine and aggregate data, Big Data is fundamentally networked.  Its value comes from the patterns that can be derived by making connections between pieces of data, about an individual, about individuals in relation to others, about groups of people, or simply about the structure of information itself.” That reads like a subset of core requirements for IC analysis, whether social or military, tactical or strategic.

The synergy of human and machine for knowledge work is much like modern agricultural advances – why would a farmer today want to trudge behind an ox-pulled plow? There’s no zero-sum choice to be made between technology and analysts, and the relationship between CIOs and managers of analysts needs to be nurtured, not cleaved apart.

What’s the return for big-data spending? Outside the IC, I challenge humanities researchers to go a day without a search engine. The IC’s record is just as clear. ISR, targeting, and warning are better because of big data; data-enabled machine translation of foreign sources opens the world; correlation of anomalies amid large-scale financial data pinpoints otherwise unseen hands behind global events. Why, in retrospect, the Iraq WMD conclusion was a result of remarkably-small-data manipulation.

Humans will never lose their edge in analyses requiring creativity, smart hunches, and understanding of unique individuals or groups. If that’s all we need to understand the 21st century, then put down your smartphone. But as long as humans learn by observation, and by counting or categorizing those observations, I say crank the machines for all their robotic worth.

Make sure to read both sides, and feel free to argue your own perspective in a comment on the SIGNAL site.

Contributing to Intelligence Innovation

Below are two ways to contribute to innovation in government, and specifically in intelligence matters. One is for you to consider, the other is a fun new path for me.


Swap Panetta and Blair: A Modest Proposal

First, a quick story from when I was working in government.

Not long after the initial establishment of a “Director of National Intelligence,” the DNI CIO held an inaugural “DNI Information Sharing Conference” in Denver in the summer of 2006. I was asked to sit on a panel about “Innovation across the Intelligence Community,” representing the Defense Intelligence Agency and sharing the stage with two counterparts, from the CIA and NSA. Our panel chair was Mr. CJ Chapla, then the Chief Technology Officer (CTO) of the old Intelink Management Office, redubbed “Intelligence Community Enterprise Services,” an office now under the Office of the DNI (ODNI). CJ asked the three of us to describe briefly the goals and projects we were each working on, and seriatim that’s what we did, for 90 minutes or so.

When it was time for questions, the very first audience member asked: “It seems that each of you is independently working on, and paying for, very similar kinds of technology projects. It would make sense to combine or rationalize the work, so why are you continuing to do it independently?”
