To fix intelligence analysis you have to decide what’s broken

“More and more, Xmas Day failure looks to be wheat v. chaff issue, not info sharing issue.” – Marc Ambinder, politics editor for The Atlantic, on Twitter last night.

Marc Ambinder, a casual friend and solid reporter, has boiled down two likely avenues of intelligence “failure” relevant to the case of Umar Farouk Abdulmutallab and his attempted Christmas Day bombing of Northwest Airlines Flight 253. In his telling, the two are apparently binary: one is true and the other is not, at least for this case.

The two areas were originally signaled by President Obama in his remarks on Tuesday, when he discussed the preliminary findings of “a review of our terrorist watch list system … so we can find out what went wrong, fix it and prevent future attacks.”

Let’s examine these two areas of failure briefly – and what can and should be done to address them.

Information-Sharing Issue

This area has received the most attention among pundits and partisan politicians in the past several days. It is redolent of the “stovepipes” language properly cited in critiques of the intelligence community’s performance before September 11, 2001… and continuing in the years since. As President Obama put it on Tuesday:

“It’s been widely reported that the father of the suspect in the Christmas incident warned U.S. officials in Africa about his son’s extremist views.  It now appears that weeks ago this information was passed to a component of our intelligence community, but was not effectively distributed so as to get the suspect’s name on a no-fly list… Had this critical information been shared it could have been compiled with other intelligence and a fuller, clearer picture of the suspect would have emerged.  The warning signs would have triggered red flags and the suspect would have never been allowed to board that plane for America… When our government has information on a known extremist and that information is not shared and acted upon as it should have been, so that this extremist boards a plane with dangerous explosives that could cost nearly 300 lives, a systemic failure has occurred.  And I consider that totally unacceptable.” – President Barack Obama, Tuesday 12/29/2009

The problem isn’t new, so no one can claim that we weren’t aware of it. The 9/11 Commission Report detailed stovepiping and information-sharing problems ad nauseam. So those responsible need to say they’ve been working on fixing it, and have just fallen short.

I’ll cite two prominent ongoing efforts to “fix” this problem:

(1) The Information Sharing Environment (ISE): Most commentary in the past few days comes from folks who don’t know that the U.S. Government responded to the last big terrorist info-sharing crisis (9/11) by establishing an actual formal program, the official Information Sharing Environment. You can read all about it at www.ise.gov, though I advise against it unless you’ve developed a tolerance for your prescription insomnia cure and need some heavy-duty help getting to sleep. Folks don’t know about the ISE because it has been a massive – but quiet – bureaucratic exercise in disappointment (some say failure). 

The ISE is supposed to be a joint endeavor of, and to support, five communities: Intelligence, Law Enforcement, Defense, Homeland Security, & Foreign Affairs. Why, those are the very ones involved in the run-up to the latest Christmas Day debacle!  And the ISE has been “supporting” them for nearly four years now! 

Back in July 2009, the ISE released its mandatory 2009 Annual Report to the Congress. I am the only person I know to have read the darn thing. That’s a shame – not because it’s great reading, but because it could have provoked wider acknowledgment of just how mired in bureaucratic morass this well-funded ISE boondoggle has become. It takes the reader 17 pages just to get to a section titled “Why Information Sharing is Important” (!), which is answered with the bromide that “the proper information, properly controlled, gets to the right people in time to counter terrorist threats to our people and institutions.” O-kay…. Strike one.

Strikes two and three are pages 44 and 45 of the Annual Report, which purport to chart the ISE’s progress on implementing and enforcing the “Common Terrorism Information Sharing Standards” program – I cannot over-emphasize that this should be the heart of the fix for the 9/11 and now Christmas Day failures. If you can read those two pages without shaking your head with a sad, mournful laugh, then you are a better, more credulous person than I. Enough said. Congress and the media need to take a hard look at the ISE and ask some pointed questions about what has gone wrong with this, our previous large-scale national attempt to fix the info-sharing failures. The ISE could be fixed, but only with a wholesale refocus on smaller-scale, shorter-term, and much, much simpler solutions.

(2) The Presidential Task Force on Controlled Unclassified Information: You may say you’ve never heard of this task force, but taste the irony: it released its work-product just ten days before the Christmas attack, and the Administration garnered quite a bit of press coverage for its two main proposals.

WASHINGTON, Dec. 15 /PRNewswire — Attorney General Eric Holder and Department of Homeland Security (DHS) Secretary Janet Napolitano today announced two major steps in their efforts to implement reforms to enhance information sharing among federal, state, local and tribal law enforcement agencies and safeguard sensitive information used by the government — designed to expand joint capabilities to protect the United States from terrorist activity, violent crime and other threats to the homeland.

The Presidential Interagency Task Force on Controlled Unclassified Information (CUI), led by Attorney General Holder and Secretary Napolitano, today released a report recommending a single, standardized framework for marking, safeguarding and disseminating sensitive but unclassified (SBU) information across the federal government. SBU information refers collectively to the various designations for documents and information that are sufficiently sensitive to warrant some level of protection but that do not meet the standards for classification.

Attorney General Holder and Secretary Napolitano also announced the creation of dual Program Management Offices (PMOs) to coordinate support for state and local Fusion Centers and the Nationwide Suspicious Activity Reporting Initiative (NSI), housed within DHS and the Department of Justice (DOJ), respectively, to work in partnership to enhance information sharing between federal, state, local and tribal agencies and the private sector. Coupled with the CUI framework, these new offices represent a significant milestone toward fully implementing information sharing reforms called for following the terrorist attacks of Sept. 11, 2001.

The first proposal, for a single SBU standards framework, has been long in the making, and its implementation appears to be in the hazy future. The second proposal, however, is one I predict will begin taking root right away – because it involves spending money to establish new offices and a new bureaucratic layer of entities, the PMOs, which will “coordinate” (that favored DC word) the work of all the state and local Fusion Centers, which were of course themselves set up to “coordinate” information and intelligence sharing across government levels and bodies.

Might that be valuable, if it worked? Sure – but what’s the timeline? According to the Wall Street Journal’s coverage of the new plan, the new PMOs will support an expansion of the “program that collects and analyzes potential terrorism tips from local police officers, so that, by 2014, all states will have the capability to analyze that data and share it with other states and the federal government.” Read that again; it will take the next five years to develop that capability.

More bureaucracy has been proposed, therefore, as the answer to a problem of too much bureaucracy. I am not optimistic on the information-sharing front given the current “reform” path.

Wheat vs. Chaff Issue

President Obama also reported Tuesday, after citing the information-sharing problem:

“There appears to be other deficiencies as well…. [T]here were bits of information available within the intelligence community that could have and should have been pieced together.  We’ve achieved much since 9/11 in terms of collecting information that relates to terrorists and potential terrorist attacks.  But it’s becoming clear that the system that has been in place for years now is not sufficiently up to date to take full advantage of the information we collect and the knowledge we have.”

I read that as a pretty clear assessment of an analytic failure. It’s a harsh assessment particularly because it comes from the president, the primary intelligence consumer and analytic customer of the United States’ $75 billion-per-year intelligence community. The president described the failure as “totally unacceptable.”

So one must turn to the search for improvement: for ways to sift the wheat from the chaff, to find insightful threads amongst chaotic data streams. That’s not easy in a world of exploding data sources and a profusion of sensors – the January issue of National Defense magazine carries a story on this topic, entitled “Military ‘Swimming In Sensors and Drowning in Data,’” and it’s not an optimistic depiction.

But I’m an optimist by nature, and a technologist by choice, so I tend to see the potential for future progress in addressing the issue. I’ve written before about moving beyond “Filter Failure” – I don’t believe there is “too much information,” but a lack of imagination and technical creativity in designing innovative ways to mine the chaff (as well as the wheat). The solutions aren’t entirely rooted in technology, but new technology certainly helps.
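
To make that concrete: the heart of any such approach is embarrassingly simple in principle. Normalize what different reporting streams call the same person, and notice when independent streams converge on him. Here is a deliberately toy sketch of that kind of cross-source trigger; the report format, names, and two-source threshold are invented for illustration and describe no actual system.

```python
# A deliberately toy sketch of a cross-source collation trigger: normalize
# what different reporting streams call the same person, then flag anyone
# mentioned independently by multiple sources. The report format, names,
# and threshold are invented for illustration only.
from collections import defaultdict

def normalize_name(name: str) -> str:
    """Crude normalization so 'DOE, John' and 'John Doe' collate together."""
    parts = name.replace(",", " ").lower().split()
    return " ".join(sorted(parts))

def collate(reports, threshold=2):
    """Group single-source reports by normalized subject and return any
    subject reported independently by `threshold` or more sources."""
    sources_by_subject = defaultdict(set)
    for report in reports:
        sources_by_subject[normalize_name(report["subject"])].add(report["source"])
    return {subject: sources for subject, sources in sources_by_subject.items()
            if len(sources) >= threshold}

if __name__ == "__main__":
    reports = [
        {"source": "embassy_walk_in",  "subject": "John Doe"},
        {"source": "sigint_intercept", "subject": "DOE, John"},
        {"source": "visa_database",    "subject": "Jane Traveler"},
    ]
    for subject, sources in collate(reports).items():
        print(f"RED FLAG: '{subject}' reported independently by {sorted(sources)}")
```

The point is not these few lines of code, of course; it is that “collation” is at bottom a normalization-and-alerting problem, the kind that commercial systems handle routinely at far larger scale.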

Because I’ve written on this topic often, I’ll just mention one encouraging effort undertaken by the intelligence community’s Intelligence Advanced Research Projects Activity (IARPA). Art Becker, whom I’ve known for quite a while, leads IARPA’s Knowledge Discovery and Dissemination (KDD) Program, and just announced this month that KDD will be accepting proposals until February 10, 2010 for a new KDD system or approach to analysis. The announcement reads as if in direct answer to the President’s identification of analytic shortcomings in the Abdulmutallab case:

Intelligence analysts must gather and analyze information from a wide variety of data sets that include: general references, news, technical journals and reports, geospatial data, entity databases, internal reports and more. The different terminologies, formats, data models, and contexts make it difficult to perform advanced analytic tasks across different data sets… The focus of the KDD program is to develop novel approaches that will enable the intelligence analyst to effectively derive actionable intelligence from multiple, large, disparate sources of information, to include newly available data sets previously unknown to the analyst.

The ability to quickly produce actionable intelligence from unanticipated, multiple, varied data sets requires research advances in two key areas: (1) alignment of data models; and (2) advanced analytic algorithms. Making advances in these two research areas, and fully characterizing the performance of the research results using real Intelligence problems, is the focus of the IARPA Knowledge Discovery and Dissemination (KDD) Program. The KDD Program requires a combination of innovative research and the capability to develop robust prototypes.

 

IARPA’s KDD work is squarely aimed at the issue President Obama and others have raised about the quality of analytic collation and insight evident in the run-up to the Christmas Day bomb attempt. I’ll revisit this topic in future posts.
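
What does “alignment of data models” mean in practice? At its simplest, something like the following sketch: two records that may describe the same person arrive in different shapes from different owners, get mapped into one common shape, and are scored for likely identity. The field names, date formats, weights, and threshold below are invented for illustration; this is a sketch of the general technique, not the KDD program’s design.

```python
# A hypothetical sketch of "alignment of data models": two differently shaped
# records are mapped into one common form, then scored for the likelihood that
# they describe the same person. Field names, formats, weights, and the
# threshold are invented; this does not depict IARPA's or anyone's design.
from datetime import datetime
from difflib import SequenceMatcher

def from_consular_record(rec: dict) -> dict:
    """Consular-style record: 'SURNAME, Given' name, US-style date."""
    last, first = [part.strip() for part in rec["name"].split(",")]
    return {"name": f"{first} {last}".lower(),
            "dob": datetime.strptime(rec["birth_date"], "%m/%d/%Y").date()}

def from_field_report(rec: dict) -> dict:
    """Field-report-style record: free-text name, ISO date."""
    return {"name": rec["subject_name"].lower(),
            "dob": datetime.strptime(rec["dob"], "%Y-%m-%d").date()}

def same_person_score(a: dict, b: dict) -> float:
    """Blend fuzzy name similarity with an exact date-of-birth match."""
    name_similarity = SequenceMatcher(None, a["name"], b["name"]).ratio()
    dob_match = 1.0 if a["dob"] == b["dob"] else 0.0
    return 0.6 * name_similarity + 0.4 * dob_match

if __name__ == "__main__":
    visa = from_consular_record({"name": "DOE, Jonathan", "birth_date": "12/19/1986"})
    tip = from_field_report({"subject_name": "Jon Doe", "dob": "1986-12-19"})
    score = same_person_score(visa, tip)
    print(f"match score: {score:.2f}")
    if score > 0.75:
        print("candidate match: route to an analyst for review")
```

Real systems must also handle transliteration variants, aliases, partial dates, and enormous scale, but the core pattern of map, match, and route-to-a-human is not exotic.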

What’s the Difference?

Info-sharing or wheat-from-chaff analytic failure – so what? Isn’t it just six of one, half-dozen of the other? Nope. We may not have to “choose” between them as explanations for the Christmas Day failure, as Marc Ambinder has done above, but there is a difference and I’ll explain, with an example.

Another media outlet waded into the fray of parceling blame between these same two vectors last night, in a curious way. The New York Times published a well-reported story online, but oddly, there was a subtle shift in the story’s emphasis – and its title – within an hour of publication. The original version’s headline: “Spy Agencies Failed to Share Clues on Terror Before Flight.” The revision: “Spy Agencies Failed to Collate Clues on Terror.” I verified this by going back through my cached html history.

The first version made the rounds heavily on Twitter as soon as it went live on the Times website. It points clearly at an information-sharing failure. The second version uses the multiple examples of information being shared from NSA to others, from CIA to NCTC, from NCTC to others, to argue for an analytic, or wheat-from-chaff, failure (“collating”).

So what happened here? Just editing? I don’t think so. The Times story is evidently based on heavy leaking from individual agencies, and counter-leaking from within ODNI. (Note to J-School students: When folks from every three-letter agency around the Beltway want to cover their butts and point fingers at another outfit, then a well-sourced story practically writes itself.)

Individual government agencies apparently want to make the case that, “Hey, we shared, but those other analysts just didn’t collate properly” – that way lies protection of your job, and your budget.

So, generically, the difference between the two vectors of failure is this: if information-sharing was at fault, then multiple agencies were at fault – a network failure, a “systemic failure” to use the president’s phrase.

If poor collation and analysis was to blame, then it’s possible, maybe even likely, that the mistakes can be portrayed as isolated within one agency, or one subset of intelligence officers. In the first case, blame lies higher in the IC hierarchy; in the second, at lower and more expendable rungs.

You don’t have to buy those distinctions – or accept them as “true” – to recognize that they’re being used bureaucratically (through the media, the Congress, and IC internal channels) to apportion blame.

That’s Washington for you 🙂


47 Responses

  1. Lewis – thanks for your continuing insightful analysis – and for pointing to relevant background material.

    Whilst the potential impact in this instance is enormous, it does strike me that there is little difference between this and the myriad of unsuccessful projects that we read about on a daily basis.

The primary issue is not about technology, it’s about people issues – about leadership and execution – about clarity of purpose, and translating that into short term actions with clear accountabilities.

The 5-year timescale you cite (re the new PMOs) ignores the fact that in this world of broadband and global connectivity, the world will have reinvented itself at least twice by then – in 2014 they will be tackling a 2009 problem, not a 2014 problem.

    To tackle this properly requires a whole new behaviour and leadership – not by people who got promoted because of their skills in the old world model, but by people who can demonstrate new world thinking.

But of course, this debate is about preventing the end result in terms of terrorism – if the President is to demonstrate a real worthiness for the Nobel Peace Prize, then he also must strive for a more fundamental diplomatic resolution of what is clearly a global conflict.


Allan – thank you, very thoughtful comments. You and I agree entirely, including on the silly (or worse than silly) notion of a 5-year timeline for this type of reform. I meant to ridicule that aspect; perhaps for once my touch was too light 🙂

    Thanks for reading – I always appreciate the governmental experience you bring to issues. Happy New Year in an insane world! -lewis


    • and to you and yours, my friend – I’ll manage to raise a glass to absent friends 5 hours before you – and it will be sincere.

Now can you get all this airport hassle sorted out before we come across the pond on vacation in January 🙂 I’m just glad that my daughter has grown out of the air-sickness that we experienced for so many years – always towards the end of the journey – can’t tell you how many times we would be up at the plane toilet to get rid of barf bags in the last hour of the flight – that would be fun now!

      Allan


  3. Lewis,

Terrific post, gives us insight most of us need but can’t otherwise stitch together. I’m convinced there is no shortage of brains or good will here (in this problem space). The ISE is important but, as you know, until recently, gelded. NIEM is important; all that good stuff. But unless there is a relentless pursuit to drive out/rinse out error, all the best intentions will only go so far. Along with these analytic and org fixes, pressing the system at every seam with constant Red Teaming is, in my view, essential. The two go hand in hand – bottom-up analytic/org fixes, tested and focused by Red Teaming. The awful news would be that we already Red Team this stuff, but I can’t believe it’s at a level or intensity that can rinse out systemic causes of error. Such x-boundary testing is clearly a DNI charter item. At least, you’d think so. My $.02….. Best, Zach


Zach, red teaming does NOT get you actionable intelligence. It only provides you with insight on possibilities. There are also many other analytic techniques that should be used today. But when all is said and done, to find specific events in the planning stages, you need focused collection, information sharing (of both collected intelligence and open source info), collaboration with global partners who may be somewhat suspect, and good analysis. Finally you need leaders in operations and policy who are willing to act on the actionable intelligence that gets provided. All that involves multiple agencies (including non-US agencies) with many points of failure.


    • Zach, all good points, and while agreeing with Kelcy I’d also endorse truly _aggressive_, authoritative red-teaming. “Intensity” is an appropriate word you suggest.

      By the way you would profit from a chat or session up in your neighborhood with Kelcy. She knows whereof she speaks. In fact, she knows whereof I and practically anyone else speaks, I always consider her reaction to something before I write it because I realize she has been there, done that, shredded the t-shirt, destroyed the photos, and survived the IG investigation 🙂


I see Kelcy’s point, and revere her wisdom – but keepin’ it simple: if you have a leak in a pipe, a plumber is going to run a compression test, see the leak, fix, and re-run it till it holds pressure. A tire guy is going to pump a flat full of air and see where it bubbles, patch, refill, see if it holds (well, sometimes!). Our systems are much more dynamic – you can’t ever stop red-teaming because…well, because threat and risk are dynamic. But the idea is the same – throw the data into the system, see if we see it, if we grab it, use it, move it, share it – whatever. Does it result in the action it should? If not, find the last “good” point in the pipe – the “common ancestor” of failure, as they say in food safety land – focus on the next failure point(s) forward in the system, fix, repeat, etc. How else can you ever be assured you’ve got the right problem and the right solution and it’s working? We’re down to my “$.01”…ZT


Thanks Lewis for pulling that all together and adding important context.

    From what I’m hearing, key data that could have caused appropriate action was placed in TIDE almost as soon as it was reported (including, reportedly, the info from Abdulmutallab’s Dad). So in part that looks like info sharing was not an issue. And, TIDE and its processes did not require the ISE to build.

Another area that might need examination is the connections between the US State Department’s visa office and the intelligence community. I believe the US Gov (probably under DoJ guidance) treats visa holders in the US as US Persons, which significantly limits what the intelligence community can do to track them. So, even when a visa holder is not yet in the US, I imagine there are big limits to what can be shared from the visa database. There must be some sharing but it is almost certainly wrapped in layers of process. A guidance memo from DoJ should be able to sort that one out right away.

The same is probably true of passenger manifests. Since passenger manifests almost certainly contain US persons info, there are probably limits to how they are shared (like maybe to TSA only?). Another memo from DoJ or POTUS could make all that available, rapidly, to the NCTC and the rest of the IC for rapid analytics. There just need to be some leadership decisions made on how to scrub the databases of info on good guys like us that we don’t want the IC collecting on.

    Cheers,
    Bob


    • Bob – the issue/intersections you highlight are squarely important… and particularly because they are reliant on DoJ action beyond normal bounds, I worry. The CUI Task Force could & should have been an easy one to knock out, frankly within a week – all the work had been done previously. Took 3 months to simply reiterate existing policy direction. Oddly enough I think we might see more action out of State Dept in the near term. That would be very welcome.
      Thanks for the comments
      – lewis


  5. Another point to consider: even if the information had been properly shared and evaluated and a decision made to place the suspect on the no-fly list, how long would that have taken? I’m afraid I already know the answer.


    • You’re worried about delays in government action? So you haven’t heard about the new “Pre-Screen Process”? It actually allows government officials to predetermine individual proclivity to crime and take action to forestall – oh wait, that’s the plot of Minority Report. 🙂 Never mind.


  6. Lewis –

    Great piece! I was the original CIA member of the original Information Sharing Council and co-authored the initial ISE Implementation Guide. I gotta say that it took waaaay longer than it needed to. This, of course, was under a previous administration, which shows that it doesn’t really matter who’s in charge.

    What I found during 30 yrs of government service was that many times (but certainly not always), the wishes of VSPMs (Very Senior Policy Makers) were treated as “other options to consider”.

Regarding the latest alleged breakdown in communications (read: info sharing failure), there are many people jumping to a lot of too-early conclusions. And let’s not forget that there is no “system” today that can ensure that any intelligence can be interpreted 100% correctly. Everything is a judgment call.


Bob – that’s got to be one of the classic lines in all government; you made me laugh out loud, and then I read it to my wife and laughed again as she did 🙂 “Other options to consider” indeed.

      No truer words ever spoken: “Everything is a judgment call.” That’s why I don’t like seeing such cavalier (and rashly, immediately partisan) use of “intelligence analysis” as a political football, blaming “systems” whether put in place this year or under Eisenhower.

      When are you stepping back in to government service??! Your country needs you again 🙂

      thanks for reading Bob –
      lewis


  7. Outstanding post, Lew.

IMHO, the ISE has “failed.” Too much of a top-down approach with nearly zero incentive for those at lower tiers to share meaningful data. How can anyone expect an activity like the ISE to succeed when it (a) has no express authority to force sharing, (b) has no fiscal controls over the direction of spending on IS initiatives, and (c) attacks the problem at a system level instead of making small improvements in areas of critical vulnerability? I have yet to see any product out of the ISE that emphasizes a need or desire to rack up strategic successes and agency/department incentives to share.


    • You’ve really got to watch that off-beat tendency you have, Brian, of dragging logic into these discussions. You will never make it as a Wise Washingtonian that way. (…that is, until I declare myself Mayor of LX2!)

      I agree with your observations. thanks for the comment – lewis


Brian – do you really think ISE has *failed* completely, or has it simply not been as successful as it *could have been*? I generally live in a black and white world, but I don’t think it has failed completely – there is plenty of gray there.

      I don’t think (a) will ever work – (b) is the only way to go for an environment where there is no other leverage.


  8. Let me understand this correctly. First, some numbers as a civilian sees it. Increasing velocity and volume of higher quality data in the past 8 years from investments in technology. Over 100K analysts in our government. 50% of analysts in the IC hired in the past 5 years. A $75B P&L across 16 or 17 agencies to guard against systemic risk out of a $3T budget.

    My numbers might be wrong. Still that last data point should amplify the concept of leverage that I speak of softly but frequently to my clients.

    We have great data now. We have enough analysts now. We have a ‘decent’ amount of sharing through duplication. But we have never had enough analysts with the knowledge of dislocations across sub-sector specialties to actually generate information for decision makers to take positions in liquid and profitable markets. The IC does not want to take the risk of hiring the right type of analyst. I understand why. Loyalty is an issue. So, I do not see these issues changing in the next 2 years.


You are as usual correct – which means two things – 1, I share your thinking and yet will keep slogging away, chipping at the illogicality and suboptimal practices; and 2, unfortunately the IC will rarely (and often only accidentally) hire brilliant young people who think the way they need them to think! Crazy when you really think about it.

      thanks Jenson – lewis


Retaining those who do sneak in must be the greatest challenge. When will the argument “people could die if the information is wrong, we need to vet it before it goes anywhere” stop trumping information sharing? There are so many counter examples – where NOT getting the information to the right people in time has caused death and near death. And there’s no counting the missed opportunities. The irony is that everyone scrambles to cover their hineys when there’s a clear failure to act (yet rarely are there any consequences), but everyone is afraid of the consequences if someone were to take a risk and act without checking everything six ways to Sunday. Sunday school was a looooong time ago, but I seem to remember there being no difference in culpability between “sins of omission” and “sins of commission”. When the legal handling instructions on the case file are thicker than the case file itself….


  9. Deciding what’s broken and predicting the future are very different studies. Finding broken parts in a system is one thing. Finding broken process IS an entirely different matter. What we have in the TSA/Fed Govt case is the latter.

The TSA/Fed government operate on the exact same basis as vulnerability scanners. They look only for what they know to look for. Those darned unknown flaws lurk in the miasma and only become active when the conditions are right. This is a problem that can’t ever be fully solved. One degree off amounts to miles; in a delicate surgical procedure, one degree off can cause death. So too with TSA procedures screening 100s of thousands of passengers a day, all of it ultimately based on a worker’s decision. Even if the procedures are refined to the point of a monkey being able to do them, discernment is necessary and still quite fallible.

    So, even with all the intelligence analysis in the world (not a wonderful thing) what will we have? Paranoia. Hyper-active government snooping. Further erosion of freedoms. More and more laws. More citizen apathy. Ultimately, the solution is within all of us. Just like it happened on the flight, if people see some jerk trying to ignite his man-panties they should step up and pound him into submission.

    Self-preservation and self-defense go together and everyone should start practicing it. It IS the only way we’re going to beat bad guys…which is basically taking responsibility for what NO government or law can ever accomplish for us.


    • Scott – I agree with your thrust, but I also have seen (and a couple of times, at both federal and local governmental levels, participated in) government that works – in the way that you would want it to work. By that I mean a government which is representative of a limited set of broad majority interests while respectful of non-corrosive minority interests, and government which is incented toward delivery of results, not longevity and individual self-perpetuation. But it’s rare, so of course the only option is to be aware and vigilant, but also to lend a hand and forcefully “guide” government in those proper directions, through means like the ballot box and also – occasionally – some old-fashioned rabble-rousing.

      By the way, what would you think of a counter-example to the TSA/Fed gov you cite, as in the Air-Traffic Control system? Run by government in large part, but properly incented and performing under just as much stress if not more than TSA – yet performing magnificently under almost all circumstances. Aside from rare labor-union issues (as in ’80s with PATCO), that system seems to work fairly well from the standpoint of performance to a high metric.


  10. Good blog.

You can add me to the list of those reading ISE documents.

    How long do you think it will take for the RFP to come out that studies why it will take 5 years to build the system ☺


  11. Lewis,

    excellent discussion.

    The ISE is the Maginot Line of the IC and I believe our enemies are counting on it. ISE is strategically a false concept because it assumes that sharing is the primary challenge and a panacea for CT. Information has to become intelligence and intelligence transformed into decision and action. Technology is a key component. IARPA and counterpart efforts at DARPA have been pursuing great ideas. There are also many in the IC who understand the power of these new capabilities. However, no one has developed a strategic concept for these capabilities. Instead, we have a Maginot mentality waiting for the next probe of our lines.

    Mike


    • Mike – thanks for commenting, and for being particularly (and habitually) forthright and blunt. It pains me to ponder the import of what you (and many of us) are saying… but it’s inescapable.

      New year, new decade, new opportunities for change and actual progress. That’s my mantra, and I’m determined to see that progress. Mike, I appreciate your friendship and will be following your continuing contributions in deconstructing the Maginot Line and building actual defenses and concrete analytic capabilities.
      thanks for reading -lewis


Interesting thought about 5 years to determine effectiveness of a program (or “develop that capability”). IRTPA was signed into law on Dec 17, 2004 as part of reform of the intelligence community recommended by the 9/11 Commission (NCTC was created from an existing element at the same time). Five years and one week later (Dec 25, 2009) came a test of whether real change can be accomplished in five years. While it could be argued that the ODNI was not operational until later in 2005, even four years should be enough to see forward progress. While we can’t afford the luxury of five years in making changes in the 21st Century, how do we break down agency rice bowls, old think and deeply hierarchical organizational bureaucracies that have been in place for 10-50 years?


    • Kelcy – very piercing comment. It made me realize: When ODNI was created, there was much talk about “Not since the 1947 national security act,” blah blah blah. I think about that decade and what was accomplished in two separate 5-year blocks. From ’41-’45, we mobilized, reorganized, and fought a global war on massive scale far beyond anything we’re doing today. From ’45-’50 we turned around and constructed an entirely new world order (oops! what a phrase), creating all-new intelligence, military (NATO etc), international (UN), economic (Bretton Woods etc) institutions at global scale – and they were SUCCESSFUL and long-lived. Oh, in the middle of that we also converted vast enemies Germany and Japan to stable democratic trading partners.

      Joe Stalin must have marveled when he watched the West rack up such vast achievements in seeming 5-Year-Plan increments… without brutal dictatorship! 🙂

      What do we boast of in this decade? I’m too ashamed to even begin the pitiable catalog.

      Here’s to a better new year and new decade.

      thanks -lewis


  13. Good thoughts by you and those who commented. Just to add a couple of points, as I listen to Tyler Drumheller on the Newshour talk about how the cable system should work.

    1. The cable system is itself part of the problem. It is just ridiculous that intelligence agencies are still using glorified telegrams to communicate information back and forth. This forces you to take extra time to share, to notice, to collate, to analyze. It creates lag and disconnects.

2. The insistence on hierarchical work processes is just killing us. No technology will help as long as our work is done sequentially, rather than in a dynamic, parallel fashion.

3. The concept of the intelligence cycle itself, of handoffs between collectors and analysts, is also just a perpetual lag machine. The attachment to these guild-like professions must end.

    4. Analysis is too attached still to prose and senior policymaker support as its primary function. Analysts who spend time putting together weak signals are still seen as not really analysts. No one needed to write a paper about the Nigerian; they just needed to make simple entries into databases.


• 1. Couldn’t agree more on cables; when I was in govt we tried to introduce a new Record-Message system for defense intell, which would have looked something like Twitter actually (but not SMS-length limited)… could be read in normal Outlook inboxes or repurposed in any other way (shown on maps, timelines, semantically enhanced, etc etc etc); a rough, purely notional sketch of what such a record might look like follows at the end of this reply. Project has gone nowhere that I know of since I left.

      2. Hierarchies R Government. Crazy. Go flat.

3. A speech by DIA director Jake Jacoby to the workforce in fall 2003 or early 2004 when I joined went into detail on how the intelligence cycle was going to be completely rewritten, from TPED to something more fluid: a river of data being used/tasked/answered/commented/exploited/thumb-sucked – all tagged at earliest ingestion, all discoverable immediately, all available for life. What happened to that? We made baby steps along the way, but from my perspective on the tech side we had incompatible differences among ourselves, as did those on the collection and analytic sides. Makes me a firm believer in benevolent dictatorships at times…

      4. That’s correct, but there’s another aspect beyond just filling in the databases (as I know you know), explained well in Mike Tanji’s “Time – the missing ingredient” response to this column over on his fine blog Haft of the Spear – see the pingback directly below this column.
      Thanks for the great additions.
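
As promised above, a purely notional sketch of what one of those “record messages” might have looked like: a short structured message, tagged at ingestion, with enough machine-readable fields that an inbox, a map view, or a timeline could each render it without re-parsing prose. The field names here are invented for illustration and do not describe the actual project.

```python
# A purely notional sketch of a "record message": short, structured, tagged at
# ingestion, and carrying enough machine-readable fields that an inbox, a map,
# or a timeline could each render it without re-parsing prose. Field names are
# invented for illustration and do not describe any actual system.
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class RecordMessage:
    author_org: str                                   # originating office
    text: str                                         # short free-text body, cable-style
    tags: List[str] = field(default_factory=list)     # topic/entity tags applied at ingestion
    lat: Optional[float] = None                       # optional geotag for map views
    lon: Optional[float] = None
    ts: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def to_json(self) -> str:
        """Serialize once; let the inbox, map, timeline, or search index render it."""
        return json.dumps(asdict(self), indent=2)

if __name__ == "__main__":
    msg = RecordMessage(
        author_org="EXAMPLE/OFFICE",
        text="Walk-in source reports subject expressing extremist views; details to follow.",
        tags=["counterterrorism", "walk-in", "subject:doe_john"],
        lat=0.0, lon=0.0,  # notional coordinates
    )
    print(msg.to_json())
```

Serialize it once and every downstream consumer (inbox, map, timeline, search index) can use it; that was the whole point.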


  14. […] is a mean cook, and he has his recipe mostly right, but he’s missing an […]


miloux’s comments above are sage. The idea of “writing for the president” is a cliche that is driving intel into the ground. The president is not going to stop an attack. TSA folks and Canadian border guards are. How do we get intel (not just reports) to them? A mashup that alerts folks is just as good as a perfectly formatted report. This is not as sexy as briefing the White House but much more important.

    The product-centric view of intelligence (each agency writing on the same topic in the name of customer “tailoring”) is the primary problem at play. More “finished intelligence,” cable traffic, or reports with agency logos is not the answer.

Intellipedia, for example, is an unstructured wiki, and “fixing” this issue is more of a structured database-alerting problem; but utilizing some of the Intelligence Community’s myriad crowdsourcing systems must be part of the solution. More eyeballs in emergent systems can help separate the wheat from the chaff.

    The idea that all emergent knowledge systems cannot do “real” or operational work is a massive mental block that must be overcome in order to move forward.

    I’m also very concerned about the Senate hearings and studies of this issue. The folks that will be doing the talking don’t have the first clue about wikis, emergent systems, alerting mechanisms, etc. I’m afraid the same dull “we’ll do better” cliches will be spun.

    If the nation wants real solutions they’ll bring in the folks that have been working information sharing issues every day and that ain’t at the top. How many Directors answer their own email or have made a single edit to any transparent knowledge system?


Can I give an AMEN BROTHER! Well put, Chris, and my New Year’s resolution will have to be to be more vocal and less PC in addressing that need.

      -Lance.


  16. Happy New Year Lewis,

Super post. Provocative as always. I, too, have read and continued to follow all the ISE program status materials. Have never been enamored of the structure, positioning and approach of ISE, for a variety of reasons. I recall specifically, when learning of the creation of the Information Sharing Executive as a “czar-ish” kind of adjunct role to the National Security Community, thinking that the authorities, influence mechanisms, resources and reach of this small group would not be sufficient to make the kinds of changes many saw as necessary following the release of the 9/11 Commission Report. I never had a sense that the small ISE staff really knew what needed doing even if they had the necessary resources and authorities. You refer to ISE as a “program” — but the truth is it was so poorly positioned that it was destined to always compete with real programs run by the Services and Agencies. As we say in the DoD, the tyranny of Title 10 pervades this issue. Until you start consolidating money and authorities and telling folks they are not allowed to build and sustain stovepipes, you won’t fix this issue.

I think for the better part of the last decade it’s been quite well understood what the National Security Community needed to do. I believe you might have been in the auditorium at a major IC conference (Intelink perhaps?) sometime in the 2004-5 timeframe, prior to the official stand-up of the ODNI structure. Bill Studeman gave us a broad and deep tour of many of the things that would have to change and pointed repeatedly to the historic opportunity being created by the establishment of ODNI. I think on many levels it’s fair to say, some five years later, that ODNI has not delivered on a large percentage of the change interventions that Studeman recommended. Some of them that are particularly relevant to the thread you started:

    1. Acquisition reform. ISE never stood a chance of driving acquisition reform for those key technologies necessary to create forcing functions for sharing, transparency and a true “speed-of-the-net” intelligence cycle. As we enter a new decade, Service and Agency CIOs, SAEs, CTOs and PMs still have all of the same in-house authorities and resources to build their own stovepiped networks and applications. We’re no closer to a unified IC-Net, on any domain. In fact, from where I sit (Navy IT world now), I think we’re getting ready to see the next major round of massive stovepipe building. There is no entity that apparently has the authority to fix the tyranny of stovepiped acquisition structures.

2. Breaking the “guild” nature of our intel cycle (IC), Policy and Operations (DoD, LE, HS, etc.) disciplines. Studeman held forth on the peculiar tyranny of formulations like “TCPED” (if you remember that great acronym!) — and how we would have to create new structures to start growing more broad-based and multi-disciplined professionals across our national security apparatus. Aggressive and bold change in this area would give us at least a fighting chance that key information would more quickly land in the hands of folks who are required to act upon it. To create a culture where collectors and analysts will be more routinely aggressive about moving important information quickly to policy and operations leaders you will have to accept some messiness and break a lot of old protocols and notions of chain-of-command and lanes-in-the-road. I’m not convinced we’ve even scratched the surface of this challenge — even in wargames or exercises where it would be relatively low-risk to experiment boldly. At least on the DoD side, where I’m quite familiar with our exercise profiles since 2001, we continue to stress-test the existing bureaucracy and rigid functional stovepipes, not experiment with entirely new, organic and transparent models of sharing, collaboration, COA formulation and execution. Don’t expect DoD to lead transformation in this area — we’re so enamored with our Napoleonic staff structures and paint-by-the-numbers approach to CONPLAN/OPLAN execution we’ll never be the ones to push a new model to the community.

    Denny Blair and Jim Clapper, I believe, have it within their collective means to make meaningful change in all of this. The problem is both of them grew up driving their own peculiar stovepipes, as did many of their senior staffers. So I think culturally they are going to be inclined to lead and manage within some pretty conservative boundary conditions. Maybe Studeman would have been a better choice for ODNI a few years back. Maybe he was offered the job?

    And do you remember those early 2004 discussions with your old boss at DIA? On the matter of a unified IC-Net and DIA’s natural community leadership role in making it happen, I recall the phrase he would dismissively invoke was “world hunger.” As long as the IC and extended National Security Community have so many CIOs thinking the technical/systems side of this is “world hunger,” we’ll get what we’ve always gotten.

    All the best and Aloha,

    Dave


    • Dave, you and I agree. And, by the way, I’ve continued those conversations with my old boss… and he has always characterized the approach as “preparing the battlefield” with an ultimate goal of having the platform to address world hunger. Couldn’t feed the world at the beginning, not even the community, so needed to fix the plumbing, get the food supply chain prepped so to speak. But the goal was as you suggest, at least quietly. If he’d only had one more year (and I would have stayed for that!) -thanks for writing – lewis


  17. Lewis,
This is a well-timed and pertinent post. My comment is that no one should consider what has happened here to be acceptable.

I am rather embarrassed to read this and to understand that, after all this time, we have several technologies that enable timely and massive dissemination of information. So I have to lean toward the failure being collation of information once it is aggregated into one location. Even there we have some great stuff, but an analyst or group of analysts may still not be able to find what they need in order to take the actions that may stop an event from happening.

Assuming that we have smart analysts (and I know we do), another part of this equation is the adoption and strategic use of the technologies that we have. This is where I see continued ignorance and delays that I think are the greatest factor in the failure that has taken place here.

    In short, I don’t think we have poor analysts. I don’t think we have poor technology for dissemination or analysis.

I think the intel elements (read: agencies) lack a sense of community and therefore cannot have a community-wide strategic plan for solving a problem this complex in a timely manner, nor a method to enforce it. That is compounded by a lack of adoption of the technologies and best practices in sharing and collaborating.

    I find it also disturbing that there is any finger pointing going on at all – let’s face it, that is simply affirming the lack of community I mentioned earlier. It should simply be, we failed, where did we fail, and how can we get it right so it doesn’t happen again? As for the groups you mentioned that should have laid out the vision for making this happen, I agree, we should look to people within the community with proven skills in those two areas (I know they exist).

Happy new year, let’s make it a good one.


  18. Happy New Year Lewis!

Fascinating decomposition, but I believe you create a false binary here. On the one hand, poor recognition/cognition/analysis of the facts associated with Passenger 19A caused the various agencies who received them separately and directly not to put much stock in them, so there was no energy expended on sharing them. Conversely, because no one IC agency received all of the available facts, active sharing could have linked the data points and thereby made their threat potential more recognizable to analysts – who then could have called for more collection to close information gaps. I believe the President understands this, as he is saying there was both a failure to share information within the IC as well as a failure to analyze the piece parts individually, i.e. this was a systemic failure.

    joemaz


    • Joe – I agree. I was taking to task a journalist’s call of it being one way and not the other, when indeed it can be both. The intelligence problem, if you will, is very much a superposition case, a la Schrodinger’s cat. And I was simply pointing out that those who force an argument for one way or the other often have a particular agenda for that choice 🙂

      thanks for reading – lewis


  19. […] Over the past few days I’ve read posts by Timothy Gowers on his polymath project and by Lewis Shepard on the most recent failure to connect the dots among available pieces of intelligence in order to […]


The reporting on this event has a big gap. We are wondering what kind of problem it is–sharing v. chaff–without comparing this incident to successful implementations of the system. Surely there have been cases from the past eight years in which we have received information about possible extremists who, as a result, were subsequently denied boarding on a US-bound plane. Whether that stopped an attack, we do not know. But we do know that we prevented a false negative. That is what we wanted to happen in this case.

    What made those successes operationally different from this case? Where did this one go wrong? Therein lies the blame for this mishap. Nobody is asking this question, however, because a) it’s hard to access that information and b) it’s more of a question that would occur to someone with first-hand experience of these cases, someone like (as Ras mentions above) border guards and TSA agents. Even though they are more familiar with the problem than anyone else, they aren’t typical sources for journalists.

Nor are they typical sources internally when it comes to solving these problems, and that’s why this particular problem is going to persist, despite IARPA’s KDD project. The people who are contracted to solve this problem will NEVER understand the problem until they have been subjected to it first-hand. As long as their understanding is based on a distilled version of what happens on the ground–whether that be in the form of an article or an RFP–they will be solving the problem as they understand it (e.g., as a sharing problem or a noise problem), not the problem as it really is.

    Also, ++ for miloux’s comment above.


Matt – you’re right on all counts, and yet the most important thing you point out is “Nor are they typical sources internally when it comes to solving these problems”… That’s the killer. It’s something I tried, with frustrating half-successes, to address myself when I had a government position to “solve problems” – i.e. talk to and LISTEN to those actually working the problems. And even though I have a political science, international relations background with years in those fields, and a year as an “analyst” in the Pentagon back in the Cold War, even with that in my past I was not equipped – exactly as you suggest – to incorporate their needs well enough, or to design in enough flexibility to anticipate their future needs. It’s a tough one, and of course that’s why user-created mashups are attractive, but the capabilities they can provide are still relatively rudimentary.

      thanks for the thoughtful comment Matt, see you soon – lewis


  21. So every comment to this post is exclusively from the perspective of the agenda of the writer. Tech this. Living intel that. Analyst hiring or analysis outsourcing here. Contractor @ a FedGov workplace that’s not working there.

    I even threw in a couple of good phrases from the hedge fund world just to make my point.

    Did I say 2 years? I’m short this industry 5 years now.


    • So you’re saying that the comments run true to form for the problem at hand – they’re stovepiped! 🙂

      On timelines, I’ll refer you to Dave McDonald’s comment above; he mentions several instances “back in the day” when there was no ODNI, or when it was just being stood up, and he’s right – there has been very, very little progress.


It’s evident that a lot of thought went into this post and the comments, but the proposed solutions seem to lack 1) a scalable way to connect IC silos, and 2) an additional layer of analysis beyond what exists in the formal IC organization–an ad-hoc, cross-agency meta-analysis capability (Chris Rassmussen mentioned mashups–this hints at this capability, but is the tech part, not the org part of the solution). An immediate thought org-wise is to borrow headcount from the formal organization, and add headcount with big-picture skills in a cross-agency, ad-hoc analysis capability that floats above the silos.

    I searched on “meta” in this post and came up empty–no mention of metadata, no mention of meta-analysis. I also searched the linked ISE report and came up with only one mention of metadata, on p. 25. It’s clear that the people who compiled the ISE report do not have a semantics orientation. And yet semantics is a root problem.

    The ISE report does discuss the Federal Enterprise Architecture (see pp. 26FF), but does not mention data architecture. Until the IC fixes the data architecture problem, all the issues higher in the stack will remain unsolved.

    By the way, it’s easy to tar the whole IC with the same brush, but there are some very savvy people working on data, semantics AND enterprise architecture in the IC and at the DoD. If you have the patience to read the interview here (http://www.pwc.com/en_US/us/technology-forecast/winter2010/interview-morrison-parker.jhtml) and scan our Spring 2009 issue here (http://www.pwc.com/us/en/technology-forecast/spring2009/index.jhtml), you may find these worthwhile for fundamental understanding.


    • Hey, Alan – I see from LinkedIn you went to Santa Clara, one of my favorite universities (I used to live in San Jose).

      Thanks for the thoughtful comment. Indeed I am an enormous fan of good metadata and semantic computing and related technologies. I’ve written about the space periodically, for example these recent ones
      http://bit.ly/4SHDHE (“Seeking Semantics in Government”)
      and
      http://bit.ly/70Eew0 (“Keeping Up with the Semantic Space”)

      And you know what, you’re very insightful… I actually edited out two entire paragraphs about “metadata” – focusing on the salad days of the IC Metadata Working Group, which was chaired by Tim West who worked for me at DIA. They were the center of an enormous amount of activity in the IC, and later in collaboration with DoD, on metadata standards, entity extraction, semantic software, OWL/RDF etc…. The IC MWG funded early work with MetaMatrix, mentioned in the links you provide, and I recall well our meetings with that company and many others in the semantic and linked-data space while I was in government.

The second of the two paragraphs I edited out spelled out why that is “past tense” and the IC MWG is no more – it has some “sort-of” successor panels, though by no means with the degree of centralized and collaborative work across the community. There’s excellent, pathbreaking work going on in individual IC agencies… but just like before I arrived, the projects are disconnected from one another, and in some cases duplicative, with a waste of government money and a lack of the combinatorial value of shared work.

      I edited those out because … the post was already too long! Just like this reply to your comment 🙂 Thanks for reading -lewis


