Migrating Open Source Intelligence Insights Into Participatory Journalism


I have argued before that the field of Intelligence Analysis can provide many insights for how journalism could do a better job at discovering, discriminating, distilling, and disseminating knowledge.

It seems as though Open Source Intelligence advocate and OSS.net founder Robert David Steele has also been suggesting that these analytical insights be migrated into the public domain:

Because the policymaker is inundated with contradictory information lacking methodical evaluation, a critical priority must be the transfer of the proven methods of classified intelligence analysis, to the world of unclassified information.

Steele calls it a "critical priority" to transfer these advanced analytical techniques and methodologies into the hands of ordinary citizens. This is part of Steele's larger vision for creating an open source network of NGOs, academic institutions, international organizations and potentially individual citizens that could tap into the wisdom of the electorate and create the "possibility of revolutionizing governance by revolutionizing what government can know, how it knows it, how it decides, and how it communicates both its decision and supporting information."

Steele suggests creating a public intelligence "skunk works" that would "focus on creating public intelligence sources, softwares, and services that elevate the utility of all information to all citizens all the time."

There are many unanswered questions for how Steele's vision will be implemented by the coalition of private corporations that he's building, and how much government support and cooperation he will eventually receive. But I would argue that the press should have some role to play in this type of coalition because it sounds very similar to the public interest mandate that the field of journalism aspires to fulfill.

The press is facing an economic and credibility crisis as it attempts to reinvent how it creates and delivers its information products. Wall Street pressures are pushing the newspaper industry toward implosion by forcing cutbacks and shrinking the resources available for journalists to gather the news -- let alone to add even more complexity to how they analyze and make sense of the endless stream of facts. The industry is at a crossroads: it must change or die.

There happen to be many similar dot-connecting challenges facing the US intelligence agencies, where reform has been hindered by an obsession with secrecy as well as by the business models of vested interests that are more focused on "esoteric collection systems" than on figuring out how to make sense of the hoards of collected data.

This post is intended to explore the parallels between these challenges and how solutions to all of them can be found in the converging trajectories of Open Source Intelligence and Participatory Journalism. As Steele says,

It is essential that operational, logistics, acquisition, and other information be managed as a coherent whole, not in isolation from classified intelligence. Sharing and sense-making, not hoarding and secrecy, are the watchwords today.

The opposite of information hoarding is collaborative participation, and the opposite of secrecy is transparency. Blogging is pushing journalism to be more participatory and transparent, while Steele's Open Source Intelligence initiatives are doing the same in the national security domain. In both cases, the cooperative principles of Open Source hold the keys to unlocking the wisdom of the crowd and the trust of the electorate.

The post looks at the following issues...


The public interest mission of the press is being compromised by Wall Street's bottom-line pressures on newspapers, which are causing staff cutbacks and making it even more difficult for journalists to adequately verify information.

Former journalist Dan Gillmor sizes up the future of the newspaper industry this way:

It's painful to watch a business I care so much about commit slow suicide this way. But the financial writing is increasingly on the wall for an industry that simply can't figure out how to handle its challenges.

There will be a serious loss to society if daily newspapers -- or at least the community watchdog function they still fulfill, despite their well-chronicled flaws -- were to disappear or be disrupted while a new business model emerges. I don't know if we need newspapers (though I still read them avidly). We damn well need what newspapers do.

Jeff Jarvis relays what Paul Steiger, the managing editor of the Wall Street Journal, said about the demands that are driving the business models for this new media environment:

Whatever the business model, in order to keep getting paid, people in the blogosphere or traditional media would need to do at least one of two things very well… either provide uniquely broad credibility, which will still have value even in this revolutionary world, or uniquely exciting argument… You have to at least do one of them or you’re not going to get paid.

National newspapers and news wire services have been providing the "credibility" that allows public affairs bloggers and editorial writers to even have "uniquely exciting argument." Credible fact-gathering and news reporting is the backbone of the political blogosphere and critical to a well-functioning society, but this foundation is growing weaker and weaker, as former NBC News and PBS president Lawrence Grossman described to me.

A good argument can be made that there's not enough coverage, there's not enough news gathering that's done. There are not enough forces to gather the news, to widely get at the international authorities and others who are more remote than the folks who are trying to steer the news in a particular direction. It's never enough. It's always inadequate.

Jay Rosen has even suggested that there may not be a business model for discovering, discriminating, distilling, and disseminating knowledge.

"There is not a law of God that there needs to be a business model for everything. There may not be a business model for the Internet. The Internet may just be part of life."

Richard Sambrook, the BBC's Director of Global News, described the BBC's perspective to me:

One of the things about the BBC is because we're publicly funded, we don't have to look at the bottom line. We don't have to come up with a business model that works all of the time. And also, because we're publicly funded -- we're a public service broadcaster, our remit is to look at the public interest, and look at how we add value to the audience, and we also have a specific remit to innovate. So I think that we've been able to experiment with elements of social media with a greater freedom than perhaps some commercial news organizations either in the States or elsewhere have been able to do.

The BBC has had a lot more freedom to experiment with participatory journalism and to provide the necessary resources for newsgathering and investigative journalism because it receives public funding to do so.

US news organizations do not have this type of foundation, and so they are struggling to implement more cooperative and participatory models of journalism that increase interactivity while also producing credible news.

The promising news is that Steele is proposing an "Open Source Information System -- External" that would divert some of the Department of Defense budget toward investing in a public intelligence network, because information operations could provide a non-violent alternative to military conflict.

The Echo Chamber Project has some unique insights as to how the world of open source intelligence and participatory journalism could collide.

I've written before about how deception detection techniques could be incorporated into journalism, how Richards J. Heuer's Analysis of Competing Hypotheses (ACH) technique could be used to coordinate decentralized citizen journalism, and how ACH could provide a bridge between objective fact-gathering and subjective analysis within this New Media Ecosystem Flowchart.

These types of analytical techniques can certainly provide the means for verifying information, and they could help journalists explain, and audience members digest, complex issues.

However, a number of obstacles are preventing journalists from adopting these techniques. First, these analytical techniques and methodologies have historically been locked behind the secrecy firewall of the US intelligence agencies. Newsrooms also have limited resources to invest the time and money in such systems. And finally, journalism education does not teach much of the math or critical thinking skills required to use them.

However, if Steele's vision of a public intelligence initiative comes to pass, then there would be more of an effort to make these types of analytical techniques available through open source technology, and to put them into the hands of people who could use them to think through complex issues of concern to them. The technology can help lower the barrier to entry so that people can use these analytical techniques and gain value from them.

As the news agencies become more transparent and collaborative, there will be more involvement from citizens who are mathematically literate enough to use the tools and spread them throughout the culture.

My Collaborative Filmmaking schematic is one of the first stepping stones on my roadmap towards building an open source infrastructure with Drupal that could facilitate these types of more sophisticated analytical techniques.

As I've described before and discussed with developers here, here and here, the playlist mechanism will be the key software tool that will allow individuals to place sound bites into sequences.

The playlist feature could facilitate a collaborative decision-making mechanism by accumulating the network effects of individual decisions and discovering which sound bites are interesting. It is a filtering mechanism for discovering popular sound bites -- which will be very helpful for editing together a film -- but a "popular" sound bite is quite different from a "true" sound bite. Evaluating the validity of the sound bites is more complicated.
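As a minimal sketch of this kind of popularity filtering (the playlists and sound-bite IDs below are invented for illustration, not data from the Echo Chamber Project), counting how many contributor playlists include each sound bite surfaces the popular ones while saying nothing about their truth:

```python
from collections import Counter

# Hypothetical playlists: each is an ordered list of sound-bite IDs
# chosen by one contributor.
playlists = [
    ["sb_04", "sb_17", "sb_02"],
    ["sb_17", "sb_09"],
    ["sb_17", "sb_02", "sb_11"],
]

# Count how many playlists include each sound bite -- a crude
# popularity filter, not a measure of validity.
popularity = Counter(sb for pl in playlists for sb in set(pl))

for sound_bite, count in popularity.most_common(3):
    print(sound_bite, count)
```

The network effect here is just aggregation: each individual sequencing decision contributes one vote, and the editor reads off the consensus.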

Steele criticizes Google's search algorithm because it "substitutes popularity for relevance." In other words, Google's PageRank algorithm takes an objective count of the number of inbound links a website has, without any qualitative analysis of the validity of its content. This lack of accounting for validity creates a loophole in which an organized group of people can pollute the search results by "Googlebombing" a website, making it possible for rogue bloggers as well as spammers to artificially inflate a website's PageRank and popularity.
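To see how such link-counting behaves, here is a heavily simplified PageRank-style iteration (the link graph is invented, and Google's real algorithm involves many refinements beyond this). Note how a coordinated cluster of pages lifts a target's score purely through inbound links:

```python
# A toy link graph: each page maps to the pages it links to.
links = {
    "A": ["B"],
    "B": ["C"],
    "C": ["A"],
    "spam1": ["target"],  # a coordinated cluster all
    "spam2": ["target"],  # pointing at "target"
    "spam3": ["target"],
    "target": ["A"],
}

def pagerank(links, damping=0.85, iters=50):
    """Power-iteration sketch of the PageRank recurrence."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

ranks = pagerank(links)
# "target" outscores each spam page purely because many pages
# link to it -- the count is objective, but blind to validity.
```

The loophole in the post above is visible here: nothing in the computation asks whether the spam pages have any qualitative merit.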

The benefits of Google's search efficiency for discovering information far outweigh the costs of the noise that's created from this type of collective action, but the lack of transparency opens up the possibility for deliberate manipulation of Google's search results.

This brings us to the perennial questions of "How do you trust information from the Internet?" or, for that matter, "How do you trust any information?" Evaluating the accuracy, validity, and credibility of information depends on the implicit reputation of the author and the context of the situation. Needless to say, it remains an unresolved problem, and I'm sure the classified world of intelligence analysis has many insights into how to resolve it.

But finding credible information is a dilemma that Google and Wikipedia have temporarily solved by giving weight to the perception of truth rather than the actual truth -- a criticism that is echoed by the often-cited essay "The Amorality of Web 2.0."

I've criticized Wikipedia's Neutral Point of View collaboration principle because "it gives equal weight to partisan subjective beliefs -- even when a comprehensive set of facts can clearly disprove one side or the other."

One bottleneck keeping the truth from becoming more relevant than the perception of truth is the lack of analytic frameworks that provide an intuitive system for organizing and analyzing complex sets of information. As Richards Heuer explains in his book Psychology of Intelligence Analysis:

Simultaneous evaluation of multiple, competing hypotheses is very difficult to do. To retain three to five or even seven hypotheses in working memory and note how each item of information fits into each hypothesis is beyond the mental capabilities of most people. It takes far greater mental agility than listing evidence supporting a single hypothesis that was pre-judged as the most likely answer. It can be accomplished, though, with the help of the simple procedures discussed here.

The analytical framework that Heuer is describing is the Analysis of Competing Hypotheses that I've discussed before.

The interesting thing is that if the playlist mechanism is implemented in the way that I've described, then it could also facilitate Analysis of Competing Hypotheses evaluations.

For example, take a look at this Analysis of Competing Hypotheses matrix provided in Chapter 8 of Heuer's Psychology of Intelligence Analysis, published online by the CIA:

This type of table could be formed by using "a playlist of facts E1 to E6" as the rows, and with "a playlist of hypotheses H1 to H3 that describe this playlist of facts" as the columns.
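Here is a rough sketch of how such a matrix might be represented and scored (the evidence labels and consistency ratings are invented for illustration). Following Heuer, the key move in ACH is to judge hypotheses by how much evidence is inconsistent with them, rather than by how much supports them:

```python
# ACH-style matrix built from two "playlists": evidence items E1-E6
# as rows and hypotheses H1-H3 as columns.
# "C" = consistent, "I" = inconsistent, "N" = neutral / not applicable
matrix = {
    "E1": {"H1": "C", "H2": "C", "H3": "C"},
    "E2": {"H1": "C", "H2": "I", "H3": "C"},
    "E3": {"H1": "I", "H2": "C", "H3": "N"},
    "E4": {"H1": "I", "H2": "I", "H3": "C"},
    "E5": {"H1": "C", "H2": "I", "H3": "C"},
    "E6": {"H1": "I", "H2": "I", "H3": "C"},
}
hypotheses = ["H1", "H2", "H3"]

def inconsistency_scores(matrix, hypotheses):
    """Count inconsistent evidence per hypothesis; lower is stronger."""
    return {h: sum(1 for e in matrix if matrix[e][h] == "I")
            for h in hypotheses}

scores = inconsistency_scores(matrix, hypotheses)
# The hypothesis with the fewest inconsistencies survives best.
best = min(scores, key=scores.get)
```

Note that evidence consistent with everything (like E1 here) carries no diagnostic weight, which is exactly the insight the matrix format makes visible.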

An anonymous e-mailer familiar with the ACH methodology sent me the following feedback on my post on how ACH could be applied to journalism, and it is a great description of the benefits of the ACH framework:

The benefit of ACH for your purpose is that it provides a sound conceptual framework for collecting and organizing information. A simple model of how most intelligence analysts work involves three steps. When given an assignment, analysts (1) search for information, (2) assemble and organize the information in a manner designed to facilitate analysis, and (3) analyze the information to make an estimative judgment.

Most of the benefit from ACH comes from the first two steps. The requirement to identify and examine a full set of hypotheses and to focus on rejecting hypotheses drives a much broader search for information than busy analysts would otherwise pursue. The organization of that information in a matrix format with hypotheses and evidence organizes the information in an analytically useful way. It decomposes the problem into its component parts, and gets it down on paper in a simplified form that is easier to deal with.

Your program would have to provide a mechanism for collecting suggestions for hypotheses, with a single issue manager, or whatever you want to call this person, who takes the inputs and organizes them into a coherent set of hypotheses. And then some way of collecting, presenting, and commenting on the evidence. This will be a difficult software challenge. Because you will have so many diverse sources of input, I don't think the software now being developed for the Intelligence Community would be of much help.

The software challenge is not as difficult or intimidating as this e-mailer describes, considering that the open source CMS Drupal is already modular enough to collect, present, and comment on the evidence.

Drupal's node system allows for comments; there is built-in folksonomy tagging functionality (which needs to be expanded to collaborative tagging); and the playlist module can be expanded to include different types of nodes (such as flexinode or Content Construction Kit types) so that it could be used to create lists of facts as well as lists of hypotheses that describe different lists of facts. It is also possible to implement features to track the reputation and identity of the users.
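As a schematic illustration only (this is not Drupal code, and every field name here is invented), the data model being described might look something like this: typed nodes, playlists that sequence node references, and a hypothesis playlist that points at the fact playlist it describes:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Node:
    nid: int
    node_type: str  # e.g. "soundbite", "fact", "hypothesis"
    body: str
    tags: List[str] = field(default_factory=list)  # folksonomy tags

@dataclass
class Playlist:
    title: str
    item_nids: List[int]                    # ordered node references
    describes: Optional["Playlist"] = None  # hypotheses -> facts link

# A playlist of fact nodes, and a playlist of hypothesis nodes
# that is linked to the set of facts it tries to explain.
facts = Playlist("Observed facts", item_nids=[101, 102, 103])
hypotheses = Playlist("Competing hypotheses", item_nids=[201, 202],
                      describes=facts)
```

The `describes` link is the piece that would let the same playlist machinery used for sound-bite sequences double as the rows and columns of an ACH matrix.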

As of now, the implementation of something like the Analysis of Competing Hypotheses sits further down my roadmap, in Phase 7. I first need a proof of concept for using the playlist mechanism to facilitate collaborative editing. But something like ACH is on my radar screen and should eventually be implemented.

So it's important to note that something like ACH is not the first step on my roadmap -- it isn't critical to producing a film, but it would help ensure that the editing decisions and the information within the film have been vetted and are more credible and trustworthy.

However, the larger questions are: "Why would a volunteer want to use a system like the Analysis of Competing Hypotheses?" "What will motivate people to dig around in the facts on any given issue?" "What is the itch that ACH can help scratch?"

Jay Rosen describes a shifting paradigm in journalism in which arguments provide the launching pads that drive people to search out the facts:

So in the mainstream journalism world, it is natural -- it is obvious -- that the first thing you need is reliable information -- news. And from that we can have analysis. And then further down in the transaction, there's opinion.

And so a well-rounded information diet begins with facts and news, moves to analysis, and later on opinion -- which is also the stages a journalist goes through in their career. You start off being a reporter. Maybe we'll let you do some analysis pieces later on. And eventually you become a columnist.

What blogging is doing is showing that that's just a convention. It's just a convenient way of dividing up the world. And while it may be true that people get their facts first, and then they kind of want some analysis, and then they move onto opinion. It also works in the reverse.

Lots of people get engaged first through argument. And it's argument that causes them to look for information. And to me this is one of the most valuable things about blogging. It's denaturalizing the journalist's view of how the world works. Because a lot of people want to enter into the public world through the eyes and the arguments and the ideas of bloggers. And it's from there that they go in search of news stories and information.

Rosen provides a very important insight: blogging culture is showing that people use arguments to create a desire for further investigation of the facts, and of what the truth is. A side effect is that arguments can skew the facts through a specific groupthink filter. The ACH framework could help resolve both of these issues by helping to overcome what Steele describes as "mind-set":

"Mind-set," as so many have documented, is a very powerful filter, able to block very strong signals if they are inconsistent with one’s preconceived notions.

In other words, people seek out information that reinforces the way they already see the world. There isn't an easy way to create an ACH matrix of facts versus different hypotheses that could help people think through complex issues. And so we're left with aggregations of facts filtered through the single hypothesis of trusted bloggers. Civic discourse suffers when so many different silos of news communities fall into the pitfalls of confirmation bias and groupthink.

There seems to be a lot of overlap between what intelligence analysts are expected to do -- provide information to support the policymakers within the US Government's national security establishment -- and what journalists are expected to do -- provide information to citizens so that they know which policy-making politicians to support.

Steele envisions that this overlap will eventually converge through Open Source Intelligence, and until that happens I think there are some interesting insights to be gained about where journalism needs to go.

Steele provides some advice on questions to ask when searching for professional analytic and decision-support services:

In evaluating those who offer "analytic" services, always ask them to present their models for analysis. More often than not, they will not have any, but will instead be relying on "bodies by the hour" doing cut and paste extraction and database stuffing. That is not analysis.

What are the "models for analysis" for journalism? I'm not even sure that there are any specific models for analysis. For professional analysts, Steele recommends:

Key personnel being proposed as analysts must demonstrate, apart from the required educational and experience credentials, an ability to break down a problem, create and test hypotheses, construct a research argument or finding, and itemize essential elements of information that are missing and that could, if found, help resolve uncertainty.

Ideally, this is what journalists should be capable of doing. Whether these skills are adequately taught and practiced is another question. One thing I discovered is that journalists are gun-shy about making political judgments because they really don't have the analytical tools that can cut through the public relations spin. As Jay Rosen told me:

The problem for our press is that, whenever possible, it wants to avoid making a political judgment -- sometimes for good reasons, sometimes for not-so-good reasons.

There is certainly a lack of mathematical training for journalists to be able to statistically represent proportionalities and relative truths. This anecdote from New York Times columnist Nicholas Kristof is very telling (via Atrios):

A year ago, I wanted to ornament a column with a complex equation, so, as a math ninny myself, I looked around the Times newsroom for anyone who could verify that it was correct. Now, you can't turn around in the Times newsroom without bumping into polyglots who come and go talking of Michelangelo. But it took forever to turn up someone confident in his calculus - in the science section.

The following passage from Steele indicates that statistical, pattern, and predictive analysis are pretty important components of making sense out of information.

Statistical analysis and pattern or predictive analysis and trend detection are very important aspects of modern [information operations] now that the center of gravity has shifted toward content analysis. The government should be very cautious in evaluating claimed capabilities where there is a heavy reliance on statistics packages or pre-packaged software. If the individuals engaged in the work do not have a very heavy mathematical background (multivariate analysis/data mining), then they are just blind users of software they do not understand, the equivalent of a student using a crescent wrench without the slightest idea of the physics underlying mechanics.

So multivariate analysis and data mining also seem to be part of the core skill set for understanding these types of techniques, because they provide the conceptual metaphors for describing and understanding complexity.
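As one tiny example of the kind of data-mining step being alluded to (the messages below are invented, and real multivariate analysis goes far beyond this), counting which terms co-occur across messages begins to shift attention from sheer volume toward relationships within the content:

```python
from itertools import combinations
from collections import Counter

# Invented "chatter" for illustration only.
messages = [
    "shipment delayed at port",
    "port inspection delayed again",
    "weather fine at port",
]

# Count how often each pair of terms appears in the same message --
# a first step beyond raw volume, toward structure in the content.
cooccur = Counter()
for msg in messages:
    terms = set(msg.split())
    for pair in combinations(sorted(terms), 2):
        cooccur[pair] += 1
```

Even this crude pairing already asks a different question than volume counting does, which is exactly the distinction Steele draws below.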

However, I think it's really important to note that this type of quantitative analysis can be extremely limited in what it can do to make sense out of information. Steele points this out as well:

One common mistake within [information operations] is to substitute quantity for quality, and analyze chatter in relation to volume, with no real opportunity for delving into the content.

It is much easier to crank information through algorithms that spit out a bunch of numbers, but it is much more difficult to create meaning and knowledge out of this type of quantitative reductionism. Steele describes the importance of informed qualitative insights:

The best pattern and predictive analysis is done against unstructured multi-media data, in the original languages, with a global network of [subject-matter experts] able to detect and highlight nuances in the context of the patterns found by automated analysis. Such capabilities are not linear in nature -- they detect anomalous clusters, and they also flag vacuums -- missing information that would normally be present.

The first thing to point out here is that it is the subject-matter experts who are able to understand the non-linear dynamics of an issue that may not easily conform to a reductionistic, quantitative analysis. They are the most qualified individuals to help reduce this uncertainty, but they are not being properly used by either our intelligence agencies or our journalistic institutions. Steele describes the problem with using experts within the intelligence world:

There is no substitute for subject-matter experts (SME). However, the current practice is biased in favor of SMEs who are captive within vendor organizations, and consequently just one layer removed from the bureaucratic mind-sets they are supporting. There is also a bias toward SMEs that are US Citizens and have clearances. This is not the most effective means of understanding the real world.

So there seems to be a combination of conflicts of interest and a lack of diversity in the experts used by the government. Both the intelligence agencies and professional journalists should strive to extract insights and analysis from the most qualified subject-matter experts from all over the world. After all, these people are much better informed on the specific issues than the self-interested politicians who end up calling the shots.

But the thing limiting journalism is its inverted-pyramid reporting convention, which gives too much headline play and credibility to politicians who may be expressing views that go against a critical mass of experts. For example, journalism professor Susan Moeller describes how the press blindly followed whatever the executive branch was saying during the build-up to the war in Iraq:

A lot of the problems with the Iraq coverage occurred because the media adhered to a "classic inverted pyramid style" where they prioritized what the most important person was saying. And what the most important person was saying was "Saddam Hussein has weapons of mass destruction." And perhaps later in the story, they would come back to you and say, "Well, there are some people who may challenge that." But they still lead with that false assertion by the President.

This type of journalism methodology leads to a situation where dissent is not seen as credible until a critical mass of "official sources" are expressing the view. Most often those voices are Congressional representatives. But even within the intelligence world, Steele describes how difficult it is for dissenting perspectives to be considered:

Overloaded policymakers, and the all-source managers of analysts who serve them, do not like to be made uncomfortable by iconoclasts and mavericks. Not only does "the system" not search for such individuals, it actively shuts them out.

Open Source Intelligence can provide potential solutions to these issues by making the process more transparent and holding policymakers more accountable. Steele explains:

Early Warning that is classified can be safely ignored -- there are no political consequences for pretending the intelligence does not exist. In contrast, well-structured, well-documented public intelligence, ideally with strong visuals, can have a "CNN effect" on policymakers, and force them to at least consider some form of action.

Again, the fields of open source intelligence analysis and participatory journalism are on a trajectory to collide. These two fields can learn a lot from each other, and I hope to explore this intersection more with The Echo Chamber Project.
