CEAUSSIC: Anthropologists and Analysts

Robert Albro, CEAUSSIC Chair

The AAA’s Ad Hoc Commission on Anthropology’s Engagement with the Security and Intelligence Communities (CEAUSSIC) continues its work. Our main activities at present include: (1) writing a report to the AAA on the widely and hotly debated Human Terrain System of the U.S. Army (due by the fall); (2) editing a casebook illustrating the diversity of practicing anthropology, including associated ethical questions, with a primary emphasis upon the security sector broadly conceived; and (3) providing support for the AAA’s ongoing ethics process. In an effort to keep our work transparent and part of the public and disciplinary discussion of all of the above, CEAUSSIC will also be contributing a monthly entry to the AAA’s blog. Each entry, written by a different CEAUSSIC member, will address topics that have arisen or that we have been thinking about, and we hope you will join that continuing discussion here on the blog.

Anthropologists and Analysts
Posted June 8, 2009 by Robert Albro (American U), Chair of the AAA’s Ad Hoc Commission on Anthropology’s Engagement with the Security and Intelligence Communities.

Of late there have been no new revelations about the Army’s Human Terrain System, and journalistic reportage on the program has ebbed. The debate on HTS among anthropologists has slowed, too, either because it is waiting for new developments before ramping back up or because most, if not all, of the critical points about HTS have already been made in one form or another. We now have the benefit of the body of work of the Network of Concerned Anthropologists, the investigative reporting of freelancers like John Stanton, internal military critiques from the likes of Ben Connable and others, Roberto González’s informative published pamphlet, the work of bloggers on Savage Minds, Culture Matters, and elsewhere, documentaries such as James Der Derian’s soon-to-be-released “Culture Warriors,” and many dozens of articles written by anthropologists and non-anthropologists, online and in print.

More, no doubt, will be said about HTS. And CEAUSSIC is in fact developing its own report on HTS for the AAA, which will include the results of a formal request for information from the program itself. But, given the ebb tide of HTS reportage, I want to address one misgiving I have had about the tenor of our discussion about HTS so far: that our almost exclusive preoccupation with this program – problematic though it may be – has meant that our discussion of anthropology’s potential engagement with the security sector as a whole has been far too narrow. HTS has too often served – inappropriately in my view – as a proxy for the whole. If this has trained attention upon the potential ethical crisis HTS represents, it has also perhaps inadvertently fed the impression that HTS is representative of what work in the security environment generally entails. But, minimally, that is misleading.

Aside from some additional discussion about the Secretary of Defense’s Minerva Initiative (e.g. http://www.ssrc.org/essays/minerva/) and expressed concerns about DoD’s new culture-centric counterinsurgency doctrine (e.g. http://anthroandwar.uchicago.edu/), we have remained deeply preoccupied with HTS at the expense of the heavy lifting still to be done in thinking about the diversifying roles and applications of anthropology outside the traditional trappings of the academy. These circumstances include the security sector, but perhaps paint a more prosaic and less dramatic picture than the HTS affair. If current debates over anthropology in the security sector represent another reprise of our periodic debate about differences between the discipline’s academic and applied pursuits, we aren’t talking enough (or specifically enough) about the potential transformations of, and changing relations between, these distinctions. Do they still matter in the same ways?

CEAUSSIC’s November 2007 Report (since accepted by the AAA’s Executive Board and available at: http://www.aaanet.org/pdf/FINAL_Report_Complete.pdf) made a point of noting the different kinds of work in which anthropologists are involved when part of the national security community, which most generally includes: policy work, cultural training and education, organizational study, analysis work, and operational roles. HTS corresponds only to the last of these. The rest are less controversial and more humdrum, and perhaps more familiar insofar as they more directly approximate activities engaged in by those of us who have made a career behind ivy-covered walls. Briefly, I want to take up perhaps the most humdrum of these, the role of “analyst.” This might seem like a case of déjà vu all over again, since various analysts have glossed their job-related activities to me over beers as the “same things you do in academia.” But how are they the same?

When I say “analyst,” I’m not talking about the kind for which you need a couch. Nor am I talking about so-called “financial analysts,” familiar to anyone from New York and no doubt keeping a low profile at the moment. I mean the ubiquitous kind, at least if you live in the vicinity of D.C. as I do, who are routinely glossed as “policy analysts” or “security analysts.” These kinds of analysts are, of course, literally coming out of the woodwork in D.C.: in and out of government, on K Street, among think tanks of all sorts, in the National Academies, making a living as beltway bandits, and across the security sector. This work involves both “analysis of policy” (what works, what doesn’t) as well as “analysis for policy” (what needs to be done). If the methods and activities of particular analysts vary, as a generic category of practice and employment there are broad similarities to which we can still point. And these include associated ethical questions, which are notably different from those of HTS-type “battlefield ethnography.”

Given that the U.S. government employs thousands of analysts (with many more kicking around as consultants), and that the HTS program employs a total of just over 400 people, what your run-of-the-mill anthropologist-cum-analyst does should really be a more compelling question for us than it has seemed to be. But so far we have not framed our discussion in terms that bring analysts’ activities into sharper relief. Though it is certainly not meant to be comprehensive, my start in this direction builds on interviews with degreed anthropologists now working as analysts at the State Department, in the military, in the intelligence community, and for Homeland Security. We shouldn’t forget, though, that more analysts work outside of the security sector than in it. Analysts are, for example, very much a part of growth in the professional human rights and environmental activist communities, performing tasks comparable to what we find in the security sector, if on a different subject. And this includes human rights professionals working as security analysts. As well, there are analysts at RAND who write at length on the state of funding for the arts in the U.S. This is a large and eclectic group of folks and, subject-wise, they are all over the map.

A basic characteristic of analysts, in particular intelligence analysts, is that they are not “collectors.” If analysts – at the CIA, Homeland Security, Brookings, Amnesty, or the Institute for Policy Studies – deal in information, they don’t generate this information themselves. The majority of the information they use originates from sources outside of the analyst’s control. As one intelligence analyst put it, “We have no contact with collectors.” Given that disciplinary objections to the HTS program have been expressed primarily as a consequence, to quote the AAA’s October 2007 statement on HTS, of the fact that HTS contractors are actively “collecting cultural and social data for use by the U.S. military,” this is a salient fact. Since garden-variety security analysts don’t do this, how might that alter the ethical picture for anthropologists working in the security sector who are more like analysts than Human Terrain Team members? I don’t think we have a good answer for this, since our debate so far has not really sought to encompass the broadest range of “practice” in the security sector, either in descriptive or ethical terms. Studies by anthropologists of analyst communities do exist, though (see Johnston 2005).

In an Anthropology News article several years ago, I noted that the construction of our discipline’s Code of Ethics around the exclusive assumption of ethnographic practice (that is, the ethics of data collection) often matches up poorly with what anthropologists and other social scientists are asked to do outside of the traditional academy, making it difficult for us to identify what kind of work it actually is. As I then stated the need: “We should pay attention to the pragmatics of the knowledge frames operating in the domains of security and intelligence” (p. 5). If data collection and data analysis are disconnected activities carried out by different people, and if the analogy of ethnography is ultimately incoherent for analysts, in what ways, then, is the work of analysts akin to academia? Well, most obviously, like anthropologists, analysts are a kind of hermeneut and a type of storyteller. But the kinds of interpretation and the conditions under which such work is carried out are notably different. How this is the case deserves closer scrutiny.

At the risk of belaboring the obvious, what follows is a short list of key factors that directly inform the hermeneutic and storytelling activities of analysts, which I also briefly compare with some traditional anthropological-type activities:

1) In CEAUSSIC’s 2007 Report we described the role of intelligence analyst this way: “In general analysts are tasked with the synthesis of variegated sources of intelligence data on countries, people or scenarios of interest, and specialize in the production of written reports and assessments” (p. 36). As analysts themselves put it to me, they primarily “read and write.” If these take a variety of forms, basically analysts are in the business of producing “reports.” They are located “between data and decision-makers,” framing the information for consumption. In fact one estimate puts the number of reports written per year just among U.S. intelligence agencies at 50,000 (Thompson 2006). Writing, in ways broadly social scientific and for different publics, is one shared activity of analysts and anthropologists, but with the former writing more often for decision-makers and the latter writing more often for peers.

2) If an analyst has a background in anthropology, s/he is also going to be part of an interdisciplinary environment. Analysts might have Ph.D.s but typically come through different varieties of M.A. programs in public policy. An M.A. is deemed a sufficient terminal degree in most cases. These programs do not prominently feature our discipline’s stock-in-trade, and the principal assumptions typical of the institutional environments in which analysts work can sometimes challenge the very legibility or relevance of anthropology as a contributing expertise. Even in cases where perspectives of the discipline are valued, as one analyst put it, “We don’t pretend to be anthropologists.” Basic disciplinary methods and concepts are most often introduced by anthropologists who come to an organization from the outside.

3) Methodologically, analysts are less concerned with how to acquire data than with how to interpret and arrange it. As such, people often talked of what to do with, and how to make sense out of, “raw data.” They also talked a lot about “how to read information.” Sociocultural anthropologists also do things like this, if usually with “field notes.” But “field notes” and “raw data” imply different things. The former are already a first-order interpretation of sorts, while the “raw” of the latter suggests that analysts will tend to err on the side of empiricism – of providing arrangements or compilations of what are assumed to be known “facts.” Anthropologists, in turn, have a history of questioning assertions about the self-evident status of information qua information.

4) Analysts also described their activities as fundamentally interpretive in nature. They variously referred to: “pattern recognition,” assembling, puzzle solving, assigning meaning, finding “patterns in the noise,” looking for “anomalies and outliers,” “looking for needles in haystacks,” sorting disconnected information, transforming raw data into descriptions, explanations, and conclusions for their consumers, telling stories, or, as Kelty and Marcus (2007) noted, “putting together little bits and pieces of highly disparate and diverse information.” Depending on the kind of analyst, they might rely on devices such as taxonomies to provide a standard or recognizable structure to their products. The process of interpretation is usually subject to procedural standards, as implied by soup-to-nuts terms like the “policy cycle” and the “intelligence cycle.” As a colleague who works with analysts observed, the ways that anthropologists organize their conclusions don’t “fit neatly into available reporting formats.” Both analysts and anthropologists engage in what James Clifford has usefully described as the movement between inscription and transcription, if in different ways.

5) Analysts bemoaned the impossibility of just keeping up with the “daily traffic” or “staying up-to-date,” and the ways this can forestall more substantive kinds of work. This chronic problem is exacerbated by the 24-hour cycle of providing up-to-the-second information to decision-makers. Analysts commented upon the risk of providing mere “reportage” to the detriment of in-depth analysis. If staying up-to-date in one’s field is always challenging, there exists no real equivalent to this problem among academic anthropologists.

6) While analysts indicated skepticism about their ability to predict, the reports of analysts are nevertheless often expected to include “forecasts,” “estimates,” and “assessments,” which usually incorporate recommendations for best courses of action. The extent to which forecasts are distinct from prediction is not altogether apparent. There is, however, pressure to produce accurate forecasts, which in the security sector often takes the form of data organized as inputs for computational models and simulations. It is unusual for anthropologists to have significant prior experience working with the modeling community, although some do.

7) Analysts typically do not define their own agendas. Instead, their questions are “specifically confined.” They are expected to respond to organizational priorities and to requests for information from decision-makers. As a rule, analysts do not generate individual works but contribute to collective shop tasks. Their training is usually on-the-job and tied to a given agency’s mission. Among the analysts I talked to, attention was given to creating the right “products” for specific “consumers.” Though perhaps increasingly the case in academic publishing, talk of products and consumers is not how anthropologists tend to refer to research outcomes and their written work. As well, while collaborations of course happen, anthropological researchers place great emphasis upon their autonomy, if no longer precisely in the mode of the rugged individualist.

8) Analysts necessarily think about how best to provide these products to primary consumers, typically policy makers, in ways primed for the audience. Analysts in fact dedicate significant effort to thinking about how best to convey information to their decision-making audiences so that they “get it.” This imperative largely determines the form of presentation, and “summarizing” is often substituted for “nuance.” Analysts also spend time translating nuanced academic knowledge into “policy speak.” As one analyst commented, “Nuance is a hard sell.” Paraphrasing Geertz, if anthropologists tend to maintain a very long acquaintance with very small matters, nuance is part of our stock-in-trade. So the interpretive priorities of the two communities, particularly with respect to the framing of conclusions, are very different and in fact sometimes at odds.

These points are all probably fairly self-evident. So, apologies to the horse. But I hope that they help to remind us that the “analyst” – as a generic all-purpose hook upon which we can hang those activities in which anthropologists are likely to engage outside of the academy which involve reading, interpreting and writing – is a kind of practice for which the analogy between ethnography and espionage is ill-suited. Even working as an intelligence analyst, we might say you can be a “spook” but that probably doesn’t make you a “spy” – at least as a covert collector of information. We would be well-served to at once recognize the very serious ethical implications of, say, “battlefield ethnography,” while also getting on with the broader conversation about the ethics of the fullest range of relevant practice. As Kelty and Marcus (2007: 3) observed, the “culture” of these kinds of analyst communities appears “remarkably similar to academia.” How this is and is not the case is important to establish as part of our ethics process.

Analysts will tell you about their predicaments: factual inaccuracy, research bias, information overload, groupthink, too much secrecy, navigating powerful institutions, and so forth. These issues are all publicly well-known and the topic of ongoing discussions among analysts themselves. From the anthropological purview, and for the anthropologist in these environments, we might add the challenges of: interdisciplinarity, working for others, dealing with different interpretive standards, what to do about nuance, promoting more critical approaches to the meaning of information, the lack of disciplinary legibility, skepticism regarding direct reportage, the ubiquity of preferences for simulations and models, and being handcuffed to consumer-driven priorities. As a discipline we should be having a lively discussion of all of the above and more.

When asked how anthropological perspectives might enrich their work, analysts noted a variety of issues, including: correcting for the problem of mirror imaging, better understanding “the other,” learning how best to incorporate “culture” into their products for decision-makers, developing new conceptual tools that avoid data dumping, understanding how people are related to one another, and improving network, discourse or media analyses. Even if these are uncharitably viewed as simplistic Anthro 101-type goals, we are not talking about ethnography here so much as, in the words of my CEAUSSIC colleague Kerry Fosher (2008: 10), applying “the conceptual apparatus of anthropology” to help analysts think differently about the world around them.

Given the side of the collection/analysis divide on which analysts work, the ethical priorities of “do no harm” and informed consent are less immediately relevant. We might take seriously, instead, what kinds of ethics apply to reading and report-writing. Analysts have access to eclectic sources of data. Intelligence analysts are usually “all source,” and this includes classified information like embassy cables, intercepted signals, and the new Intellipedia experiment. But most data is in fact “open source”: news reports, magazines, newspapers, government reports, peer-reviewed academic research, public surveys, press conferences, Google Earth, GIS data, databases, the legislative record, official data (e.g. the CIA World Factbook), blogs, discussion boards, web searches, and so forth. For the case of intelligence analysts – as distinct from other kinds of security or policy analysts – we might ask: in creating a report, what are the ethics of the use of classified information in anthropological terms?

This is not obvious. As analysts who are familiar with anthropology’s debates will tell you, their work is not “providing specific information on particular informants.” Nor are an analyst’s products usually classified so much as are some of the “sources.” Analyst shops readily recognize the virtues of transparency, and a dialogue on what transparency means in that context is worth having. An important part of such a discussion would be to track the implications of the need to distinguish ethically between disciplinary concepts and ethnographic content, between data collection and analysis, as separate jobs. Do we have the ethical language, in short, to address the ubiquitous, if mundane, community of analysts? Comments welcome.

References

Albro, Robert
2006  Does Anthropology Need a Hearing Aid? Anthropology News 47 (8): 5.

AAA
2007  Executive Board Statement on the Human Terrain System Project. October 31. http://www.aaanet.org/issues/policy-advocacy/Statement-on-HTS.cfm.

CEAUSSIC
2007  Final Report. Delivered to the Executive Board of the American Anthropological Association on November 4. http://www.aaanet.org/pdf/FINAL_Report_Complete.pdf.

Fosher, Kerry
2008  Getting Concepts into Practice: Lessons Learned from Work with Military Organizations. Unpublished paper presented at the annual meeting of the Society for Applied Anthropology. Memphis, TN. March 25-29.

Johnston, Rob
2005  Analytic Culture in the U.S. Intelligence Community: An Ethnographic Study. Washington, D.C.: Center for the Study of Intelligence.

Kelty, Christopher and George Marcus
2007  Open Source Experiments: What They Show about the Analysts’ Frustrations in Intelligence Communities. Anthropology News 48 (2): 3-4.

Thompson, Clive
2006  Open-Source Spying. New York Times. December 11.
