Expert Panel Synthesis (WIP)

This page displays both the panelist responses collected during the first week, and our attempt to group and synthesise those responses into statements that will form the basis of the Collective View. It is a work in progress, and will be revised to reflect discussion that takes place on Loomio in the second week of the Expert Panel.

Expert Panel Portal

We have selected a subset of topics for initial discussion on Loomio from the full list of statements below, and you will find the links to those discussion threads next to those statements. If there is a statement you would like to discuss that does not yet have a discussion thread, you are welcome to start a new thread on Loomio, or contact us and we can create one for you (hunt-lab@unimelb.edu.au).

A PDF version of this page can be found here. A PDF version of the responses grouped by respondent, rather than by theme, can be found here.



Nature of analytic rigour

      • Analytic rigour is only one component of good intelligence.
      • Analytic rigour (AR) is a necessary but insufficient enabler of good intelligence work. Even the application of AR in concert with other means will not guarantee success. AR is not a panacea.
      • An analysis that is deemed “rigorous” can still result in a poor intelligence product.
      • Analytic rigour is just one critical aspect of intell work, and there are many others. Its relative importance also varies with intell type (eg. tactical vs strategic).
      • AR must be understood in terms of the problem type, e.g. puzzles (to be solved) or mysteries (to be framed) and tradeoffs between timeliness and thoroughness.
      • There is no universally acceptable level of analytic rigor. Every situation demands/affords a context-specific level of rigor. Exceeding this threshold does not improve decision making.
      • Differentiate better between single- and multi-discipline analysis. Different factors determine rigor for the two.
      • There is no one standard definition of analytic rigour amongst our analysts – it means different things to different people
      • Complex and controversial judgements will likely benefit from high levels of analytic rigour. Simple and non-controversial decisions less so. Analytic effort can be tailored to meet requirements.
      • Rigor is NOT about using standard analytic methods or processes. Because intelligence analytic tasks vary dramatically, it means choosing one or more effective methods and using it/them well.
      • Rigor has two dimensions – the general, which applies to all products, and the specific implementation of rigor determined by the type of analysis. Implementation is not the same across types of analysis.
      • Rigour is a dynamic and interactive process, rather than an objective, static metric. Whether an analysis is “rigorous” depends on the context and social actors involved.
      • Analytic rigor will be different based on kind of analysis or the relative mix of concept and data, or creativity required. Framing analysis as science or art will affect how rigor is operationalized.
      • Rigor can take many forms. Formal standards inherently are rigid and narrow. Formal standards may not help and may be counterproductive.
      • Rigor has to be assessed from the context. An analyst yesterday commented on “rigor efficiency” being important, using ‘just enough’ resources. We refer to this as ‘right sized rigor’. A process
      • Rigor is fidelity to best practice. But there are many analysts who do many different things, so there are a lot of different kinds of best practices, and a lot of different ways to be rigorous.
      • Analytical rigor always occurs within an organization, country, & perceived world situation; requires executive to encourage ongoing standards, evaluation, and research, & much collaboration
      • COMMENT: It is not a question of ‘perfection’; it is a pragmatic achievement of ‘right sized rigor’, where efficient use of resources achieves the quality needed to satisfy the intelligence need.
      • COMMENT: The strong advocacy of AR can amplify tension between the ‘intuitive school’ and the ‘methodology school’ of intelligence practice.
      • Avoid normative prescriptions for enhancing analytic rigour – such as, “you just need to be more disciplined in the application of structured analytic techniques”.
      • Rigor and risks to it are dynamic in nature. Exercising rigor and learning to mitigate risks should take place throughout the analysis, not just during the review of the final product.
      • Being analytically rigorous does not require the strict application of a predetermined method or framework for every analytical problem.
      • Can be internally driven, with (good/natural) analysts instinctively exploring different options, but will be improved by a systematic approach with some external structure applied to thought.
      • Applying key concepts, analytic methods and techniques that have been tried and tested but also innovating to explore new and vital ways of collecting and interpreting data.
      • Rigor is not a singular measurement against an idealized post-hoc standard of ‘perfect’. Rigor has to be assessed from the perspective of making each individual decision made along the pursuit.
      • Analytic rigour requires a thorough approach with a willingness to constantly reassess assumptions and conclusions using new information.
      • Analytic rigour must be flexible and fluid, in order to adapt to the circumstances, not reduced to sets of methods or standards of training.
      • Analytical rigour is not a destination but an ongoing process of continuing improvement. It is achieved via an iterative process employing cognitive and methodological processes.
      • Analytical rigour is not a static attribute. As the security environment becomes more complex, ICs need to do more detailed work on the future analytical workforce to meet those challenges.
      • It involves multiple cycles of narrowing and broadening to prevent prematurely converging on a brittle conclusion.
      • COMMENT: “Analytic rigor is a holistic measure of the robustness of the process used to create intelligence, across the entire closed-loop performance of the Joint Cognitive System.”
      • The rigorousness of analysis refers to the analysis product (outcomes), procedure (rules/guidelines), and/or process (actual practice). A single standard will not be equally applicable to all three.
      • Rigor is an Emergent Holistic property of Analysis – It is a holistic property of the performance of the cognitive work along the Analytic Pursuit. Assessed by the quality of each atomic decision.
      • Analytic rigour is an emergent property of a system composed of people and technology embedded within specific intelligence organizational contexts.
      • Analytic rigour involves a holistic, integrated and transparent approach/process to a question or issue.
      • Analytical rigour has to be understood and supported by the entire management structure; it cannot simply be something that analysts are expected to get right on their own.
      • Analytic rigour is a term bandied around with no commonly accepted definition. Its bedfellows of academic and research rigour seem to have had more time and effort spent on defining them.
      • AR can erode the ‘art’ of intelligence, which is founded on sensemaking and insight production, often making progress through accident and serendipity rather than discipline.
      • Current conceptions of analytic rigour may be intentionally vague to shield practitioners from blame (despite evidence that this sort of strategy backfires).
      • Rigor does not seem to be the correct term for what we are talking about.
      • The term AR is problematic in that it is difficult to map to the traditional intelligence production model with its discrete ‘processing’ phase. Indeed, the best value-add comes from synthesis.
      • The words ‘analytic rigor’ seem ill fitting to the actual notion of analytic thoroughness. Rigor suggests that the thought process is linear and subject to some specific engineering workflow.
      • Analytic rigor is the quality of the thinking processes analysts apply to their topic of study. It implies that the analyst employs principles of science to non-scientific issues.
      • So it’s a mindset or disposition as much as a method, towards thoroughness, objectivity, honesty in applying skill to solve the problem.
      • COMMENT: Management must understand that analytic rigour is essential as a standard, but that it does not imply ‘correct’ or ‘infallible’. Managers must be accepting of remaining uncertainty and forgiving of judgements that are ultimately found to be incorrect to ensure that a culture of brave and frank analysis is fostered.
      • Analytic rigour can also be thought of as capability that is manifested at a corporate, team, and individual level
      • Implementing analytic rigour goes beyond adherence to analytic standards. It is primarily an internal capacity that must be consciously developed and cultivated by the analyst.
      • Analytic rigour is not a technique or process in itself, but more a mindset that needs to be encouraged and embedded at individual and organisational levels.
      • Rigor is an effort by an analyst or researcher to be as complete as possible in order to arrive at the most accurate assessment/results possible in conducting an analysis with integrity.
      • Analytic rigor is not the way mere humans usually think, seizing on the first plausible explanation, selectively gathering and applying evidence, and advocating a particular outcome.
      • Analytic rigour denotes a disciplined, considered and consistent approach to intelligence analysis aimed at maximising the quality and value of analytic products.
      • Standards are fixed solutions to (more or less) fixed problems. Rigour is an ambition that does not determine the specific tools, but rather the attitude toward the problem.
      • Analysis is conducted by people who are given the luxury of spending time on one particular area, and that knowledge is supported by processes that add rigour to the analysis.
      • Analytic rigour is not just about the quality of the analytical product, but about ensuring a reasonable way of arriving at that product.
      • Adherence to best practice processes, or at least to in-house systems, developed to ensure consistent and thorough analysis across products. Probably most correlates to the issue of ‘standards’
      • Rigor is not measurable based on accuracy. Rigor is a characteristic of a process while accuracy is a characteristic of a product. Optimally there is a correlation but only measurable in the aggregate
      • It is a major error to think that rigor means establishment of simple “standards.” Instead, as in academic research, use of methods consistent with methodological principles makes much better sense.
      • Analytic rigour is primarily a feature of the processes leading to intelligence products, not the products themselves.
      • Rigour is about the process, rather than the product, of intelligence analysis.
      • Standards provide goals; best practices (ideal behaviours that are encouraged) provide a pathway.
      • Analytic rigour is methodological. An individual adhering to analytic rigour standards does so mindfully and with purpose.
      • Some assume that analytic rigor must be validated in the same way as a scientific experiment. The metric should be that the result enables the consumer to take action no matter what happens.
      • Current conceptions of analytic rigour emphasize process accountability (e.g., thou shalt use SATs) rather than outcome accountability (i.e., accuracy monitoring).
      • ‘Right Sized Rigor’ is a measure of practical achievement of good decision-making while adapting to real world situations and pressures
      • Analytic rigour is better measured by the judgements that are wrong than by those that are right. If I made, using the UK yardstick, 10 “likely” judgements, then in theory 2-5 should have been proven wrong.
      • It is a lever for mitigating risk in critical decision making.
      • Analytic rigour should at least in part be judged in reference to the outcomes of judgements
      • Policed standards and procedures which ensure all assurers and analysts can effectively and consistently produce and test judgements. Inconsistency undermines morale and risks lowering standards.
      • Standards define the areas to be taught and evaluated. Standards should be adapted to the analytic application — they operationalize and communicate the organization’s definition of rigor/quality.
      • While important, external standards do not yield analytic rigour. One’s internal standard for rigour and professionalism is a better indicator of quality outputs and outcomes.
      • It is not just ‘checking the boxes’ on a set of standards.
      • Analytic rigour is not following a checklist. But without some level of guidance and structure, standards and quality will vary.
      • A misconception is that pure adherence to standards means rigour is in place. It is not the same thing.
      • Analytic rigour is not identical to adherence to tradecraft standards.
      • Analytical rigour is not standards – they are part of it but not all of it.
      • Establishing standards such as objectivity, accuracy, timeliness, relevance, precision, etc. is helpful, but standards do not necessarily contribute to, nor are they a replacement for, analytical rigor.
      • Analysis may be bound by certain requirements or standards, but rigor should not be measured strictly through ensuring compliance with these standards.
      • Standards are vital to establish a foundation upon which can be built flexible frameworks for delivering analytical rigour.
      • Rigour is subjective terminology. Research and some IC agencies are now trying to define it. Grounding it in professional analytical standards will help build common understandings about what it is.
      • For analyst: Excellent thinking with standards of objectivity, thoroughness, timely, logic, reliability, alternative hypotheses, transparency, assumptions, credibility, relevance, implications
      • Rigour vs Standards; standards set the minimum, are rigid, formulaic and process driven. Rigour should strive for excellence, embrace diversity of thought/ideas, encourages deep critical thinking.
      • Analytic rigor is more than analytic standards to be used in a tick-box culture. Think comprehensiveness and overall analytical methodology.
      • Analytic rigour is not the same as any one standard: the standard is merely an attempt to approach, conceptualise, and quantify the analytic rigour of actual practice.
      • The inherited ideas about and standards for analytic rigor are based on a scientifically obsolete understanding of the human mind as a little machine or computer
      • In fact, implementation of analytic standards can lead to low-rigour analysis. This happens, e.g., when it becomes a box ticking exercise implemented thoughtlessly to serve a bureaucratic expediency.
      • Like current uncertainty standards (e.g., Admiralty Code), a vague “laundry list” of factors contributing to rigour could limit analyst accountability and hinder post-mortem analysis.
      • Rigor is different from tradecraft standards. Tradecraft standards include objectivity and accuracy which are aspirational goals rather than standards. Rigor can’t achieve an aspirational goal.
      • I believe the formal standards documents of recent years, especially ICD 203, are simplistic and counterproductive. Intelligence did fairly well until the 1990s without them. Q: what changed?
      • Standards defined by ODNI are simple mechanical measures meant to lay out what can and cannot be counted and to connect that to specific collected material and previous analysis. This is trivial.
      • In the U.S. IC at least, the analytic tradecraft standards — codified in law and implemented by ICD 203 — formalize the requirements for analytic rigor. Some partners — like DHS — follow as well.
      • Fact, Evidence, Analysis ensures academic rigour and allows the analyst to back up their main points briefly and succinctly, following the CRAIC principles:
      • “Intelligence analysis must have the CRAIC principles to ensure academic rigour: Clear, Relevant, Accurate, Informative, Concise.”
      • To be rigorous is to achieve a level of sufficiency in the application of analytical tradecraft for an objective observer to be satisfied that the analysis is ‘good enough’ for the specific situation.
      • Elements of rigor exist on a spectrum as indicators of sufficiency of: objectivity, thoroughness, transparency, credibility, relevance, and replicability/reliability/validity
      • Rigor is needed to counteract the risk of shallow analysis including premature closure and other weaknesses. Rigor is not about adherence to any particular reasoning, algorithm, or process.
      • Analytic rigor embodies a critical examination of all components of an issue and/or question, the relationships between these components, and the underlying assumptions
      • Analytic rigor requires a delicate balance between thoroughness and relevance to the task at hand. The most thoroughly done analysis may fall short if it does not adequately address the issues.
      • Rigor denotes thoroughness, or completeness. In the intelligence field, it means making use of all available sources and accounting for all possible variables.
      • Rigour, for me, is thoroughness. It is the process of reaching and clearly articulating what was determined and how it was arrived at, whilst being open-minded and aware of possible biases.
      • The greater the amount of relevant, unbiased, reliable information the greater the likelihood of a sound decision. Information biases can be taken into account if known.
      • Simple example: regardless of how sophisticated a source or algorithm is, the output needs to be connected to other sources to identify conflicts and to build corroboration. Doing this increases rigor.
      • A scrupulous collection, analysis and synthesis of ALL the available relevant material.
      • Analytic rigour is producing a defensible and relevant judgement based on a consideration of the widest possible information within a predetermined timeframe.
      • Analytic rigour requires a consideration of as many sources of information as possible, a clear audit trail of an analyst’s thought process, and a delineation between facts, assumptions, and judgements.
      • Analytical rigour is making use of all available and relevant information from a variety of sources, corroborating information but also considering novel themes.
      • Validation from a range of sources; not taking information or data at face value; referring to historical data to make predictions about future events, threats, and risks. Triangulation.
      • Evidence must be deliberately examined, with consideration of alternative interpretations, reliability and authority, and possibility of deception
      • A necessary condition is the identification and deliberate consideration of alternative explanations (for past/current events) and alternative possibilities (for future events), before deciding.
      • Analytic rigor involves not only explaining the basis for the assessment but also how it might be wrong and what variables or developments might alter it
      • Analytic rigour takes into account multiple perspectives and possibilities. It requires collaboration and coordination to the extent possible.
      • Analytical rigour includes the consideration of alternative hypotheses and explanations; asking “what if we were wrong?” and taking a devil’s advocacy approach to flip something on its head.
      • Data + analysis + reasoning + judgment is necessary. For many situations, you also need quality unknown/unsuspected alternative explanations. Generating possible explanations outside the norm is a creative act; this is needed alongside rigor.
      • Developing sound hypotheses, competing-hypothesis generation and theory generation to test information and data, leading to better judgements about what the data means.
      • Genuinely analytically rigorous texts should take into account a diversity of viewpoints and possible takes on the subject matter, including the possibility of human error and flawed thinking.
      • It can be easy to appear to consider more than one hypothesis, where the alternatives are only minor modifications to the leading hypothesis. This is not a broad search for possible explanations.
      • It is looking at all the possibilities present, not just the first possibility or the one that follows the previous ‘analytic line’.
      • Must involve consideration of alternatives (hypotheses, explanations, meanings, interpretations etc), so can include both creative and critical thinking to generate and evaluate a variety of options.
      • Proving something is not happening is difficult. The absence of evidence is not evidence of absence. We must think about what we would expect to see if it were occurring.
      • Rigorous assessments address alternative hypotheses. Rather than articulating a “best estimate,” the goal of a rigorous assessment should be to assess the chances that multiple possibilities are true.
      • Analytic advice is often provided with a great deal of underpinning specialised knowledge not appearing in the information provided.
      • Analytic rigour is about developing a deep understanding of subject matter, becoming a SME but still recognising that there are valid alternative views. Accepting valid different/alternative data
      • Analytic rigour is not just about the use of expert opinion. Someone who is naive to the subject matter can challenge strongly held mindsets so the opinion of the novice is just as valuable.
      • COMMENT: A hypothesis: the shift in many countries toward current intelligence in roughly the 1990s led to the idea that relatively inexperienced analysts who could write well were adequate. This quickly became obviously untrue. “Standards” are designed to polish a bad decision rather than to reverse the decision by hiring experienced experts. The US State Department’s INR held to old standards, stayed small, and retains an excellent reputation for quality analyses.
      • Elements supporting rigour – contestability (review process), analytic tradecraft, accountability, reproducibility (record keeping of sources and methods), analytic standards, subject matter expertise
      • Most importantly, they would need to understand that analysis is only insightful if analysts understand their subject area. It is essential they possess relevant and accurate subject matter expertise.
      • The importance of Subject Matter Expertise. How can you ensure accurate academic rigour if you are unfamiliar with the subject area?
      • Analytic rigor is deliberate formulation of the intelligence question being asked, gathering and objective review of the available evidence, and clear, unbiased, and timely response to the asker.
      • Analytic rigour is not necessarily about getting the perfect or “right” answer. Imperfect information (quantity and quality), time constraints and other factors will always prevent this.
      • Analytic rigour is tied to the customer’s needs. This impacts on the intel question and the timeframe to address it. An analytically rigorous product that doesn’t meet a customer’s need is irrelevant!
      • It’s a continuous balancing act between thoroughness and efficiency.
      • Napoleon allegedly told his subordinates, “I can give you anything but time.” Rigor should be considered/assessed based on time and resources available.
      • “Do you want rigor or do you want *right*? It took decades to rigorously ground Newton’s calculus and Heaviside’s methods. ‘Shall I refuse… dinner because I do not… understand… digestion?’”
      • Analytic rigor has little to do with accuracy. Predictive analytic accuracy is impossible due to the warning paradox where actions taken make the predictions void.
      • Analytic rigour is a means to achieve an end: reaching valid conclusions
      • Conclusions which used to be true at one time, but have since changed, been overturned, or ‘overtaken by events’ need to be updated prior to distribution to a customer
      • Rigor cannot be determined by the accuracy of the analytic outcomes, a lucky guess may be correct, but it is not rigorous analysis.
      • The dual purpose of AR is to result in more accurate assessments and to more transparently display the justification of those judgements to the reader. Both purposes are equally important to impact.
      • Without objectively monitoring judgement accuracy, there isn’t a good way to tell whether requiring analysts to be “rigorous” is having the desired effect.
      • “If it disagrees with experiment, it’s wrong. In that simple statement is the key to science.” ~R. Feynman. Also, “you must not fool yourself – and you are the easiest person to fool.”
      • Rigour requires an awareness of common heuristic biases and where possible an attempt to overcome them.
      • A genuinely analytically rigorous text will successfully consider and account for the impact of the writer’s perspective on the content.
      • Analytic rigour requires a critical examination of the actual and potential biases within our information collection and evaluation process as this forms the foundation for any subsequent judgement.
      • Analytic rigour requires the analyst to be mindful about what s/he is bringing to the table (biases, etc.). That is, it requires a great deal of self-awareness on the part of the analyst.
      • Analytical rigour includes the consideration of alternative hypotheses and explanations; asking “what if we were wrong?” and taking a devil’s advocacy approach to flip something on its head.
      • Genuinely analytically rigorous texts should take into account a diversity of viewpoints and possible takes on the subject matter, including the possibility of human error and flawed thinking.
      • In this sense, rigor implies a healthy dose of humility as well – by ‘rigorously,’ or carefully making our uncertainty and our biases explicit and being willing to be wrong.
      • Rigour, for me, is thoroughness. It is the process of reaching, and clearly articulating, what was determined and how it was arrived at, whilst being open-minded and aware of possible biases.
      • Analytical rigour is not a destination but an ongoing process of continuing improvement. It’s achieved via an iterative process employing cognitive and methodological processes.
      • Analytic rigour is characterized by an ambition to reduce the impact of bias on the individual, group and organizational level.
      • Analysts must be constantly supported and encouraged to be objective, challenging and curious. Otherwise, they will tend to rely on past assessments and dismiss indicators which contradict them.
      • Analytic rigor is closely linked to analytic integrity and objectivity.
      • Analytic Rigour = Objectivity, thoroughness, relevance, logical and heuristic, truthful/accurate/believable, authoritative, anticipatory
      • Analytic rigour is not simply trying to be ‘objective,’ or following a set of procedures or tools without an understanding of how those tools actually contribute to rigour.
      • Analytic rigour is sometimes framed as a means of ensuring “objective” analysis. This obscures the fact that expert judgements made under conditions of uncertainty are inherently subjective.
      • Rigor is different from tradecraft standards. Tradecraft standards include objectivity and accuracy which are aspirational goals rather than standards. Rigor can’t achieve an aspirational goal.
      • COMMENT: Long debates about the definition of academic rigour may not have any significant impact on real world intell practice. This said, intell work is not ‘objective’ but inherently subjective (see Marrin’s argument). The aim is sound but ultimately unachievable & belief that intell staff are ‘objective’ is both common & potentially problematic.
      • A rigorous analytic methodology is a mix of inductive, deductive and abductive reasoning together with creative and critical thinking skills.
      • Analytic rigor requires critical thinking. Thus, we must define, understand, and apply the appropriate elements of critical thinking in our analysis. Analytical rigor isn’t merely being disagreeable.
      • Analytic rigour in intelligence agencies has vastly improved in the last 15 years, primarily due to an increased organisational training focus on critical thinking
      • Analytic rigour to me is about deep thinking, using logical reasoning and reflection as a central part of the evaluation of data and information that is enabled through
      • Application of structured analytic techniques (SAT) can encourage analytic rigour but does not guarantee it. Still requires analyst to demonstrate curiosity and criticality of thought.
      • Concepts like “critical thinking” and “insight” are important, but so loosely defined as to be meaningless.
      • Analysis must be effectively challenged on a routine basis. But agencies must find a means to make challenge constructive, without undermining analysts’ confidence and limiting their imagination.
      • The early part of framing an analytic scope, questions, and specialists to include is aided by collaboration with trusted peers with diverse backgrounds
      • Analytical rigour has to be understood and supported by the entire management structure; it cannot simply be something that analysts are expected to get right on their own.
      • Analytic rigour takes into account multiple perspectives and possibilities. It requires collaboration and coordination to the extent possible.
      • “A proof … convinces a reasonable man; a rigorous proof … convinces an unreasonable man.” ~M. Kac. What would it take to convince your adversary? Evidentially, that is.
      • Analytical rigor always occurs within an organization, country, & perceived world situation; it requires the executive to encourage ongoing standards, evaluation, and research, & much collaboration.
      • Analytic rigour is exhibited in the explicit argumentation and assessment of sources and information. It can be recorded and documented.
      • Rigorous assessments communicate uncertainty in clear and structured ways. This can involve the use of numeric percentages or clearly-defined “words of estimative probability” for all key judgments.
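As a concrete illustration of the previous point (an editorial sketch, not part of any panelist’s response): an assessment lexicon can pin each estimative phrase to an explicit numeric band. The bands below follow the commonly cited US ICD 203 likelihood table; using them as a simple lookup, and the function name `phrase_for`, are assumptions made for this example only.

```python
# Illustrative lexicon of "words of estimative probability" mapped to
# numeric bands (following the ICD 203 table; an assumption, not a
# standard prescribed by the panel responses above).
WEP_BANDS = {
    "almost no chance":    (0.01, 0.05),
    "very unlikely":       (0.05, 0.20),
    "unlikely":            (0.20, 0.45),
    "roughly even chance": (0.45, 0.55),
    "likely":              (0.55, 0.80),
    "very likely":         (0.80, 0.95),
    "almost certain":      (0.95, 0.99),
}

def phrase_for(p: float) -> str:
    """Return the estimative phrase whose band contains probability p."""
    for phrase, (lo, hi) in WEP_BANDS.items():
        if lo <= p <= hi:
            return phrase
    raise ValueError(f"probability {p} falls outside the defined bands")

print(phrase_for(0.7))  # prints "likely"
```

Publishing such a lexicon alongside the product lets readers translate “likely” back into a numeric range rather than guessing what the analyst intended.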
      • Analytic rigour is not about having high confidence in findings. An analyst can have low confidence in key judgements but still employ analytic rigour if they identify and understand limitations.
      • The best we can hope for is acknowledging uncertainty and becoming comfortable with it with adaptive forecasts that account for unacknowledged variables and unlikely outcomes.
      • If the analytical product to some extent rests on uncertain assumptions or speculative conclusions, this does not render it useless, but uncertainty must not be concealed by an illusion of certainty.
      • Clear declaration (and understanding) of areas of uncertainty upon which judgements are made.
      • Beliefs may properly belong in an analytically rigorous product, if they are accounted for as such.
      • Analytic rigour is not about increasing certainty. It is about understanding the strengths and weaknesses of a judgement to help decide where more effort might benefit, and supporting good decisions.
      • Rigorous assessments distinguish between probability (the chances that a statement is true) and confidence (the extent to which analysts have a sound basis for assessing uncertainty).
      • Analytical rigour is an overall approach to analysis that stresses systematic thinking combined with steps to make the rationale for judgements more ‘transparent.’
      • Is essential for strong argumentation (logic) and maximising the likelihood of producing an accurate assessment. Also essential for generating a comprehensive audit trail.
      • Elements supporting rigour – contestability (review process), analytic tradecraft, accountability, reproducibility (record keeping of sources and methods), analytic standards, subject matter expertise
      • Too much is still implicit; need to make all elements very explicit (certainly assumptions of all types) and hold up to critical view in clear daylight for all to analyze and critique.
      • Analytic rigour rests first of all on transparency in all aspects: assumptions, the nature of the data (and its potential shortcomings), and methods (as well as their potential shortcomings).
      • The willingness to show what you are thinking and how you got there is key for trust and making it better than just pure opinion.
      • Intelligence leaders and Commanders need insights into analytic process (i.e. measures of analytic rigor) to interpret trustworthiness of intelligence assessments.
      • Analytic rigour requires an open and honest explanation of the inferential process used to draw the judgement from your information base. Greater rigour will likely require greater analytic effort.
      • At heart, analytic rigour is about being able to explain our assessments and the process through which we came to them
      • COMMENT: My comments on ‘transparency’ as a key element of analytical rigour require elaboration, as this term is used in other contexts with somewhat different meanings. Overall, analytic transparency is aimed at making the information and thinking that underpins the analysis more explicit at all stages of the process. It involves the adoption of a range of tools and practices. At the analyst level, it requires a more conscious understanding of how and why the analyst reached a particular conclusion: a clear tracking of the relevant information used, the assumptions involved, and the thought process leading from information/assumptions to the conclusion (i.e. self-awareness of the cognitive process). This in turn should make the analysis transparent to the collaborators working with the analyst on an assessment, to the manager reviewing the analysis, and to members of other organizations who are involved in inter-departmental review. With a better understanding of how and why the assessment process reached a particular conclusion, it should also be possible to present the findings to the client in a more convincing form. By using greater precision in describing the judgement, there should be a greater chance that the client will interpret it in the way it was intended to be understood. Many of the tools that contribute to transparency (detailed source noting, specifically identifying the judgements in an assessment, lists of assumptions, key factors etc., identifying chains of logic, the use of numeric probabilities, and so on) are already used to some extent in many analytical organizations; what I am suggesting is that such tools be employed within an overall mental framework with the specific aim of making the thinking process more transparent at all stages.
      • A key element is analytic ‘transparency’ which strives to make clear to the analyst, collaborators, manager and reader the full range of information, assumptions, etc. that underpin the conclusions.
      • Analytic rigour requires a consideration of as many sources of information as possible, a clear audit-trail of an analyst’s thought process, and a delineation between facts, assumptions, and judgements.
      • Rigorous assessments describe the extent to which reasonable people can disagree with key judgments, and why different assumptions could plausibly lead to different conclusions.
      • Having identified Subject Matter Expertise, I’d also suggest that it is important to ensure the intelligence is easily understandable to non-Subject Matter Experts to avoid jargon.
      • The effectiveness of communication of results may impact how policymakers perceive analytic rigor, and we should further explore whether how results are presented is in fact a part of rigor itself.
      • Intelligence is about helping decisionmakers make better decisions. Analytic rigor means identifying core informational needs, choosing relevant data, assessing them, and communicating effectively.
      • At the highest level it is resistance to sincere criticism. Akin to how good scientific theories are resistant to sincere attempts to falsify.
      • Analytic rigour is manifest in the confidence, authority and comportment an analyst displays when briefing a superior. Put differently, rigour is the absence of “err, umm, ahh”.
      • Analytic rigour means that the analytic process and resulting product can stand up to objective scrutiny in regard to sourcing, methods, and conclusions.
      • Analytical rigour should be methodologically defendable but because of many unknown, uncontrollable variables and time limitations it cannot achieve the same standards as STEM or SBS research fields.
      • Analysis is Argument. Claims must be justified through proper argument and should be assessed against criteria of: Structure, Sufficiency, Acceptability, Relevance, Susceptibility to Rebuttal.
      • Analytic rigour increases the intellectual challenge of an analytic task. It pushes the analysts out of their comfort zone.
      • Rigor is an effort by an analyst or researcher to be as complete as possible in order to arrive at the most accurate assessment/results possible in conducting an analysis with integrity.
      • As well as being independent from government/politics, analysis must also be independent from the analyst’s own organisation. Analysts’ judgements may go against previous reporting.
      • A rigorous analytic product should not include any phrases or conclusions pushed by disinformation campaigns by nefarious or deceptive actors
      • I believe that creativity is a key component of analytic rigor… being able to imagine alternatives and options can be taught by learning to think of the ends of axes and asking why, to get at base drivers.
      • Analytical rigor doesn’t prevent or inhibit creative thinking. Rather the two are complementary and mutually supporting not mutually exclusive.
      • data+analyze+reasoning+judgment is necessary. For many situations, also need quality unknown/unsuspected alternative explanations. Possible explanations outside the norm are a creative act; need this with rigor.
      • Applying key concepts, analytic methods and techniques that have been tried and tested but also innovating to explore new and vital ways of collecting and interpreting data.
      • Analytic rigour is, in large part, thorough abductive reasoning, which permeates intelligence analysis. High analytic rigour is in other words good abductive reasoning on issues in intelligence.
      • Analytic rigour is exhibited in the explicit argumentation and assessment of sources and information. It can be recorded and documented.
      • Analytic rigour is using reasoning to accurately determine what conclusion can be drawn from the available information, and to do so as precisely as is warranted by the information.
      • Analytical rigour is the effective use of processes, methods, tools and techniques to show logical and coherent reasoning.
      • Dedication to demonstrating a clear chain of logic. Ensuring that all contentions are supported by evidence and relevant argument.
      • Justified True Belief. Analysis is about making truth claims. In practice this means justifying belief. We do this using informal logic.
      • Some quantifiable criteria for analytic rigour can be established, but this is an insufficient definition. Some other aspects may be overriding; for example, faulty reasoning with a false conclusion.
      • Analysis is the Trivium. Grammar-Logic-Rhetoric. It isn’t complicated. Transform language statements (Grammar), into a deliverable product (Rhetoric), using the engine of Reason (Logic)
      • Analytic rigor is deliberate formulation of the intelligence question being asked, gathering and objective review of the available evidence, and clear, unbiased, and timely response to the asker.
      • Analytic rigour is tied to customer’s needs. This impacts on the intel question and the timeframe to address it. An analytically rigorous product that doesn’t meet a customer’s need is irrelevant!
      • Analytic rigour relies on normative standards and expectations for the analytic product, procedure and process. It cannot be based on purely descriptively derived notions or conceptions.
      • Analytical rigour is key to analysis being accepted by customers
      • AR can devolve into internal stage gates more concerned with satisfying organisational conventions rather than satisfying the needs of the intended consumer.
      • Clarity of mission/topic. Analytic rigour begins with ensuring a clear understanding of the brief and scope of work, and an understanding of the risks to be addressed from the customer’s perspective.
      • Know the Consumer. Some like stories, some like pictures, some like numbers. Know your audience.
      • Rigour improves quality but rigour alone is insufficient to persuade decision makers.
      • The most rigorous analysis in the world has little to no value if no decision-makers have any interest in it. Client need should invariably be established early on.
      • The ordering of two perspectives on a conclusion and possible recommendations can be determined by the stance of the customer without losing rigor, while increasing receptivity to the conclusions.
      • Analytic rigour requires accurately representing the strength of the evidence and its limitations to make warranted assessments.
      • Everybody says intelligence is about increasing confidence, but sometimes it is about reducing overconfidence by reminding decision makers of complexity and randomness.
      • Analysts’ judgements and analysis should be as specific as possible, even if this results in lower confidence or going further beyond the intelligence than is usual.
      • Focus: risk, probabilities, subjective weighing of alternatives, declaring unknowns, uncertainty, & unclarities, especially of sources & synthesis, often with narrative.
      • Analytical rigour is about reducing uncertainty and improving the validity and reliability of probabilities that support analytical judgements.
      • Understanding the level of analytic rigour allows us to better outline the strength of a judgement. This might make decision making more complex rather than less, but is key to effective decisions.
      • Rigorous assessments describe the extent to which analysts’ judgments could change in response to new information.
      • Applying structured analytical techniques does not ipso facto equate to analytical rigour.
      • Rigor is about analytic process under limited time, resources, noise, & uncertainty. What increases or decreases rigor? What is sufficiently rigorous? What extra would be
      • 1) Checking key assumptions. 2) Thinking of alternative hypotheses. 3) Seeking inconsistent rather than consistent data. 4) Identifying drivers that underlie the issue. 5) Understanding the context.
      • Elements are: 1) identify key facts, 2) understand relevance, 3) apply concepts to specific situations, 4) organize efforts, 5) synthesize parts to derive meaning, & 6) assess validity of the process.
      • COMMENT: Analytic rigour is comprised of multiple elements. Inclusion of the term ‘analytic’ could be taken to imply that it only applies to the narrowly analytic element (i.e. the inferential development of the information base). However, it should incorporate evaluation of: the information base; the inferential process; the appearance and communication of the findings (i.e. are they drafted and presented in a clear and transparent manner).
      • COMMENT: Must use clear communication, written, oral, visual, multimedia, and probability; avoid bias; identify gaps
      • Confronts info-overload, data storage & organization, political pressure, complexity, denial & deception, cyber-war, security, info-war, moles, & existential risk to humanity
      • It’s not: guesswork, bias, cultural/agency blindness, incomprehensibility, or a mere collection of A1 human sources or academic citations.
      • Misconceptions of analytical rigour: that there is a formula/algorithm for it, that it can be taught in a week or from a book, that only those with degrees can do it, that it can’t be illustrated with diagrams/pictures/charts.
      • Rigour requires hard thinking, creativity, tenacity, self-reflection, openness, sensitivity to outliers, a fondness for argument, humility, methodological training, curiosity, and perspective.
      • Simplistic measures, such as arriving at the “right” conclusion, time spent on analysis, number of hypotheses, or the tools used, may not encompass the full breadth of what rigor actually entails.
      • Thorough, consistent, reliable, accurate analysis with the ability to aggregate across many people, & make all clear & explicit; this demands a structured automated reasoning tool with all elements transparent.
      • Thoroughness is achieved through skilled navigation of multiple dimensions of analysis, to include data gathering, source diversity and credibility, hypothesis generation, corroboration, and synthesis.
      • ‘Confidence’ judgements are too vague. The multi-attribute metric provides a means to define rigor that is sharable – open to debate/critique – and actionable/contextualized to the kind of analytic work.
      • My studies show 8 dimensions define rigor given the above. It is operationalized by the ‘supervisor’s dilemma’ critiquing judgment. The multiple attributes provide a model of sensemaking in analysis.
      • COMMENT: Just want to footstomp the critical importance of finding and developing the middle ground where the quantitative and the qualitative can flourish and reinforce the quality of the analysis.
      • The joy of analytic rigor is that it is the clarity that comes from combining the discipline of the quantitative with the narrative flow of the qualitative.
      • Analytic rigour cannot be entirely automated.
      • Analysis is often a non-linear process. The sort of intellectual reductionism required to imagine workflows flies in the face of actual cognitive processes, unlike nonlinear streams of consciousness.
      • Hidden gems: one “small” item that runs counter to current thought or that is very quiet among the noise. It can drive a new path. Need a way to ensure these are ferreted out and supported, with assurance.
      • Defining our taxonomy is a necessary first step, so we don’t talk past each other. For instance, many analysts use the term “critical thinking” but apply the term as if it means being skeptical only.
      • There is no agreed professional or domestic standard. Even where standards exist on paper, and people claim they are used, it’s often not true.
      • Analytic rigour is the willingness to ask questions when others have fallen silent and are staring at their watch.
      • Analytic rigour, to paraphrase an idea from operations analysis, is the sine qua non of intelligence work (Mathieson, 2000). But work is needed to clarify what this specifically means for your org.
      • Analytic rigour is the ability to think and talk in paragraphs. It is the ability to discuss an issue descriptively, predictively and prescriptively.
      • Half the time, rigour yields perspective and understanding. The other half it yields insight or surprise. Absent these outcomes, one is absent rigour.
      • It includes various elements, including a mental approach to analysis, as well as specific practices and tools.
      • Common misconception about analytic rigour would be that more information – more intelligence collection – leads to better analysis.
      • There is no set methodology or template for products. Analytic rigour must have a template or control measures in place that limit the way intelligence is written or briefed.
      • Analytic rigour means writing policy relevant assessments not policy driven assessments.
      • Analysis is Spatial. Pure verbiage becomes conflated and indistinct. Premises, claims, objections, rebuttals, are difficult to parse. Arranging these constituents spatially allows decomposition.
      • Analytic rigor does not just happen. It has to be trained formally in a classroom setting AND be reinforced in the workplace with on-the-job application. It is not just journalism applied to intel.
      • Uses many disciplines, models, and methods from the natural, cognitive & social sciences, engineering, tech, & history
      • We don’t use a methodology to convert qualitative intelligence into a quantitative probability estimate, so the analytic yardsticks used are finger-in-the-air, what sounds right, not a calculation.
      • Synthesis is challenging, but quite possibly the most important aspect of rigor.
      • It is an open-minded, thoughtful approach to an analytic issue.
      • Rigor is not the same as tradecraft. Tradecraft is how to do the job … essentially informal doctrine. But tradecraft can be implemented rigorously or not. So rigor is something other than tradecraft.
      • Sometimes the word “rigour” is understood to mean “inflexibility” or “rigidity,” and I would say that’s something analytic rigour is NOT.
      • There is a distinct gap between policy and leadership expectations for ‘rigor’ and what can actually be accomplished by an analyst.
      • Intelligence can never be truly complete, otherwise, it wouldn’t be intelligence – it’d be a known fact, or in other words, science.
      • Concepts like “critical thinking” and “insight” are important, but so loosely defined as to be meaningless.
      • Analytic rigour is not in tension with insight. Analysts need to apply rigour to their analysis of evidence to draw the greatest insights from it.
      • A common misconception around rigour is that the analytic process is a scientific process. While some issues lend themselves to a scientific process, many do not.
      • Misconceptions permeate an engineering approach, what’s needed is actual science. Hypotheses that can withstand refutation or not and are used to inform more hypothesis generation.
      • Base rates are an intuitively powerful idea, but I am yet to see anyone explain in detail how they can be applied by the average analyst to a typical intelligence question
      • COMMENT: Given the extensive experience of others on the panel on related subjects, and my relative lack thereof, I’m sure that any contribution I could make to this phase of the study would rate as obvious.
      • Intuition is not bad. Consider chess masters, who intuit the right move from a board position, while computers “brute force” check every possible move. Yet intuition comes from deep experience.
      • Of course, that’s usually impossible, particularly for human affairs – so we narrow our variables to those we consider most significant, and select our sources with a bias, conscious or not.
      • COMMENT: Still part art and part science. Can (and should) bend it more toward science with automated tools. My company, findingQED, was created with this and the issues above in mind. Look forward to discussing.
      • COMMENT: I hope to be more helpful next week.
      • Thinking like a statistician is helpful; doing math like a statistician is not necessary unless information can only be made sense of through statistical analysis.
      • The best rigorous minds have historically been found in science and humanities (in my view!). Why is that? Can it be brought to bear on this enterprise?
      • COMMENT: Telling analysts to be “more rigourous” is like telling kids to eat more vegetables.
      • COMMENT: I prefer the word anticipate vs predict. Anticipate has an active response connotation to it, without a commitment to perfection. I don’t know what the enemy will do exactly, but I anticipate his most likely future actions and have a plan which accounts for whatever he does. The key is to not have unknown unknowns.
      • COMMENT: The intelligence analyst has some additional challenges beyond that of most others. While many struggle with lacking complete information, the intelligence analyst has the additional burden of working against an adversary who is deliberately concealing information and/or providing false/disinformation. Therefore, the intelligence analyst must have a rigorous process to account for this.
      • COMMENT: On the conversion of qualitative intelligence into quantitative probabilities – I have seen one agency use a model of decision trees, where for a given scenario they map out how many possible actions there are, and of those how many would result in their judgement (e.g. 15/20 end results from different decisions end in our judgement, so it is a ~75% chance). This runs a risk of being falsely considered statistically rigorous or being overconfident, but it is at least a start.
      • Decision makers can confuse illustrations and mathematical equations for rigor.
      • You realize you are tugging on a strand of a bigger issue: what is the intelligence enterprise today and who does it serve?
      • Analysis is normally provided from a mix of different security classifications; findings are implied at lower classifications so that the data that provided the supporting findings is not mentioned.
      • Consider how the profile of the typical analyst has changed in 75 years – a function of an expansion in numbers, changes in higher education, diversity, etc. This matters.
      • One of my favorite articles from grad school was about horse racing handicappers. They were no better at predicting with 10 or 15 points of evidence to consider than with 5, but they were more confident.
      • Improving analytical rigour should, where relevant, use quantitative and qualitative methodologies, but there will always be limits to how they can be used given the complex security environment.
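One comment above describes an agency converting qualitative judgements into probabilities by enumerating a scenario's possible decision paths and counting how many end in the assessed outcome. A minimal sketch of that counting approach, with the branch counts taken from the comment's own 15-of-20 example (the function name and equal-weighting assumption are illustrative, not from the source):

```python
from fractions import Fraction

def outcome_probability(total_paths: int, matching_paths: int) -> Fraction:
    """Naive decision-tree estimate: the share of enumerated end states
    that result in the assessed judgement. Implicitly assumes every path
    is equally likely -- exactly the overconfidence risk the comment notes."""
    return Fraction(matching_paths, total_paths)

# The comment's example: 15 of 20 enumerated end states support the judgement.
p = outcome_probability(20, 15)
print(float(p))  # 0.75, i.e. the "~75% chance" in the response
```

Weighting each branch by an estimated likelihood, rather than counting them equally, would be the obvious next refinement.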

Factors contributing to analytic rigour

      • Cognitive biases are detrimental, and very hard to avoid, but of great influence on rigour.
      • Awareness of bias in thinking – rigour will be improved if analysts are open to the idea that their thinking may be skewed by their experience or situation and take steps to address this
      • Awareness of cognitive biases. I have seen good analysts discredited in the eyes of managers because of recurring themes of obvious personal biases creeping into their work. Needs training & feedback
      • I – Independence – captures the aspiration for analyses to be free of bias and prejudice. Whilst difficult to achieve in practice, much can be done to mitigate the risks of bias and prejudice.
      • Bias can be good. A learned bias (one acquired through experience) such as preference for HUMINT over SIGINT, could serve an analyst well. We shouldn’t throw the baby out with the bathwater!
      • Objectivity. Inviting challenge, understanding the human tendency for bias and groupthink and taking steps to mitigate them.
      • Biases are a crucial challenge to analytic rigour. However, biases cannot be removed through training, merely mitigated. This concerns individuals and groups as well as the organization as a whole.
      • The study of cognitive bias is the single most detrimental factor affecting analytic rigour. We have paralysed a generation of professionals, stunted their ability to think effectively.
      • Heuristic bias has been present in every intelligence failure studied by academics. It is the most detrimental and the most difficult to overcome.
      • Perceiving a certain outcome is desired or a conscious or unconscious desire to reach a conclusion, including being wed to a certain theory or perspective, will reduce analytic rigor.
      • Understanding of cognitive bias and heuristics. Not just knowing the names, but understanding the mechanisms and how they can twist the analytic effort.
      • Bias, in any of its forms, will degrade the value of the intelligence product arising from the analytical effort. This is compounded by failure to collaborate, engage and innovate with peers in analysis.
      • Organisations should ensure interview and assessment processes accurately test the analytical skills needed within their organisation. Accepting poor-quality candidates reduces analytic rigour.
      • Futures analysis is hard to discern but very influential. This must again be based on strict subject matter knowledge.
      • Using data without stats understanding gives a false impression of rigor. Errors range from not weighting for population density/inflation to not understanding data issues, e.g. Simpson’s paradox.
      • The analysts’ ability to implement analytic rigour depends on their ability to think critically, creatively, and conceptually. Critical thinking is important but not enough in itself.
      • The enthusiastic absorption of “data analytics” without reflection can only be bad for rigor. What does it mean to look at an aggregation of thousands of tweets? What is that evidence?
      • Competence/skill: you’ve got to have the ability to do the core work properly. Did you collect the right data? Properly? Run the right analyses, correctly? “Basic” block & tackle.
      • Poor understanding of how to think about and communicate uncertainty and epistemic probability.
      • Persons’ personal characteristics are critical. These have been noted by analysts such as Grabo, Tetlock, Whaley, Betts, Soviet active measures personnel, and others. Recruiting such people is key.
      • Creative and critical thinking: a mix of skills and culture. How and when are both creative and critical inputs taken in the process?
      • Understanding of Argument. To include the nature of fallacy, weak argument.
      • Accepting and managing the inherent uncertainty through understanding gaps in coverage, analysis, and broader understanding. Without this you cannot have high levels of analytic rigour.
      • Reasoning skills: inductive, deductive, abductive or preferably a mix? Additionally, falsification is a good principle, but must not be dogmatic or naive.
      • The ability to be literate in research methodologies and testing analytical judgements through research and internal/external IC peer review promotes rigour.
      • COMMENT: Importance of being able to structure a clear argument: a judgment or conclusion supported by reasons that in turn are support by evidence.
      • The extent and quality of knowledge an analyst has is the single most important enabler of higher-level thinking and analytic rigour. You need quality bricks to build a sturdy wall.
      • Historical knowledge – the richer one’s knowledge of history, the greater the nuance, perspective and authority one can bring to an analytic task.
      • A keen sense and education in the subcultures in question is good – whether regional, professional or otherwise
      • Nuances of language and culture can confound analysis, even with a reasonable baseline of knowledge. A particular word, translated, may have a very different meaning and significance.
      • The lack of historical knowledge – it is unconscionable that we allow people who have no knowledge of history (or at least of 20th century history) to serve as analysts in a national security domain.
      • Hard to discern, but still influential is the cult of the expert. SMEs who use their personal authority to hinder objective, tradecraft-based analysis.
      • Mis-reading intent of others is a significant weakness in analysis; good analysts are skeptical of judgments of intent and do extra work to corroborate.
      • COMMENT: Better analysts have healthy skepticism leading to extra checks on multiple dimensions and they understand what constitutes a good check or way to get corroboration
      • Increase rigour: curiosity, time to think deeply, experience/exposure to subject matter, understanding of social science and the psychology of communicating with influence.
      • Curiosity – Experience suggests those most comfortable with uncertainty are also those most willing to push at the limits of their knowledge.
      • A lack of intellectual curiosity, an inability to self-reflect or interest in problem solving prevent analytic rigour.
      • Cognitive style: Individuals vary in dispositions (e.g., “foxes” vs. “hedgehogs” or natural inclinations toward overconfidence) that can systematically affect the quality of analysts’ judgments.
      • Analysts need to be mindful of their strengths and weakness, and be able to adjust their analysis in light of these. For example, biases can be very detrimental if not held in check or mitigated.
      • Innate intelligence and a capacity for discernment. The mind that can reflect on its own movements can figure out how to be rigorous. The problem is, this is a difficult-to-capture form of thinking.
      • Objectivity / honesty / integrity: All reflect Feynman’s essence of science: (1) test your theory; (2) don’t fool yourself. Most intelligence theories are vague, so can’t be tested well. Temper claims.
      • Clear and explicit understanding of the rigorous thinking required of the role. Practitioners need to have highlighted to them that THINKING is at the core of the job, embrace that and train for it.
      • A factor that improves analytic rigour is self-awareness on the part of the analyst, which helps to pull out biases, assumptions, and faulty thinking.
      • Mindset is vitally important in shaping AR, and is influenced by professional development, not necessarily undertaken in the intelligence field. Fortune favours the prepared mind.
      • The right mindset and personality are key prerequisites to determining the analyst’s ability to build a solid knowledge base and process knowledge in an analytically rigorous manner.
      • Analysts need to feel a responsibility toward the professionalization of their craft.
      • Imagination/curiosity – analysts need to be able to think of alternative explanations or interpretations, or seek them out, or at least recognise that alternatives are possible
      • Characteristics that could affect rigor include personality type of the analyst and degree of conscientiousness.
      • Comfort with uncertainty – not knowing is the default condition for analytical work. Those who are comfortable with uncertainty use it as a spur to learning. Those who are not, are a liability.
      • Actively trying to raise your standards through use of feedback loops and self-reflection. Analytic rigour will not improve without effort and conscious thought.
      • Richard Betts identified two types of analysis: “normal theory” and “exceptional thinking.” A core goal of this project should be finding ways to identify and nurture “exceptional thinking.”
      • We should not disregard the existence of ‘gifted’ intelligence officers, with innate but difficult to capture dispositions and approaches.
      • Innate ability – some individuals are naturally more disposed to seeking alternative explanations (more open-minded, inquisitive etc) but this can be hard to determine in current recruitment practices
      • Unusual mental/character traits and special skills are especially important in new situations and in analysis related to unusual topics, such as HUMINT targeting, warning, deception, counterintelligence, and covert political action. Standards such as ICD 203 hinder analyses in these “non-traditional” areas. For example, former US national intelligence officer for warning Ken Knight reported that in roughly 2008 his warning messages received poor marks from the ODNI because they contained too few facts. This standard is usually reasonable for current intelligence, but hurts warning analyses that inherently must be more speculative.
      • A lack of training in language and writing skills. To facilitate the thinking required and maximize its acceptance by the end user, the intel practitioner must have very well developed writing skills.
      • Bad: positivism in general. More data doesn’t mean better data. Is it even data?
      • Poor or inaccurate data collection driven by inadequate prioritisation or collection management; skating across the surface of the issue without delving deeply in to the question/problem and the data
      • G – Grounding in observed reality – this focusses on the need to draw on data to ensure that conclusions are soundly based. Theories still have a place here, when used in combination with data.
      • Lack of controls in data collection and interpretation will lead to poor reporting; analytic rigour also amplifies integrity and honesty in the judgements arrived at through analysis of the data collected.
      • Increasing analytic thoroughness requires data, data, and data. Coupled with teams of people who think orthogonally.
      • Also the very real limitation of simply being unable to process the vast amount of information available in the global datasphere, leaving any intelligence research incomplete almost by definition.
      • Quality and quantity of information. Good quality info is always good. Large quantities of info can hinder because of the sorting problem but, with the right tools, can improve rigour.
      • The availability of relevant information; the more the better.
      • The quality of analysis depends on two factors: The quality of reasoning and the quality of the evidence used. Both are needed, and you can’t compensate the lack of one with more of the other.
      • Analytic rigour is not necessarily about getting the perfect or “right” answer. Imperfect information (quantity and quality), time constraints and other factors will always prevent this.
      • Time is the biggest factor impeding analytic rigour. Most often, we are responding to tasks within short time frames. Completing a task on time outweighs analytic rigour
      • Rigor is relative to time/resources, but advanced preparation allows for more effective rigor when time constrained environments happen. Iterative training is essential to maximizing time available.
      • Time – rigorous analysis is less likely if there are time pressures on producing assessments. Can be mitigated by using structured approaches as well as innate or learned behaviours.
      • Speed is not a detriment – you can be rigorous at pace and still deliver for customers.
      • Time demands on analysts – the need to respond to tasking quickly – is the biggest constraint on analytic rigour
      • AR must be understood in terms of the problem type, e.g. puzzles (to be solved) or mysteries (to be framed) and tradeoffs between timeliness and thoroughness.
      • Time: Time pressure generally reduces opportunities for conducting rigorous analysis.
      • Unreasonable production requirements and timelines encourage cognitive and process shortcuts that are detrimental to analytic rigour
      • Timeliness – it can harm rigour if timeframes are short but a rigorous product that misses a deadline is irrelevant. Identifying the “80%” solution is key
      • What’s most detrimental to rigor is time, or rather the lack thereof. The pressure on analysts to rapidly provide ‘actionable’ intelligence forecloses on rigor.
      • We’re often left with only an approximation of rigor, or rigor to the extent allowed by time constraints.
      • Lack of time is a detrimental factor.
      • Timescales will impact analytic rigour. The less time to plan and implement an intelligence assessment process, the less likely there is to be high levels of rigour.
      • It’s oftentimes thwarted by production pressure, data access, or insufficient training.
      • An increasing demand for rapid analysis, to compete with social media and news outfits, will make analytic rigour harder.
      • The difficulty in constructing and communicating clearly structured argumentation to justify assessments. The time it takes to reason carefully and clearly, and to communicate arguments precisely.
      • Rigor can be harmed by organizational emphasis on speed or production volume rather than quality or fidelity to process/best practice.
      • Most people would say speed is the enemy of rigour, but I would argue an anti-intellectual environment has a greater impact on efforts to improve rigour.
      • Time allowed, quality of data inputs, quality of each analyst capabilities, number of strongly held opposing views, structured repeatable process with tools that aid individuals and the process.
      • Time pressures seem to decrease the perceived need for analytic rigour, as does the newspaper mentality that some groups use– wanting to get a “scoop.”
      • Time
      • Deception, misinformation, and disinformation are huge challenges to analysts today. Need to be very knowledgeable in vetting reliability of sources, validity of information, and impact on judgments.
      • Some sources are more trustworthy than others based upon the date in relation to a disrupting event, where early reports contain details which are lost in later reports, but which could be inaccurate.
      • Failing to properly validate information via three sources will likely lead to inaccurate and ineffective reporting; not understanding the requirement and not answering the “so what” question is a problem.
      • For each judgement, analysts should explicitly describe underlying sources and their quality (i.e., probability a given source is accurate).
      • Undervaluing and underutilising open source material can undermine analytic rigour.
      • Use of structured techniques for slowing down and externalising thinking;
      • Using a proven structured approach will significantly improve analytic rigor by helping the analyst engage System 2 thinking, encourage transparency, and promote teamwork.
      • I’m a methodologist and attracted to tools such as structured analytic tools, but I admit there is limited evidence of their effectiveness. Bacon spoke of the importance of ‘helps.’
      • Process is a check on intuition. Intuition is rapid, and should be relied upon more in a time constrained environment, but the process backchecks intuition’s assessment and raises flags if necessary.
      • Methodological approach: a rigorous analytical methodology is more than the use of single SATs. SATs must be put into a larger structured methodological framework in order to increase analytic rigor.
      • Most detrimental is the notion that use of standard techniques, such as SATs, solve all methodological challenges.
      • Awareness of the existence of analytic bias does little to decrease analytic bias. But we may jump too quickly to SATs as ‘the answer’, when the evidence is shaky.
      • Most of the structured analytic techniques taught haven’t been scientifically tested to determine their effectiveness.
      • For an example of SATs in a larger methodological framework, see Lars C. Borg, “Improving Intelligence Analysis: Harnessing Intuition and Reducing Biases by Means of Structured Methodology”, The International Journal of Intelligence, Security, and Public Affairs, 19:1, 2-22, http://dx.doi.org/10.1080/23800992.2017.1289747
      • Deconstruction and consideration of alternatives. Effective structured critique to analyse competing hypothesis, consider alternative explanations and identify key uncertainties.
      • Broad knowledge and reading or “outside-in thinking” helps see the topic from different perspectives.
      • Discussion/exposure to external bodies, such as private industry and academics, can assist as that is genuine expert input and is outside the analytic field, challenging our argument styles entirely.
      • Diversity of thought – inputs from a variety of perspectives (collaboration) and exposure of thoughts to challenge will improve analytic rigour
      • Prior analytic lines coupled with a desire to please policy makers, dysfunctional teams, and weak leadership lead to many failures. Clearly not an engineering-like problem regarding rigor.
      • Challenging intradepartment discussion improves rigour. This requires people to separate criticism of work from criticism of person, which can make it harder to enact.
      • Thoroughness: Bayes’ theorem by itself doesn’t require you to look for more evidence. Have you actively tried to kick holes in your theory?
      • Prior analytic or previously published corporate analysis is the most confining issue that needs to be addressed. It is easy to assume tomorrow will be like today, until it isn’t.
      • Objectivity. Inviting challenge, understanding the human tendency for bias and groupthink and taking steps to mitigate them.
      • Dialectical teamwork, e.g. through adversarial collaboration, can improve rigour by limiting tunnel vision and confirmation bias.
      • Over time, rigour will diminish if analysts’ managers do not effectively challenge them. They will lose respect for their managers and gradually stop producing fully researched and explored assessments.
      • Formal organizational requirement to implement tradecraft standards, reinforced by all levels. ACTIVE participation by leadership in analytic review. Fostering of a challenge culture.
      • Time allowed, quality of data inputs, quality of each analyst capabilities, number of strongly held opposing views, structured repeatable process with tools that aid individuals and the process.
      • A factor detrimental to analytic rigour is the generally-accepted assumption that is often cited but rarely questioned. This can take the form of “facts” that are understood to be true.
      • A poor relationship is one not conducive to truth. A positive relationship which leads to groupthink is just as bad, perhaps worse, than a poor relationship in which the senior leader discounts the intel.
      • Diversity of thought. Avoidance of groupthink by including people from a range of perspectives and ensuring effective challenge.
      • It is important for an explanation to shift from the beginning to the end of the analytic process based upon the evidence which is reviewed. Understanding evolves the frame.
      • Clear argumentative structure is what would most reliably increase analytic rigour; its absence is what is most detrimental to it.
      • ‘Lazy’ thinking harms analytic rigour. Even a basic approach such as argument mapping can highlight key vulnerabilities and substantially raise analytic rigour.
      • Logic that Pops. Let the consumer do the concluding.
      • The difficulty in constructing and communicating clearly structured argumentation to justify assessments. The time it takes to reason carefully and clearly, and to communicate arguments precisely.
      • In summary, use scientifically tested techniques to structure analysis, use a standard test to measure the critical thinking skills of analysts, require analysts to demonstrate critical thinking in their analysis, and use numerical probabilities in analytic assessments.
      • Transparency of the process behind a product can enhance the willingness to follow a more rigorous process
      • Total transparency in methods used – including, where appropriate “this was my best guess” improves the ability to give meaningful challenge and encourages stronger analysis.
      • Most haven’t tested their critical thinking skills. These skills don’t automatically transfer to their work. Moreover, analysts don’t regularly demonstrate how critical thinking was applied.
      • First and foremost, analysts should provide the most accurate judgements possible. When feasible, judgments should be falsifiable to facilitate systematic monitoring.
      • Oftentimes, the analyst is concerned with protecting their reputation, so they avoid being too precise or clear, to leave “room for maneuver” to explain what they really meant when they are surprised.
      • Analytical rigour is improved by using multiple methods or approaches (to corroborate findings).
      • Analytic rigor mirrors the analytical approach pursued, the more comprehensive and “mixed methods” alike, the more rigorous.
      • R – Repeatability – If an analysis effort is not repeatable, then how can a decision maker be expected to rely upon its products?
      • The ability to triangulate by approaching a question from different angles each of which relies on at least some distinct kinds of information that is originally independently examined.
      • Reliability. Where, if something was analysed by different people or with different techniques, the conclusions would be consistent.
      • Key assets are commitment to helping decision-making, expertise, and use of one or more approaches relevant to the analytical task at hand.
      • Collaboration: is there a culture of collaboration in small teams or is there a culture of single analysts?
      • Analytic rigour is supported by collaboration, ideally with a diversity of participants to overcome mindset biases. Over-reliance on a single SME can create blind spots and use of analytic shortcuts.
      • Groups that allow their members to engage in robust discussion typically do better than individuals in analyzing situations and solving problems.
      • Strategic analysts in particular need to engage with peers & SMEs for an analysis to be valid. But many managers & orgs do not understand how critical this is.
      • Too many analysts, whether contributing to a technique or co-authoring, results in compromise. The average of a wrong and a right judgement is not a better judgement. The balance is hard to strike.
      • More focus needs to be applied to cognition and team behavior to improve analytic performance. Not prediction but the actual day to day products analysts produce.
      • Also critically important is learning basics of probability and calibration of the unknown and uncertain. What I’ve learned is that takes practice–bettor’s odds and giving reasons after “because.”
      • Analysts should jointly communicate their level of uncertainty (i.e., a confidence interval bounding their best estimate).
      • Express analysis in terms of numerical probabilities rather than using ambiguous language or making binary statements. For example, “65% likely this is true” rather than “it is likely this is true.”
      • Poor understanding of how to think about and communicate uncertainty and epistemic probability.
      • Quantifying in the Correct Way. Probability. Bayesian, so that there is an understanding that new information matters.
      • Supporting factor: An environment comfortable with seeing the world in probabilistic terms and with updating thinking in light of new evidence, that is, the environment promotes Bayesian thinking.
      • Confidence statements, although providing a caveat for uncertainty in the source, are not comprehensive enough to ensure academic rigour.
      • Culture: Some analytic techniques (like quantifying estimative probabilities or incorporating output from prediction markets) simply seem “weird” or unusual. That can discourage rigorous analysis.
      • Weight of evidence. Always think in likelihood ratios, or their equivalent: if all theories equally explain the data, it is evidence for *none*.
      • In summary, use scientifically tested techniques to structure analysis, use a standard test to measure the critical thinking skills of analysts, require analysts to demonstrate critical thinking in their analysis, and use numerical probabilities in analytic assessments.
      • U – Uncertainty in data – uncertainties surrounding data, analyst understanding, and assumptions need to be captured and reflected in analysis.
      • R – Robustness of results – robust results are insensitive to uncertainties. Sensitivity analyses are a way of teasing this out.
      • Ability to Present. The final product matters.
      • Analysis is only as rigorous as its reporting. Attention shouldn’t be limited to the logical and dialectical qualities of the analysis, but also to the rhetorical aspects of the intelligence product.
      • Narratives can increase persuasiveness as compared to lists, but can actually decrease rigor when they are applied inappropriately and are ‘false narratives’.
      • Feedback (keeping track of hits and misses) increases rigour. Sadly, we rarely have time to keep score or conduct post-mortems.
      • Harm: inconsistent feedback from manager/team.
      • Objective and comprehensive feedback for analysts on their analytical performance, based on clear and comprehensive standards, and backed by appropriate training, can improve rigour.
      • Actively trying to raise your standards through use of feedback loops and self-reflection. Analytic rigour will not improve without effort and conscious thought.
      • Analytic advice is often very condensed as it goes through hierarchical rank reduction to fit into a small space of time, and the technical detail, such as rigorous processes, is often not provided.
      • Contrary to its purpose, coordination processes have elements that are detrimental to analytic rigour.
      • Political pressure to please senior consumers or decision makers needs to be managed by leadership rather than letting analysts figure it out. Many failures of intelligence are political failures.
      • A factor detrimental to analytic rigour is the political consideration, often disguised as “sensitivities” or “diplomacies”.
      • Managers should protect staff from political interference. Unless staff believe their assessments will be published, they will be less likely to make controversial judgements, diluting rigour.
      • Political pressures — either Partisan politicization or pressure from organizational policy. Lack of moral courage by analytic leaders to stand up to pressure.
      • Analytic independence – assessments should be based on the consideration of evidence, not preferences of the customers or politics etc.
      • Detrimental effects include subtle messaging that rigor is “not required” for some policy-related reason. Hands-off attitude by leadership. Toxic culture unable to address biases.
      • Integrity: the ability not to bow to external pressure to skew findings. However, this must be balanced with an open-mindedness to review conclusions in light of new information or challenge.
      • Opposing factors: association of one’s position on a given issue with an existing group (for instance, my view on issue Y is Z if I associate with X political party)
      • Political pressure can impact the analytical rigour of assessments throughout the intelligence cycle.
      • From experience, a senior boss who values speed and being ‘on message’ ahead of analytic rigour quickly undermines analytic standards.
      • Prior analytic lines coupled with a desire to please policy makers, dysfunctional teams, and weak leadership lead to many failures. Clearly not an engineering-like problem regarding rigor.
      • Expert judgements are inherently subjective, but analysts should refrain from advancing specific policy positions.
      • Externally, intelligence communities becoming increasingly politicised, almost like quasi-policy departments, is dangerous for rigour. Internally, IC leaders need to be much more part of promoting rigour.
      • O – Objectivity of process – calls attention to capturing how analyses are undertaken over time. Such analyses can of course be either subjective or objective.
      • Analytical rigour increases when analysts are effectively trained and feel: valued, supported and that their products are worthwhile as they have a meaningful impact.
      • Harm: lack of manager/team support.
      • The wellbeing of staff and their lives outside the workplace can impact their ability to exemplify analytic rigour (and workplace practices should compensate e.g. by multiple staff participating).
      • Analysts must be constantly supported and encouraged to be objective, challenging and curious. Otherwise, they will tend to rely on past assessments and dismiss indicators which contradict them.
      • The analyst must be supported in the cognitive work of managing the analysis, and be provided with adequate observability of the analytic processes to support error detection and recovery.
      • A supportive work environment where ideas and approaches are considered based on their merit and not necessarily on who or where they come from increases analytic rigour
      • Underperforming analysts should receive meaningful support – without any initial detriment to annual reports. Otherwise, analysts’ confidence and sense of purpose will decrease and rigour will drop.
      • Rigour is undermined when it is not valued and supported by management or clients.
      • A poor understanding of epistemology, and a lack of clarity and precision when discussing epistemic justification.
      • Sound theories about how evidence should be best interpreted as well as openness to the possibility that the apparently soundest theory might not apply in a specific instance. Context always matters.
      • Pervasive epistemological misconceptions reduce analytic rigour.
      • Quantifying in the Correct Way. Probability. Bayesian, so that there is an understanding that new information matters.
      • Supporting factor: An environment comfortable with seeing the world in probabilistic terms and with updating thinking in light of new evidence, that is, the environment promotes Bayesian thinking.
      • Supporting factors: Working in a “safe” environment that promotes asking “dumb” questions, reviewing commonly held assumptions, and experimenting with new ideas.
      • An open and inclusive workplace which promotes vigorous discussion is crucial for honest and genuine interrogation of a thought process which in turn is crucial for high analytic rigour.
      • Workplace culture and power dynamics can affect the ability of staff to feel like they can either express divergent views (in a text) or pursue a divergent path and negatively affect analytic rigour.
      • Opposing factors: Need for certainty, culture where changing one’s mind is discouraged, politics
      • Unexamined personal cognitive biases can be problematic but I think organisational group think and unchallenged institutional structures and cultures are more detrimental to rigour.
      • Treating judgments as tentative (hypotheses) and openness to critiques of these judgments; institutional structures that foster thorough, honest, helpful critiques.
      • Many staff & orgs don’t see the value of identifying uncertainties, specifying intell gaps, etc. Investigators can be particularly scathing, yet are often in charge.
      • Rank and your position in the hierarchical structure
      • Good tradecraft, experience and a culture of acceptance and encouragement of rigour.
      • There is a need to create and nurture a mental attitude towards analytical rigour; there is a need to build a ‘culture’ of rigour in an organization.
      • Culture: can either increase or be detrimental. Is there a tick-box culture or qualitative culture? Is quality measured by the number of products made or the content?
      • Some agencies evaluate performance largely on procedural grounds (e.g., using specific SATs) rather than judgmental accuracy. This can incentivize the appearance of rigor, not the real thing.
      • Rigor can be harmed by organizational emphasis on speed or production volume rather than quality or fidelity to process/best practice.
      • Most people would say speed is the enemy of rigour, but I would argue an anti-intellectual environment has a greater impact on efforts to improve rigour.
      • From experience, a senior boss who values speed and being ‘on message’ ahead of analytic rigour quickly undermines analytic standards.
      • So much is still kept close-in in classified world…can be very limiting. OSINT is helping break out of that. Need to be able to separate substance from sources and methods without divulging source.
      • Number of quality skilled eyes on it. Number of varied skills and different perspectives on it. Fluidity with which information flows up and down the hierarchy and across stovepipes.
      • Lack of access to the range of data resources is a detrimental factor.
      • Visibility of all known relevant items, to all who are involved with something to add; Visibility of alternative possibilities to all who are involved.
      • Work bottlenecks (e.g., waiting on data collection, processing, dissemination)
      • Keyhole access to data or information
      • It’s oftentimes thwarted by production pressure, data access, or insufficient training.
      • The ability to be literate in research methodologies and testing analytical judgements through research and internal/external IC peer review promotes rigour.
      • The lack of individual or collective accountability for poor analysis harms efforts to increase analytical rigour. Identifying good practice, and explicitly rewarding it, could increase rigour.
      • Seniority, experience or area expertise should not be seen as guarantees for quality. All analytical products should be reviewed and critiqued on an equal basis.
      • Evaluation of analytical processes and outputs against objective standards, carried out by qualified and independent assessors, with best practice recognised and rewarded, and poor practice exposed.
      • Peer review, if done properly, is a key factor increasing analytic rigour
      • Challenges to determining the enhancement or detriment of rigor include defining what the product is and how its quality (the result of rigor) is determined.
      • Agencies are not sufficiently resourced to check the accuracy of their assessments over time.
      • The evaluation of analytic rigour at all levels (self-review, peer review, line manager review etc)
      • Rigor can be increased with a forcing function (external review of process or product), or review like a Morbidity and Mortality conference in medicine (see Marrin and Clemente)
      • Formal organizational requirement to implement tradecraft standards, reinforced by all levels. ACTIVE participation by leadership in analytic review. Fostering of a challenge culture.
      • Functioning analytic ombudsman program, both in individual agencies and across IC — led by statutory ODNI ombudsman — can create environment to bring to light hinderances to rigor.
      • Tracking analyst quality is essential. Need to generate through training and events enough evaluations of success/failure in their assessments. Research why someone is more often correct.
      • The incentives for rigourous analysis are often weak. Events play out years after the initial event, there is no individual responsibility.
      • Feedback (keeping track of hits and misses) increases rigour. Sadly, we rarely have time to keep score or conduct post-mortems.
      • The review process within intelligence organisations negatively impacts analytic rigour.
      • Decrease rigour: predisposed views, subject bias, agency cultural groupthink, low tolerance for inaccurate forecasting, no peer/inter-agency product reviews.
      • Keep score. Tournaments have greatly increased forecasting accuracy by (1) simply making it visible, and (2) putting reputation in the game.
      • First and foremost, analysts should provide the most accurate judgements possible. When feasible, judgments should be falsifiable to facilitate systematic monitoring.
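Several responses in this cluster call for keeping score of hits and misses and making judgements falsifiable. The panel does not prescribe a metric, but one common scoring rule for probabilistic judgements (our illustration, not a panelist's) is the Brier score — the mean squared error between forecast probabilities and outcomes, where 0 is perfect and lower is better:

```python
# Illustrative sketch only: scoring probabilistic forecasts with the
# Brier score. Forecasts are probabilities in [0, 1]; outcomes are 0 or 1.

def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# An analyst said 90%, 70%, and 20%; the first two events occurred,
# the third did not. Lower scores reward both accuracy and calibration.
print(round(brier_score([0.9, 0.7, 0.2], [1, 1, 0]), 4))  # 0.0467
```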
      • Organisations should ensure the interview and assessment process accurately tests the analytical skills needed within their organisation. Accepting poor-quality candidates reduces analytic rigour.
      • Per “exceptional thinking,” a core question is whether people with such abilities must be found and recruited, or whether they can be trained. I think training potential is limited.
      • Persons’ personal characteristics are critical. These have been noted by analysts such as Grabo, Tetlock, Whaley, Betts, Soviet active measures personnel, and others. Recruiting such people is key.
      • Recruitment is critical, for both analysts & managers – but often done poorly. Critical thinking tests are essential for analysts as some people are just not suited to intell.
      • Analytical rigour is not a static attribute. As the security environment becomes more complex, ICs need to do more detailed work on the future analytical workforce to meet those challenges.
      • Recruitment of well-educated, high-performing individuals increases analytic rigour.
      • Wrong people hired to do the job. In my organisation it is often the job conditions that draw people to the analyst role – or rather the avoidance of other jobs – rather than a passion for intell.
      • Diversity of thought. Avoidance of groupthink by including people from a range of perspectives and ensuring effective challenge.
      • Diversity of thought – inputs from a variety of perspectives (collaboration) and exposure of thoughts to challenge will improve analytic rigour
      • Diversity of perspective conditions the level of analytic rigour.
      • Diverse groups are usually better than less diverse groups in solving problems. It helps if procedures encourage universal participation.
      • Analytic rigour is supported by collaboration ideally with a diversity of participants to overcome mind set biases. Over reliance on a single SME can create blind spots and use of analytic short cuts.
      • Increasing analytic thoroughness requires data, data, and data. Coupled with teams of people who think orthogonally.
      • Research indicates that heterogeneous groups are less susceptible to bias than homogeneous groups. However, heterogeneous groups can be more prone to conflict and more difficult to manage.
      • Number of quality skilled eyes on it. Number of varied skills and different perspectives on it. Fluidity with which information flows up and down the hierarchy and across stovepipes.
      • Lack of adequate training focused specifically on analytic rigour in intelligence work reduces rigour.
      • It’s oftentimes thwarted by production pressure, data access, or insufficient training.
      • Training: Analysts who are trained to use a broader range of techniques, or who are familiar with a broader range of cognitive biases, are better-positioned to form rigorous judgments.
      • Improve: Clear, consistent guidance. Both on conceptual issues such as accountability (how data is stored) but on larger issues, how the data is used.
      • Teaching methodology improves rigour, if its use is supported and encouraged by management. Otherwise, it dies on the vine.
      • Analytical rigour increases when analysts are effectively trained and feel: valued, supported and that their products are worthwhile as they have a meaningful impact.
      • Rigor is relative to time/resources, but advanced preparation allows for more effective rigor when time constrained environments happen. Iterative training is essential to maximizing time available.
      • In the last decade, agencies have devoted more resources to intelligence training, particularly on critical thinking
      • Experienced editors and editorial staff who can assist with analytical work not just editing is key.
      • The role & function of managers in delivering the right outcomes and outputs from intelligence functions is critical but almost always overlooked. Intell analysts are required to focus on the task. Managers need to focus on people & processes. Many just never understand the distinction or need for change. A fairly specific management style is also required, particularly for strategic teams.
      • Posting cycle and current time in rank affect analysis.
      • Improve and harm – role and focus. What is the AR priority ‘of the day’? This will be emphasised.
      • The location where analysis is undertaken can impact the analyst’s ability to be “fully functioning”; for example, a deployment overseas can introduce analysis fatigue and a higher impact of personal influences.
      • Wrong people hired to do the job. In my organisation it is often the job conditions that draw people to the analyst role – or rather the avoidance of other jobs – rather than a passion for intell.
      • Analytical rigour is not a static attribute. As the security environment becomes more complex, ICs need to do more detailed work on the future analytical workforce to meet those challenges.
      • Organisation culture plays a big role, especially where the ‘publish or perish’ mantra drives incentive and reward. Resistance to change also is a powerful demotivator for AR.
      • Negative drivers: organizational bias (always there), institutional politics, consequences to career.
      • Organizational protocols and expectations
      • The lack of individual or collective accountability for poor analysis harms efforts to increase analytical rigour. Identifying good practice, and explicitly rewarding it could increase rigour
      • Messaging from intell managers & senior Exec can strongly impact either way. What are staff instructed to do &/or rewarded for?
      • The multi-attribute rigor metric provides one answer; see the summary table in the PDF briefing (Woods/Zelik 2013) emailed separately.
      • Formal organizational requirement to implement tradecraft standards, reinforced by all levels. ACTIVE participation by leadership in analytic review. Fostering of a challenge culture.
      • Standards can be somewhat useful to provide guidance, common terminology common standards etc. Many agencies sink vast amounts of time here for little impact.
      • Some of the most important problems that analysts must solve have no standards/guidelines (that I am aware of). E.g., creating scenarios of a national/regional situation five years or more hence.
      • E.g., no standards/guidelines (that I am aware of) for evaluating key judgments about what an adversary’s leaders will do in particular circumstances, e.g. in info-war and cyber-war.
      • E.g., no standards/guidelines (that I am aware of) for judging other economists’ predictions in the 6-month to 5-year future.
      • E.g., no standards/guidelines (that I am aware of) for determining phase changes, such as from protest to riot to rebellion. Ways of evaluating these are scarce; creativity is needed.
      • E.g., no standards/guidelines (that I am aware of) for characterizing personalities, psychological profiles, and tendencies of leaders and their advisory teams.
      • Expectations of standards of rigor that exist in a vacuum can undermine the integrity of the product.
      • Don’t Clerk the Analyst to Death. Rubrics and standards are fine. Minimize them, because as important as they are to the analyst, they will not be nearly as important to the consumer.
      • Most of the structured analytic techniques taught haven’t been scientifically tested to determine their effectiveness.
      • An exaggerated focus on secrecy re. methods can lead to a risk of overestimating the efficiency of these methods, and can easily lead to stagnation in terms of development of new/revised methods.
      • Such standards and methods or assumptions for evaluating these standards(if they exist) need to be evaluated with serious research projects
      • We do not have enough agreement, nor do we have enough empirical research to accurately state what factors have the greatest effect – positive or negative. We need to pursue this!
      • In summary, use scientifically tested techniques to structure analysis, use a standard test to measure the critical thinking skills of analysts, require analysts to demonstrate critical thinking in their analysis, and use numerical probabilities in analytic assessments.
      • So much is still kept close-in in classified world…can be very limiting. OSINT is helping break out of that. Need to be able to separate substance from sources and methods without divulging source.
      • An exaggerated focus on secrecy re. methods can lead to a risk of overestimating the efficiency of these methods, and can easily lead to stagnation in terms of development of new/revised methods.
      • Secrecy – which of course is necessary in various ways and for various reasons – reduces analytic rigour in intelligence.
      • The analysis has a temporal component that often devalues it due to the complexities of different layers of security.
      • Messaging from intell managers & senior Exec can strongly impact either way. What are staff instructed to do &/or rewarded for?
      • Detrimental effects include subtle messaging that rigor is “not required” for some policy-related reason. Hands-off attitude by leadership. Toxic culture unable to address biases.
      • Externally, intelligence communities becoming increasingly politicised, almost like quasi-policy departments, is dangerous for rigour. Internally, IC leaders need to be much more part of promoting rigour.
      • Senior leadership which provides visible support for analytical rigour is key.
      • Leadership within the analytical community, the role modelling of good practice by junior managers, and active encouragement and recognition by senior leaders, can result in greater rigour.
      • Analytical rigor always occurs within an organization, country, & perceived world situation; requires executive to encourage ongoing standards, evaluation, and research, & much collaboration
      • Relationships are critical to analytic rigor. The quality of the relationship between the intel professional and the senior leader impedes or supports rigor.
      • Rigour is undermined when it is not valued and supported by management or clients.
      • Formal organizational requirement to implement tradecraft standards, reinforced by all levels. ACTIVE participation by leadership in analytic review. Fostering of a challenge culture.
      • Teaching methodology improves rigour, if its use is supported and encouraged by management. Otherwise, it dies on the vine.
      • A poor relationship is one not conducive to truth. A positive relationship which leads to group think is just as bad, perhaps worse, than a poor relationship in which the snr ldr discounts the intel.
      • Biases held by supervisors/managers that stymie analytic innovation.
      • A wide range of organisational pathologies can lead to poor analysis, even if one aims to improve analytic rigour.
      • A leader’s or client’s uncertainty about the product they want, and their inability to properly frame a question that will drive the analysis and production of an intelligence report.
      • Good direction, prioritisation and collection as well as careful and accurate analysis; use of techniques appropriate to the task and understanding the brief; considering alternative factors and data
      • Distraction of analysts, for example from parts of the ‘analytical picture’ that are not relevant at this particular junction, is detrimental to analytical rigour.
      • Other themes/findings: support for identifying conflicts and developing corroboration is missing from almost all technology insertions for analysis; yet supporting conflict/corroboration is central.
      • Technological tools (software) can help analysts in maintaining a high level of rigour, either by offering explicit help, or by implicit nudging and a smart ‘design’ of the analytical process.
      • Structured collection processes & consistent communication templates. Modern software tools should be employed and further developed to structure intelligence collection and synthesis.
      • Explicit decision support for self awareness of the analytic process is critical to allow the analyst to achieve ‘right sized rigor’ dynamically, in the context of real world conditions.
      • Implemented Brittlenesses are caused by improper technology insertion (from UI/UX design to back end embedded services) that must be replaced by effective JCS affordances.
      • Time allowed, quality of data inputs, quality of each analyst capabilities, number of strongly held opposing views, structured repeatable process with tools that aid individuals and the process.
      • The idea that computers eliminate human bias is illusory. Programs are created by humans, and humans typically make the decisions, so the human mind is both the obstacle, the solution, and the target.
      • The Joint Cognitive System designer (Cognitive System Engineers) must innovate elegant cognitive affordances to compensate for inherent brittlenesses; replace raw technology with Joint Cognitive System affordances
      • Computers can effectively aid analysis, but they require as much research and assessment of biases as humans do. A deep learning algorithm is essentially the digital analog of human intuition. It is not explainable (at least no one yet has come up with effective explainable AI), so is similar in its black box effect as a human mind is when intuiting an answer. Study of analytic rigor must include study of algorithms.
      • Factors that increase: establishing a clear framework or mental model to help analysts understand the goal of the analysis and the techniques for getting there. Analysis-dividing into component parts.
      • Analytic rigour as an outcome is affected by numerous different factors in complex ways.
      • The difficulty of constructing and communicating clearly structured argumentation to justify an assessment. The time it takes to reason carefully and clearly, and to communicate arguments precisely.
      • The analysis has a temporal component that often devalues it due to the complexities of different layers of security.
      • Navigating hierarchical, multi-dimensional tradeoff spaces
      • Analytical rigor always occurs within an organization, country, & perceived world situation; requires executive to encourage ongoing standards, evaluation, and research, & much collaboration
      • Latent Brittlenesses of differing types exist throughout the different ‘regions’ of the Joint Cognitive System. Under the proper conditions they ‘break’ to cause flawed decisions, damaging rigor
      • Detrimental – complex issues which are over simplified or not simplified enough.
      • Deception – the possibility of deception makes reasoning abductively extremely difficult.
      • The difficulty of constructing and communicating clearly structured argumentation to justify an assessment. The time it takes to reason carefully and clearly, and to communicate arguments precisely.
      • Factors that can improve or decrease rigorous analytic outcomes are tied to the type of analysis, i.e., an all source analysis versus a targeting analysis.
      • The nature of the modern problem space, especially growing complexity, novel ‘historyless’ developments, and ‘grey elephants’ call for holistic perspectives on AR that will be difficult to meet by rules and specifications.
      • Analytical rigour is not a static attribute. As the security environment becomes more complex, ICs need to do more detailed work on the future analytical workforce to meet those challenges.
      • Depending on the nature of the analysis being performed, inherent characteristics of the cognitive work may be the source of latent brittleness, low rigor waiting to emerge.
      • The risk of deception is often underestimated in a Western context, and is also relatively difficult to defend against.
      • Adapting to consistently changing demands and/or requirements
      • External vs internal: the external operating environment increases in complexity and uncertainty faster than internal responses and appreciation can adapt, leaving the internal agency worldview outdated.
      • Organizational structures and cultures that pit intelligence agencies against each other decrease rigour.
      • Availability of data, and the analytical processes that you are used to using, vary from location to location and department to department – there are no national standard methods.
      • What service you are in including a civilian with or without former para/military service.
      • Customers are often not interested in the analytic rigour that sits behind an assessment – just the key points (thus diminishing the importance of rigour in some analysts’ eyes)
      • Client orientation to intelligence is also important – where do they see value in the intelligence product. Intelligence has no intrinsic value, only the value assigned by clients.
      • Consumers of analysis do not demand sufficient analytical rigour. If decision makers held analysts and their leaders accountable for their output that could drive improvements in analytical rigour.
      • Rigour is undermined when it is not valued and supported by management or clients.
      • Hard to discern, but influential: the attention span/intellect/biases/assumed knowledge of the readership.
      • Customer desire/preference for greater transparency into how analytic judgments were reached helps encourage analytic rigour while a focus on the “answers only” has the opposite effect

Opportunities to enhance analytic rigour

      • The IC should develop an evidence-based means of reliably appraising and communicating source quality. The Admiralty Code (and spinoffs) appears to undermine communication fidelity.
      • More objective and rigorous evaluation and validation of secret intelligence sources, principally SIGINT and HUMINT. Greater focus on how to rigorously use these sources in finished intelligence.
      • Assessments should use numerical estimative language.
      • Analysis is normally conducted at high security levels and provided at lower levels, to the point where it can’t be provided at all. Having defined labels that add confidence for the end user would help.
      • Analysts should be encouraged to communicate their estimates numerically.
      • Establish tradecraft standards that promote clear communication of uncertainty with respect to key judgments, whether through numeric percentages or clearly-defined verbal terms.
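The tradecraft standard proposed above pairs numeric percentages with clearly defined verbal terms. A minimal sketch of such a mapping is below; the particular terms and ranges are our illustrative assumptions (loosely echoing published verbal-probability lexicons), not a scale endorsed by the panel:

```python
# Illustrative sketch only: map probabilities to clearly defined verbal
# estimative terms. The terms and ranges here are assumptions for
# illustration, not an agreed standard.

VERBAL_TERMS = {
    "remote":         (0.01, 0.05),
    "unlikely":       (0.05, 0.20),
    "even chance":    (0.45, 0.55),
    "likely":         (0.55, 0.80),
    "almost certain": (0.95, 0.99),
}

def describe(p: float) -> str:
    """Return the verbal term whose range contains probability p,
    always paired with the number itself to avoid ambiguity."""
    for term, (lo, hi) in VERBAL_TERMS.items():
        if lo <= p <= hi:
            return f"{term} ({round(p * 100)}%)"
    return f"{round(p * 100)}%"  # no defined term: report the bare number

print(describe(0.65))  # likely (65%)
```

Pairing the word with the number ("likely (65%)" rather than "likely" alone) preserves the nuance in L538 while keeping the prose readable.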
      • A more complete analytic feedback loop (e.g., what were the downstream outcomes of my analysis and how does that impact my next analytic endeavor?)
      • Keep track of degrees of success and provide feedback as soon as feasible to analysts and teams about their success. Separate this process from salary and similar decisions to encourage candor.
      • Peer review can address analytical rigour, tradecraft, tech content, advice provision & be scaled as needed. It builds a more robust & positively constructive culture & effects are sustainable.
      • Reviewing papers against what actually happened. This would need to be carefully done to avoid just rewarding the most conservative- perhaps looking for where bold judgements led to policy action?
      • Peer review will briefly increase production timelines, but these will decrease within 2–3 months as the process is embedded and its value understood. A more collaborative analytic culture also results.
      • Enable more efficient and accurate assessment of the intelligence analytic workforce critical thinking capabilities.
      • Lessons learned, actively looking at where we got things wrong and learning from them.
      • Provide opportunities for calibrating one’s thinking, including making predictions and receiving rapid feedback on their validity.
      • Analysis must be effectively challenged on a routine basis. But agencies must find a means to make challenge constructive, without undermining analysts’ confidence and limiting their imagination.
      • Seek ways to enable a network of ‘inter level’ interactions to overcome deficiencies in the existing hierarchical information flow. Higher-level review doesn’t often improve the analytic product. Rather, it often waters down the analysis to achieve consensus but makes it less precise, timely, and/or relevant to the consumer.
      • Peer review must be implemented as part of the production process for all products & all staff. Training is essential to establish a common approach, expectations etc.
      • All types of intel analysis should have analytical rigour and be challenged for it throughout the process of production.
      • Agencies need to develop a stronger culture of review. There should be a dedicated role within organisations to review previous assessments for accuracy and investigate analytic processes
      • Idea: consider “paired analysis”. Similar to “paired programming”, where one coder drives/explains and another watches/questions, and they swap roles. Slower, but better.
      • Enable more effective and timely collaboration across organizational divisions and stovepipes by reducing the compartmentalization of intelligence.
      • Non-obvious enhancements. Better sharing of products within the intelligence communities!
      • We are not cautious enough about avoiding groupthink – should accelerate the slow process of having reviews with other agencies to involve more informal, regular contacts and discussing hypotheses.
      • Improving collection, including collection in dialogue with analysis so that there is an iterative process in which collection informs analysis which informs further collection, etc.
      • More multi-disciplinary teams. Learning from Operational Research, Statistics etc. Also increased joint working with operational intelligence functions to better understand the business needs.
      • Enhancing collaboration and information sharing. A product should incorporate as many diverse perspectives as possible and should not be an individual endeavour.
      • Mandatory trainings or classes are unlikely to be effective. A more collaborative–rather than competitive–organizational culture would dramatically improve rigour.
      • Failure to collaborate across practice areas; we need to break out of the agency tribes we currently operate in; embrace the diversity of the profession and its practitioners; view it as an enterprise
      • Improved communication between intelligence collection and intelligence analysis. Prioritize collection on indicators with highest diagnostic value.
      • To ensure relevancy, analysts should regularly meet with those who consume their product or be embedded with investigative/operational teams.
      • provide greater opportunities for analysts to verbally brief and engage with senior decision makers to improve client/analyst approaches
      • Forcing different agencies to take part in each other’s peer reviews. This would create some competitiveness and drive up standards.
      • Briefing is visceral. Nothing drives home the importance of the consumer more than briefing the consumer and dealing with the questions and concerns.
      • Find ways to share experiences and lessons learned across the domains, leveraging professional bodies and recognise that no one agency knows it all or can do it all.
      • Analysts should be encouraged to interact with analysts at other organizations through joint-duty assignments, inter-agency working groups, and professional societies.
      • Mixed ad-hoc teams (possibly multi-agency) created for specific assignments can be an effective and easy way to create heterogeneous groups, but this requires a degree of willingness to cooperate.
      • Investigate best practices in other fields with similar challenges: diagnostics in medicine, argument in law, critical thinking in philosophy, info aggregation and distribution in journalism
      • Establish interagency structures like a “National Forecasting Council,” whose core mission is making rigorous predictions (whereas most agencies currently combine forecasting with other priorities).
      • We need to be better organised, coordinated and more collaborative and work as one in the national interest rather than poking along in silos thinking we are better than the agency alongside of us. One goal we are all working towards without agenda!
      • A cost-effective way to increase analytic rigour would be to institutionalize multi-agency collaboration on analytic products.
      • Developing an approach to improving analytic rigour requires that it be tuned to the unique needs of your particular intelligence organization, whilst drawing on the work of others where it makes sense.
      • Learn from other environments. E.g., from crew resource management: integrate the practices airline captains use to create an effective working relationship with their crew.
      • There is also value in gaining many short term experiences in the analytic process outside one’s domain, to force experienced practitioners to come back to the process and increase self awareness.
      • Exposure of intelligence collection and analysis practices to objective, and expert, external scrutiny to identify best practice (to be championed) and poor practice (to be stamped out).
      • Agencies should systematically evaluate the accuracy of their judgments. This requires ensuring that estimative intelligence is written in a manner that is clear enough to be evaluated.
      • Analysis is normally conducted at high security levels and provided at lower levels to a point where it can’t be provided at all. Having defined labels that add confidence to the end use would help
      • Collation – How analysts structure and organise the information they collect is an indicator of how good their analysis will be. The lack of collation is a clear indicator of poor analysis.
      • Encourage reports with clear and well-structured argumentation that makes the justification of assessments evident to readers.
      • non-obvious. Or maybe completely obvious. Question the demand for consistency over time.
      • I conceptualise AR as a dynamic between a professional ethic, process standards, and product standards. The professional ethic concerns how well analytic tradecraft principles – such as objectivity – are being embraced and applied. Process standards concern how well the evidence is laid out and its reliability exposed and how openly and carefully the basis for any conclusions is presented. Product standards concern how well the products meet the needs of the consumer rather than observance of in-house production conventions.
      • Information which has been labeled outdated or deceptive can be automatically traced to products which relied upon that source and ‘warning labels’ applied to portions of products
      • Enable rapid dynamic changes to analytic products during both their production and post-publication based on the emergence of new information and/or errors detected in critical thinking or process.
      • For products that have the structure of ‘context plus vetted update’, the ‘vetted update’ portion could be pulled out and plotted on timelines to create a timeline of events to review.
      • Improving understanding of analytical confidence so analysts can make some assessment of the evidence base, analytical robustness and sensitivity to new information that their assessment is based on.
      • Explicit focus on clear communication of falsifiable probabilistic judgements. Comprehensive and objective evaluation of judgments for accuracy will provide vital feedback to help analysts improve.
      • A constant emphasis on the importance of separating and scrutinizing assumptions, data and conclusions should be present at all levels.
      • Breaking away from the “wall of text” format typically used in the presentation of analytic products. Explore more engaging ways to present conclusions and supporting data. *See note in comments.*
      • *Note: I completely realize that changing the presentation of results doesn’t necessarily do anything to impact the actual analytic rigor. However, more effective communication could be huge for demonstrating the rigorous processes that are already occurring– thus “increasing” analytic rigor in the eyes of the beholder.
      • “Keep score. This is cheap to implement and pays huge dividends. It may be uncomfortable. Deal.”
      • All finished analytic products should include an ‘intelligence base’ that explains broadly the source reporting, and key intelligence gaps
      • Key gaps include: institutionalising contestability of analysis; greater emphasis on visual representation of analysis (facts, findings, forecasts); and a robust intelligence body of knowledge.
      • Primacy of evaluation should go to whether the assessment is correct or not. To be rigorous but wrong is still not helpful.
      • Great deal of variation in cognitive skills of individuals conducting analysis. How rate? Many different inputs all along contributor chain. Accurate Quality rating needed, individually & aggregate.
      • Extension to point 3 – training, use of SATs, positive reinforcement etc during routine working should raise an individual’s internal baseline so that when under time pressure they are more likely to still be rigorous without having to use set processes and tools. [Sorry, ran out of time to try and condense this down into the character count and didn’t want to lose one of the other points!]
      • We know a good deal about cognitive biases like confirmation bias and the representativeness heuristic. SATs and other procedures should be designed so as to help control for them.
      • Easy to apply ‘principles’ – including reminders like desktop stickers.
      • ‘Analyst Tradecraft’ must be recast from data-processing pipelines to decision making based on reasoning about the adversary.
      • Thorough, consistent, reliable, accurate analysis w ability to aggregate across many people, & make all clear & explicit; this demands structured automated reasoning tool with all elements transparent
      • Building an internal contestability function, including a designated devil’s advocate, trained methodologist etc to challenge assessments and advise on analytic approaches.
      • Gamification, while hard to employ in this particular context, has been shown to lead to improving results – as long as there’s no incentive to cheat.
      • Capturing actual analysis requires new tools and methods, and a fresh approach to understanding cognitive tasks.
      • Having different methods that could be applied to allow for different time scales of delivering analysis
      • Ensuring that continuous assessment of threat areas involves analytical techniques, and is almost a daily process. This would also assist our early warning remit.
      • Bring more science to the art. Structured automated tools don’t have to have “the answers”, but can provide analysts with a platform for broader, clearer visibility and more effective reasoning.
      • Structured writing formats. This supports the explicit demonstration of chains of logic from initial information through to an argued final conclusion, while accounting for alternative hypotheses.
      • Embedding the use of structured analytical techniques via training and tradecraft support. All analysts should have access to this function, but should also be empowered to make their own decisions.
      • Analytical rigor can be enhanced through improved methodological approaches, i.e. link morphological analysis with an improved ACH (ACH-SL).
      • Make sure that “red team” and “devil’s advocate” groups are staffed by capable people, are internally respected, and have senior management backing. Make employment on them career enhancing.
      • Re-emphasize warning and deception staffs. They bring creativity and alternative views, which often challenge conventional thinkers, enhancing rigor.
      • Antagonistic methods (i.e. Devil’s Advocate, et al.) are effective tools for improving rigour, but also require a framework for managing conflict and ultimate decisions re. the final product.
      • Methods to help analysts find alternative voices to the corporate analytic line. If reduced to a tool, something like an opposite recommender algorithm.
      • Developing a culture that respects considered thought. Slowing down the pace to enable reconsideration of key assumptions etc.
      • Because intuition comes from deep experience, leaders should find ways to give intelligence professionals many practical, educational, and theoretical experiences in the relevant domain.
      • Despite its clear importance, getting analyst buy-in to improving analytic rigour will be a challenge; it will be seen as an additional overhead
      • Supporting staff wellbeing and ensuring they are well and their needs addressed such that they perform well, including flexible working arrangements, good workplace culture, professional support etc.
      • Ensure robust organizational alignment on goals of increasing rigor and how to achieve it (i.e., will analysts be supported in taking more time to implement new processes for rigorous analysis?).
      • Low hanging fruit: make sure analytic organizations have libraries with real books. Have managers encourage analysts to read the books.
      • A support system whereby individuals can seek advice and support should they have any ethical concerns, such as being subject to external – or internal – pressure to skew findings.
      • Ensuring that analytical rigour is front and centre of all worked produced.
      • develop curious and investigative mindsets through greater connection with placements in intel collection capabilities and fieldwork
      • Incentives to analysts for engaging in activities that promote rigour are needed. People will only do what they are going to get promoted for doing.
      • Building an effective challenge culture which enables, encourages and rewards the ability to challenge the status quo; this should take a top down approach, focusing on transparency and openness.
      • Culture that encourages AR.
      • The organization should institute an award for “constructive dissent” like the US Department of State.
      • Give every analyst a passport and a travel allowance, and send them for a minimum of 12 months to the countries / regions they will be working on. The cost is negligible relative to the return.
      • Take positive steps to move intelligence practice from an industry or discipline into a profession, recognising that the transition into a profession will be uncomfortable for some but is desperately needed.
      • Billions are spent on collections, while analytic offices are understaffed.
      • Enhancing analytic rigour will be very hard to achieve unless it has the strong commitment of Senior Executives within an organization, the support of middle managers in setting the conditions for effective change, and the buy-in of staff in helping to design and implement particular interventions.
      • The inculcation of a culture in which everyone considers themselves an analyst. This should prevent senior managers foisting responsibility for analytical rigour onto junior staff alone.
      • Senior (eg. SES level) staff rarely have any depth of understanding of intell work & almost never seek input from their own staff about what improvements might be made. When misguided requirements are unilaterally installed from above, it impacts on morale, which is often the distinguishing factor between high & low performance intell teams. This goes to the question of robust analysis, productivity, staff turnover, and just about everything else in play.
      • Achieving substantial improvements in analytic rigour will require a corporate and systemic approach that recognizes the ways that people, technology and organization interact to generate capability.
      • Provide examples from the top down: “Here’s an example where I (the boss) got it wrong. Here’s what I did about it, here’s how I integrate this learning to avoid the same mistake in the future.”
      • Poor training and education of both the analyst and the leader providing the tasking exposes gaps in consistent and consolidated analyst and intelligence leadership training; there is huge opportunity.
      • increase training in non-verbal forms of communicating analytic products, eg graphic design, where complex concepts are communicated through graphics.
      • Even though imperfect, explanation clarifies thought. Teaching different methods gives analysts a framework for speaking. Discussing the reasons why for an assessment is critical for self-awareness.
      • Especially for tactical military training, exercises must always move past planning into presentation of the execution by the intelligence target, to allow analysts to see how well they anticipated.
      • Provide training online and unclassified. There is nothing inherently classified about rigorous thinking.
      • Training in Psychology. How content is expressed matters. Humans tend to be visual. Let’s do more production that way.
      • Train analysis supervision – not just analysis
      • I’m assuming your core skills are covered. If not, ensure first that you have top domain expertise before focusing on improving the critical thinking. Try for a second opinion. Dunning-Kruger is a constant threat.
      • Before we ask analysts to make the leap to higher-level thinking, we need to ensure that they possess fundamental thinking skills (e.g. the ability to ask good questions).
      • Ensuring analysts are trained in essential skills – communication, analytic methods, critical thinking, data literate, digitally literate (tools) etc
      • Train analysts in basic principles for assessing uncertainty, such as developing base rates, debiasing, and judgmental calibration. Every analyst should be literate in Tetlock-style analysis.
      • Current efforts to train analysts in python etc. to be ‘de facto data scientists’ distract from the focus on the essential critical thinking skills needed to achieve ‘right sized rigor’.
      • Introduce high-quality, evidence-based, advanced training in key aspects of analytical rigour.
      • In addition to building training on rigor into the on-boarding process for new hires, periodic in-service training should be made available using modern methods, webinars, virtual workshops…
      • Teaching staff higher levels of critical thinking skills and requiring that they achieve appropriate (high) standards.
      • Having analysts trained in the theoretical approaches from non-military universities.
      • Cost-effective ways would be incentivizing self-directed learning and creation of a professional ethic.
      • Adding a scientific structure that underpins analysis where it recommends qualitative and quantitative approaches to enhance ‘gut feel’ and ‘instinct’
      • Higher-level thinking skills are often taught using outdated theories (e.g. by Heuer and Kahneman). Curricula must reflect the state-of-the-art in cognitive science.
      • As a key indicator: why are analysts taught Heuer’s ACH Structured Analytic Tradecraft (SAT) yet universally report “no analyst actually does ACH on mission, it’s impractical in the real world”?
      • Design training and workplace practices based on up to date research and collaborate with other institutions and fields to improve.
      • Foster continuous learning and improvement for intelligence analysts and the organization as a whole by requiring analysts complete X hours of continuous education each year.
      • Focus training on different kinds of problems rather than on structured analytic techniques. Use industry continuous-improvement methods for evaluation rather than single-causality-factor design.
      • Improved analytical education and training. Intelligence analysts need to combine subject matter expertise with a shared methodological framework in order to avoid misunderstandings.
      • Agencies could underwrite advanced degree work by analysts.
      • Provide coaching and other sharing mechanisms to help disseminate best practices from top practitioners.
      • Specific and widely available training courses – which take place in all regions of the UK.
      • Training analysts & managers in exactly how to undertake peer review (specifically for an intell context) is easily the best way forward. But very rarely done as it should be.
      • Groups need training in communication and providing effective feedback. This does not need to be a VC “self criticism” session, but the objective is to enable members to increase trust/truthfulness.
      • Single source analysts need rotations in all source analysis agencies, and all source analysts need rotations in single source environments. Learning the strengths, weaknesses, and tradecraft of each other.
      • Exercises in self-reflection at both the individual and group level can add an element of self-awareness of the impact of biases and group dynamics in both the individuals and the group.
      • Improved transparency across scales (i.e., how does my analysis fit into some bigger picture?) and within scales (i.e., how does my analysis complement or contradict their analysis?).
      • Implement active de-biasing techniques, including working in groups / having people check one another’s work.
      • little is available to help with conflict & corroboration, broadening, hypothesis exploration, stance analysis & overcoming tendencies to misread intent of others
      • support for broadening hypothesis exploration is missing from most tools for analysts; see Grayson 2020 though different analytic setting.
      • Teach analysts to think taxonomically. The more structure they can put into list building, the more rigour they can put into their analysis.
      • Training in Informal Logic. Analysts don’t know the lexicon or the content. Philosophy has a lot to give here.
      • Training on Fallacy. Once we accept that analysis is argument, we must understand how to recognize and mitigate attacks on that argument.
      • Quality of personnel training (big gap) can be vastly improved with new automated cognitive development tools, which also provide cognitive skills assessment capabilities on an ongoing basis.
      • Train analysts on the specifics of good reasoning, e.g. how to reason abductively effectively, not on things that don’t contribute to the AR of written argumentation. (e.g. SATs and biases).
      • Training related to self-awareness and biases
      • More sophisticated training on critical thinking and the scientific method, and how to apply in real-world operations.
      • Training specifically for logic. In my organisation I see good and experienced frontline practitioners come to intel but not do well because they have no specific training in critical thinking.
      • Analyst training and tradecraft practices that reduce the impact of cognitive bias on assessments would have a positive impact.
      • Train consumers to better understand the nature of intelligence.
      • Consumers often do not really understand intelligence analysis, particularly as it moves further beyond nat sec and defence. Customer education to help improve tasking and reception of products.
      • Statistical, data science and probabilities training – even if it can only go as far as knowing one’s limits to avoid the Dunning-Kruger effect.
      • Analysts should receive training to improve their statistical literacy/probabilistic reasoning skills.
      • Training in Probability. There is not enough of it, and this business runs on probabilistic assessments. Probability can amplify and enhance.
      • Low hanging fruit: literacy around data analytics and how the concepts of risk and probability shift with different kinds of data. Train the best to really understand this.
      • Train analysts on how to discuss and write about the epistemology of intelligence, probabilistic reasoning, and argument structure. Being able to discuss these issues with peers is key.
      • Improve AR by better opportunities to move across the national intelligence enterprise based on enterprise-wide agreed accreditations and certifications, which break down domain boundaries.
      • Incorporate a “rotational” tour of at least two years in length in a methodology or tradecraft cell as part of the professional development of any IC analyst
      • Offer workshops showcasing analytic rigor in other walks of life, private sector, industry, academia.
      • Jointly create and teach a graduate certificate program in “Analytic Rigour” using a mix of IC and academic faculty
      • Establish a “center” or “institute” manned jointly by academics, IC professionals, and others focused on methods, tools, and technologies that can increase transparency and insights
      • In today’s post-COVID world, much can be done with online practice through cases and practicums. What are the principles, what are the thinking techniques, how are they used in realworld cases?
      • Make an effort to key aspects of rigor — e.g. analysis of alternatives — to real-world developments. Use real intelligence products as training aids where possible.
      • Design multiple case studies with embedded analytic shortcomings and incorporate into the existing IC and ODNI schoolhouse analytic training programs
      • There needs to be greater awareness – perhaps through training – of how a lack of rigour has underpinned key intelligence failures in the past. New analysts are not students of intelligence history.
      • Perhaps it is a truism to say that we must learn from past mistakes, but there is value in that idea. However, to learn from the past one must first know that past.
      • Scenario-based exercises that strengthen analysts’ skills in applying classroom learning about analysis principles
      • adopt scenario based and immersive training where student analysts in syndicates explore complexity and uncertainty through real world scenarios
      • Increasing diversity within analytical teams and organisations in general. Especially given the interpretative difficulties of culturally laden behaviours.
      • Manage team construction with actual and reliable psychological testing rather than instruments like Myers-Briggs or other pop psychology tests with no measures.
      • Analysts with specialized expertise could be given the opportunity to join a team making a product without being explicitly asked, supported by maps of expertise in the organization.
      • Multi-discipline analysis will benefit from having methodological experts available to enhance analytical production.
      • Facilitating non-hierarchical group processes.
      • Intentional efforts to increase diversity among analyst teams
      • use multi-attribute rigor metric to visualize/monitor trends in individual/unit work; continuing evaluation, feedback & tuning to balance sufficient rigor relative to resource/time constraints
      • Hold analysts/supervisors to account – if predictive assessments differ from reality, determine if this was because insufficient rigour was applied and address it (in a nice, constructive way!)
      • PHIA audits of different agencies to check they meet its standards – this could operate in a similar fashion to Ofsted with short-notice visits to prevent organisations hiding problems.
      • The organization should hire an analytic methodologist to review work products, facilitate the use of structured analytic techniques, and individually coach analysts.
      • Gaps: interviews/analysis of rank-and-file intelligence officers
      • Steps should be taken to dismantle the IC’s culture of blame avoidance. Accountability pressures give rise to murky operating procedures that undermine the IC’s decision support function.
      • Intelligence leaders and Commanders need insights into analytic process (i.e. measures of analytic rigor) to interpret trustworthiness of intelligence assessments.
      • Quality of analysis should be incentivised above quantity and presentation, with informal and formal recognition and reward for accuracy and impact. Difficult with gov wide systems.
      • Explicitly establish mechanisms to reward and positively reinforce analytical rigour, such as financial inducements, awards, honours, promotions, development opportunities and desirable postings.
      • A greater focus on quality than quantity (e.g., a KPI for many analysis units is volume of reports)
      • A key non-obvious way would be the development and wide application of key performance indicators (KPI) that reward collaboration, novel approaches, and client satisfaction over production volume.
      • Reward thoroughness and explanation
      • Analysts should be supervised and managed by senior analysts who can support and critique them, not Type-A intelligence officers or investigators.
      • Think more carefully about the managers you’re installing to lead intelligence teams. They should have strong people skills, among other things. Staff engagement is critical to results.
      • Seniors/supervisors need to be more aware of the importance of analytic rigour, and recognise and protect it better. Lack of skills/experience, time etc mean many cannot support junior staff.
      • Intelligence managers need to take a greater interest in, and responsibility for, the analytic processes of their staff. Many don’t know how to probe rigour
      • Selection criteria for analysts should include skills in understanding and executing research methods, as rigor should have an element of methods used to evaluate research.
      • When doing psychological testing of applicants, test for possession of curiosity, creativity, and “exceptional thinking traits,” not just for sanity and personality types.
      • Instead of lowering standards to attract as many students as possible, academic programs that train future intelligence professionals should raise standards to only select the most capable individuals
      • Introduce a test/assessment element to recruitment campaigns which exposes the level to which an individual is naturally rigorous. Less feasible given staff shortages and government restrictions.
      • Investing heavily in recruiting subject matter and analytical expertise, as well as in modern training and education programmes to develop new and existing staff.
      • Classic interventions – install AI/ML, train harder, recruit smarter talent, enforce procedures – fail to address the underlying cognitive brittleness placing rigor at risk.
      • Hire people with genuine expertise. The term “generalist expertise” is oxymoronic and unhelpful.
      • At IC and agency level, better workforce planning including more diverse recruitment and retaining subject matter experts.
      • Diversifying recruitment in terms of gender, cultural background, prior experience and study, types of prior experience.
      • What are the observable measures or indicators of success for you? A grounded assessment approach should be established in advance of implementation.
      • Establish tradecraft standards that promote clear communication of uncertainty with respect to key judgments, whether through numeric percentages or clearly-defined verbal terms.
      • Implement measures to augment natural ability (training/guides, tools, checklists etc), without overburdening and possibly disguised to reduce resistance by people who think they are already rigorous
      • “Standards” and checklists have their uses. Checklists help combat units function and not miss essentials when they are working in a time-constrained environment, with little to no sleep, high stress, and danger. Sometimes the checklist is literally substituting for thought because the user’s cognitive function is too degraded by external factors to make effective decisions. Bureaucracies should implement standards, yet leaders must understand that standards/checklists can become a least-common-denominator substitute for effective thinking even when effective thinking is possible. Too many standards can dilute focus on the essentials: creating self-aware individuals who have deep experience of the domain and maintain relationships that maximize trust and truthfulness, so that analysis can be presented and decisions made. People accept that computers using ML must undergo “training” with thousands of iterations on large data sets to become useful, yet do not commit to an equivalent amount of intelligence training.
      • To begin a discussion about rigour (and get early buy-in on changes needed) ask analysts to modify the CIA-DI matrix on effective analysis to fit their reality.
      • Analyst training and tradecraft practices that reduce the impact of cognitive bias on assessments would have a positive impact.
      • Ensure that tradecraft standards reflect best practice rather than aspirational goals. Distinguish process standards from product standards, and focus efforts to improve rigor on process.
      • The sustained development of IC wide standards on professionalisation that reviewed analytical standards, professional development and accreditation.
      • Common standards also help – or at the very least – common appreciation of what is meant by rigour.
      • Joint international development of tradecraft practices. We all have similar tradecraft issues, and in 5 eyes broadly similar intelligence traditions. Share resources to address common issues.
      • Clear and accessible guidelines/protocols. Consistent regardless of role.
      • Collect and analyze exemplars of analytic rigour as well as those assessments with significant shortcomings
      • A standard definition of what analytic rigour and intelligence analysis are in different domains and in the tactical, operational and strategic environments
      • To my mind, no Australian analytic intelligence agency has declared analytic standards; these should be developed (using Five Eyes ones as a base) to ensure a common understanding of rigour
      • Lack of standards agreed across operating domains holds us back but is a great opportunity for us to align analysis across the intelligence profession; significant opportunity exists for international alignment
      • Cost-effective ways: something like PHIA, the Professional Head of Intelligence Assessment, which tries to make analysis standardised throughout the UK community.
      • Consistent standards for all assurers which are effectively taught and independently policed to ensure they are correctly applied.
      • “Resist the siren song of standards. Good but insufficient. Pilots believe in checklists *properly used*. It’s their skin, so they tend to be proper. Guard against box-ticking.”
      • Analytical bodies must be crystal clear with their analysts, and with users of analysis, what their expectations are of rigorous analysis.
      • Introduce a more rigorous approach to the evaluation of analytic products for analytic rigour.
      • Use a clear and effective method of evaluating analytic rigour.
      • Introduce practices, or transform existing practices, in line with the best available evidence as to how they affect analytic rigour.
      • Analysts could describe what they would have done to increase rigor if they had had one more day/week to do more analysis, and customers could agree to extend deadlines to support this
      • Provide longer (any?) time for analysts to explain their reasoning
      • Stepping away from current ‘pen-and-paper’ approaches to software-supported workflows (with dedicated software tools, not just Microsoft Word as is now often the case).
      • Analysts’ tools need to change from data-centric frames of reference to explicit ‘analysis object’ based environments.
      • Leverage low-cost technology to empower the systematic application of critical thinking by intelligence analysts in their work activities. This will ensure critical thinking transfers to their work.
      • Effective exploitation of new analytic tools and techniques almost invariably requires re-imagining how practice is undertaken to realise the full benefits of the investment.
      • Use simple Word / Excel tables for routine analytic tasks. You do not need sophisticated software to do good analysis.
      • Any efforts to exploit emerging commercial analytic tools and techniques should inherently consider how to integrate them within a particular practice context
      • This systemic or capability view points to the need for interventions at individual, team and organizational levels, with technology almost invariably playing a role at each layer.
      • Providing a representation of the information reviewed, highlighted, included, and not included during the analytic process from an automated data capture could increase transparency
      • Tools to document process (i.e. audit trails)
      • Tools to afford analysis supervision
      • Software that helps analysts to actually use structured techniques (by either explicitly mandating it, or by implicitly nudging the analysts to work in a way that adheres to the procedures).
      • Bring computer tools into the critical thinking process, enabling total visibility of all aspects of facts, analysis, reasoning, assumptions, etc, enabling more effective collaboration and critique.
      • Collaborative tools to enable cooperative approaches to analysis, including supporting cross functional teams.
      • Crowd sourced reasoning 🙂
      • Aggregate: where pairing, Delphi, or discussion are infeasible, consider mechanical aggregation where appropriate: e.g. surveys or markets for forecasts.
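Mechanical aggregation of the kind suggested here can be as simple as pooling independent probability forecasts with a mean or median; a minimal sketch with illustrative numbers:

```python
from statistics import mean, median

# Five independent probability forecasts for the same event,
# e.g. collected by survey (numbers are illustrative).
forecasts = [0.20, 0.35, 0.40, 0.45, 0.70]

# The median is robust to a single extreme forecast;
# the mean weights every forecaster equally.
print(f"median: {median(forecasts):.2f}")  # 0.40
print(f"mean:   {mean(forecasts):.2f}")    # 0.42
```

Prediction markets and Delphi rounds are more elaborate aggregators, but even simple pooling like this often outperforms most individual forecasters.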
      • Automated tools for analysis provide A valuable data source (not THE answer) but do not support an integrated analytic process in terms of sensemaking under uncertainty, noise and limited resources
      • We have shown that we can automatically monitor analytic activity in the background to extract indicators of rigor and tailor feedback to address detected weaknesses
      • I’m concerned about analytic rigor in security decisions ranging from personnel vetting to technical threat and risk assessment. In this case, many are resistant to using even fundamental analytic concepts such as base rates. Machines can help with some of the rigor (math), but humans need to learn to use the insights without being cowed by them. Confidence but not overconfidence.
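The base-rate point can be made concrete with Bayes’ rule. The numbers below are hypothetical: with a rare condition (1 in 1,000 vetting subjects), even a screen that is individually quite accurate yields mostly false alarms:

```python
def posterior(base_rate, hit_rate, false_alarm_rate):
    """P(threat | flagged) via Bayes' rule."""
    true_pos = hit_rate * base_rate
    false_pos = false_alarm_rate * (1 - base_rate)
    return true_pos / (true_pos + false_pos)

# Hypothetical vetting scenario: 1-in-1000 base rate, a screen that
# flags 99% of true threats but also 5% of innocent subjects.
p = posterior(0.001, 0.99, 0.05)
print(f"P(threat | flagged) = {p:.3f}")  # ~0.019, far below the 0.99 hit rate
```

This is the kind of “machine does the math” support the comment describes; the analyst still has to supply defensible base rates.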
      • Automated tools can help analysts discover and make use of prior / historical but related insights and patterns.
      • Automated tools can facilitate collaboration across stove pipes and throughout the vertical hierarchy, enabling more fluid and timely sharing and discovery.
      • Automation of some analytic functions, such as sense making. This will enable the incorporation of much larger volumes of information in the analytic process.
      • Automated mining of relevant materials in previous intelligence reports within the organisation (or beyond).
      • The use (& further development) of software tools to assist with the collection, collation & structured output of information/intelligence. Sloppy buckets of data/information impede holistic analysis
      • install AI/ML
      • Better tools that are more closely aligned to the work of the joint cognitive system (i.e., humans, “machines” working harmoniously against some goals in some context)
      • Data analytics and leveraging ‘big data’ are often talked about as if they’re the future of analysis. I don’t think data analytics alone will improve analytic rigour, though. Unless there is a conceptual framework that supports its use, it will lower rigour but provide false confidence.
      • Big gap in analytic method: how to incorporate results of artificial intelligence (that does not explain how it gets its answers) into final products & how to explain this to customers
      • Outsource research projects about process to academia. Untapped potential in collaborations between universities and government organizations.
      • Enhance analysts’ opportunities for meeting, discussing issues with outside experts. Security to often is an unhelpful barrier to such activities.
      • Linking academic research to practice. Practitioners rarely have access or the time to keep up to date with relevant academic research.
      • IC analysts need greater exposure to external researchers when writing their products. CSIS in Canada has a good academic outreach program; currently Australia doesn’t.
      • Investigate limitations of scientific method and adaptations of social science methodology as embedded in SATs and other principles of tradecraft. Adapt processes to those limitations.
      • Trusted academics could become visiting IC analytical and research staff and mentor to junior analysts.
      • The IC engaging more with researchers on analytical practice problems not just STEM but also SBS research areas.
      • Embrace science.
      • An effective organization will continuously and rigorously study itself and what makes for its successes and failures. Organizations have an advantage in that they can if they wish conduct rigorous internally-focused experiments. Self study, however done, is usually best done through researchers intimately familiar with the organization and its task but not part of it and not subject to any kinds of discipline through it, though if an organizational office has substantial guaranteed structural independence, this can work as well. I have observed in government that when studies like this happen the organization studied is often more concerned with the rating it receives and whether its activities are perceived a success than with the quality of the research done or even its possible utility in improving organizational processes. The organization doing the study is often only too happy to oblige because its next contract depends on the studied organization’s being pleased with its earlier results. Steps must be taken to counter these drawbacks.
      • Make the study of analysis more rigorous and demanding. We have to raise the bar academically. Failing to do so makes us complicit in the intelligence failures that will follow (there will be many).
      • Incentivize innovation around AR through enterprise-wide idea management schemes and ‘hack-a-thons’.
      • Regular engagement with organisations developing new methods (e.g. IARPA) to ensure the methods used to ensure rigour are continually tested in the most enjoyable and interesting way possible.
      • Reconsider how the organisation works, conducts analysis and trains and redesign to align with best available research, including scheduling future opportunities for rethinking and updating.
      • Until we accurately define rigor and determine ways to evaluate it – in progress and at completion of a product – we cannot honestly say what would definitely improve rigor in intel analysis
      • Studying the line between subject matter expertise and analytical talent, to identify good analysis as opposed to knowing what the opposition are likely to do from experience.
      • More research is needed to establish whether and, if yes, then how the implementation of analytic standards affects experts’ ability to assess complex issues holistically and intuitively.
      • Test any interventions on ecologically valid tasks – such as realistic intelligence problems.
      • Divide the problems analysts address into taxonomies & focus on how different problems are addressed by different reasoning by different analysts with different perspectives (or worldviews)
      • Explore prediction markets and other large group techniques for assessing information. Find ways to make them interesting and incentivize their use.
      • Look at actuarial studies – how do they work with probabilities? How do they estimate the likelihood of events that have never occurred before?
      • Draw lessons from qualitative research practices on how to demonstrate reliability and validity in the meaning-making process when considering complex data.
      • Incorporate new research and methods on wicked problems and so-called “social messes.”
      • Practices within intelligence organisations which seem to promote analytic rigour might not do so (e.g. SATs). Practices need to be rigourously evaluated.
      • Research into whether SATs actually improve analytic outcomes. If not, we need to find another approach.
      • Do research on tradecraft best practices. Evaluate SATs. Publish results. Use results to inform practice and embed into training program (i.e. learning organization)
      • Specific components of rigour (e.g., analytic techniques) can and should be empirically tested. Analysts shouldn’t be required to use techniques just because they sound good or make intuitive sense.
      • More research is needed to establish whether the implementation of analytic standards increases the quality of analysis. More research is needed to identify the factors that drive analytic successes.
      • Conduct large-scale experiments on which structured analytic techniques work better than others. At present, we have too little information about the empirical efficacy of various analytic methods.
      • Investigate limitations of scientific method and adaptations of social science methodology as embedded in SATs and other principles of tradecraft. Adapt processes to those limitations.
      • Develop and rigorously assess for usability and IC receptiveness more effective SATs. In doing so recognize that one size may not fit all; empirically examine SAT and individual analyst fits.
      • A new hyper-challenge is disinformation campaigns, both intentional and emergent: recognizing disinformation, early vs late recognition, modeling the campaign, weak/slow/stale interdiction, trade-offs
      • We tend to focus so much on the INT that we miss the big picture.
      • Start doing research on info-war and cyber-war domains
      • What kind of impact(s) are you prepared to accept in order to improve? (eg. cost, speed, disruption, training time etc.)
      • I hope to be more helpful next week.
      • I’d argue low-probability, high-impact scenarios or wildcard scenarios. Very unlikely to happen until Brexit or Trump came into power!
      • As above, given the extensive experience of others on the panel on related subjects and my relative lack thereof, I’m sure that any contribution I could make to this phase of the study would rate as obvious.
      • If methods are to complement each other, they must rest on compatible assumptions. If, on the other hand, they are to provide contesting views, assumptions can be divergent.
      • Forget the term rigor. I worked at NASA as well as CIA and now LAS, and analytic problems aren’t engineering rigor problems. It’s like calling an electrician to perform heart surgery.
      • Going for low hanging fruit seems like a very bad idea. This is a complex, holistic construct and attempting to solve it in a cost effective way could possibly have an opposite effect.
      • findingQED can bring more science to the art of intelligence analysis: better training, effective personnel assessment, and more consistent, thorough, accurate and reliable operations. Look forward to exploring a bit with you.
      • Starting from the recognition that 2020 is not 1960 would be good. The foundational paradigm of intelligence isn’t sacred, but exists in context. What does the 21st century need? Get ready to rethink
      • Acknowledge the “dialogue of the deaf” between analysts trained in logic and reasoning and seniors who learned experientially; balanced ‘on the fence’ products do not “move seniors to a decision”.
      • Avoid the temptation or allure of single point solutions that promise dramatic enhancements to analytic rigour. This training approach, that advanced analytic tool etc.
      • COMMENT: Experimental-type approaches have much to offer; specifically, committing to what information should be found if a theory is correct, and/or conducting interventions designed to reveal whether a theory is correct. E.g. in the WMD miscall, possible locations where WMDs might be found were specified and inspected, with no WMD trace. Each time this occurred, the estimated likelihood that Saddam had WMDs should have been reduced. Instead, confirmation bias seemed to have led some not to realize that absence of evidence which could be expected to exist if a theory were true (Saddam has WMDs) was in some measure evidence of absence. How strong that evidence was is a different question and depends on how certain it is that the evidence would have been found if Saddam had WMDs.
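The updating logic this comment describes can be sketched with Bayes’ rule: repeated empty searches, each with some chance of finding evidence were the theory true, should steadily lower the estimated likelihood. All numbers here are illustrative, not estimates from the actual case:

```python
def update_on_absence(prior, p_find_if_true, n_searches):
    """Posterior P(theory true) after n independent searches that each
    had probability p_find_if_true of finding evidence were the theory
    true, and all of which came up empty."""
    miss = (1 - p_find_if_true) ** n_searches
    return prior * miss / (prior * miss + (1 - prior))

# Start at 80% belief; assume each inspection has a 30% chance of
# turning up evidence if the theory is true. Belief falls with each
# empty search.
for n in (1, 3, 5):
    print(n, round(update_on_absence(0.8, 0.3, n), 3))
```

How fast belief falls depends entirely on `p_find_if_true`, which is the comment’s closing point: the strength of the evidence of absence depends on how likely the evidence was to be found.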