Most people do some form of analysis on a regular basis throughout their lives – when they buy a house, a car, or go on a holiday, for example. Every business also does some form of analysis, whether it is looking at productivity improvements, benchmarking, cost savings, or using the balanced scorecard.

However, change is passing by every business that focuses internally and on information derived from internal data. Strategic and industry risks to any organization come from the external environment. Despite this, many organizations and their decision-makers today are investing vast resources in attempts to solve their (external) market and industry challenges via enhanced information technology (IT) and information systems capabilities. Management guru Peter Drucker (1997) has noted that managers have come to rely too heavily on computerization and systems, and that this reliance fails the manager when the necessary data cannot be gathered in the first place. He further notes that a large number of executives spend all their time with data that is both internal and incomplete. Drucker concludes that the information executives need most is about the “outside world” and that the most important decisions should be based on data gathered about what is going on externally to, rather than inside, the company.

With this prevalent internal focus, it is unfortunate that so few executives receive the right intelligence to enhance their decision-making and to assist them with managing industry and market risk, the primary focus of competitive intelligence (Hammonds, 2001). No matter how many Customer Relationship Management (CRM), Knowledge Management (KM), or Business Intelligence (BI) computer systems an organization implements and pays for, they are not going to dramatically improve its competitiveness. Companies need to focus on the external aspects of their environment if they are to succeed today and in the future. Customer learning is clearly important, but equally important is competitor learning that comes through competitor analysis (Fahey, 1998). Good strategies come from making effective choices (Porter, 1996) about what both the external environment and the internal organization can tell the decision-maker.

Numerous strategy scholars have noted that organizations have to constantly reinvent and reposition themselves in order to stay ahead of the competition, and in many instances they have to do this just to stay in the game (D’Aveni, 1994; Hamel, 1996; Hamel & Prahalad, 1994). Of the 500 companies making up the S&P 500 in 1957, only 74 remained on the list in 1997. Of the original Dow Dozen of 1896, only one remains – General Electric. All the others have fallen aside, been absorbed, or been unable to compete.

Competitive intelligence (CI) is defined for the purposes of this chapter as a systematic process for gathering and analyzing information to derive insights about the competitive environment and business trends in order to further the organization’s business goals. It is about managing the opportunities and risks in the competitive battle and delivering to decision-makers the capacity to act. The opportunities and risks today are many, and include the increasing pace of business, information overload, increasing global competition from new competitors, more aggressive existing competition, massive political effects, and rapid technological change, among other things.

Every important business decision entails opportunity or risk. So how are strategies formulated and how do firms ensure that the chosen strategy is the right one? The answer – it is only through the careful collection, examination, and evaluation of the facts that appropriate strategic alternatives can be weighed in light of organizational resources and requirements.

Every good manager recognizes the need for systematic analysis of his or her competitors and the external environment. Analysis has been described as an obvious weak link in many public and private intelligence programs (Werther, 2000). Compounding the matter, as Michael Porter noted in a recent article, is the fact that so few managers actually receive analyzed information for their decision-making or even have a strategy (Hammonds, 2001). Why?

Called by one expert (Herring, 1998) the “brain” of a modern CI system, analysis is one of the more difficult roles that a CI specialist is called upon to perform and that a manager is called upon to oversee. The brain requires a good flow of oxygenated blood, or, in the case of CI, an accurate and reliable flow of data. Like a muscle, the brain also requires constant exercise to be fully effective. This exercise comes in the form of deep and regular thinking, which produces enhanced learning. What does this all mean for the job of an analyst?

The job of an intelligence analyst is to protect and enhance his or her company’s competitive market interests by providing useful and high-quality analysis to decision-makers, policymakers, and resource allocators (otherwise known as their “clients”). Analysis is given to these clients in the form of analysis process outputs such as assessments, briefs, bulletins, charts, conclusions, estimates, forecasts, issue reports, maps, premonitory reports, profiles, recommendations, and/or warnings. These analytical products are the most tangible manifestations of the outcomes of the analytical process.

So what is analysis? Analysis involves a variety of scientific and non-scientific techniques to create insights or inferences from data or information. For the purposes of this chapter, the working definition given previously suggests that analysis is the multifaceted combination of processes by which collected information is systematically interpreted to create intelligence findings and recommendations for action. Analysis answers the critical “so what?” question about the information gathered and brings insight to bear directly on the decision-maker’s needs, helping the client to make enlightened decisions. It is therefore both a process and a product (Fleisher, 2001).

What purposes does analysis serve? In his influential 1980 book Competitive Strategy, Michael Porter asserted the need for sophisticated competitor analysis in organizations, and subsequently the need for an organized and systematized mechanism – some sort of competitor intelligence system – to make the process efficient (Porter, 1980). Most managers in today’s competitive environments implicitly or explicitly recognize the need for more systematic analysis of their competitors, competition, and competitive landscape. However, recognizing that there is a need for the capability and putting into place the systems, structures, and skills needed to exploit the capability are very different things. Numerous researchers through the years have identified enduring gaps between what is viewed as being needed for decision-making in organizations (i.e., expectations) and what is actually being delivered by organizational competitor analysis systems (i.e., performance) (Ghoshal & Westney, 1991).

Langley (1995) notes that the analysis process serves intermediate decision-making purposes such as reducing the number of input variables, providing more time for decision-making as opposed to facts absorption, providing connections among seemingly unrelated data and information, providing context by relating information to organizational mission, objectives, and strategy, and creating a “working hypothesis” by making a story out of disparate business environment information.

Analysis usually takes place at multiple levels within an organization. Strategic analysis is arguably the most vital form of intelligence because it provides a framework within which other forms of intelligence collection and analysis take place, offers an overall assessment from the top down rather than from the bottom up, and helps to provide a basis for policy formulation, resource allocation, and strategy development. Tactical analysis is a necessary and important complement to work done at the strategic level. It is the natural linking element between macro-level analysis and the micro-level focus on individual cases. Operational intelligence analysis overlaps with investigation and is often single-case oriented. It involves technological assessments of the methods used for marketplace battles, specific investigations of competitive threats, and the like. An important component of operational analysis is identifying the particular vulnerability or vulnerabilities that have been exploited and providing guidance on how they can be minimized or eliminated.

Each of these analytical levels requires a direction or focus, a methodology, and some experience. To simply try to answer “tell me what you know” leaves one at a loss as to how to satisfy a manager’s requirements. Similarly, “tell me everything about x” does little to support good analysis or an executive’s decision-making process. Poor analysis will in turn provide little room for quality decision-making.


Think about how many times an executive has been heard stating one of the following:

“I already know everything that needs knowing in this industry.”

“We are the biggest and the strongest. The competition can’t touch us.”

“We can’t afford expenses generated by cost centers like CI during these difficult times.”

“We do CI or analysis already and have just installed appropriate software.”

“It is crazy to think that someone in a corner office can tell me how to run my business.”

The following are some of the more prevalent reasons that suggest why analysis is not managed properly (Fleisher & Bensoussan, 2000):

Tool rut. Like the man with a hammer to whom everything looks like a nail, people keep using the same tools over and over again. This tendency to overuse the same tools can be described as being in the “tool rut.” It runs counter to the principle that, in addressing the complexity of an ever-changing world, the CI analyst needs to draw on numerous models to provide value.

Business school recipe. Many individuals charged with doing analysis come out of MBA programs where they have been offered tried-and-true recipes from instructors with financial and management accounting backgrounds. Business and competitive analysis can be as far from accounting analysis as strategy is from accounting. This may help explain why few accountants lead CI functions (or organizations in general) and vice versa.

Ratio blinders. Most businesspeople perform analysis based on historical data and financial ratios. At best, this provides comparison and tells the analyst the size of the gap (the “what”) between two organizations on a particular data point or data set. It does not help the analyst explain why the gap exists or how to close it.

Convenience shopping. Individuals frequently do analysis on the basis of the data they happen to have, as opposed to the data they should have. Because the analyst has certain data at his or her disposal, he or she uses the analytical technique that suits the data rather than focusing the analysis on the client’s question and/or the intelligence actually required. This is especially true when accountants are asked to do analysis and provide outputs that reflect only financial manipulations.

To further comprehend the lack of effective analysis, the discussion must go a little deeper to clearly understand the four key areas that impact the quality of analysis. These are the analysts themselves, the analysis task, the internal organizational environment, and the external environment.


At this level, it is important to understand what it is about the individual conducting the analysis that constrains the production of effective analysis. Based on our experience and observations, CI practitioners suggest the following six items as the most common reasons for ineffective analytical outcomes:

People are born with different abilities to perform analysis – some have inclinations or proclivities toward good analysis and lateral thinking, such as the “NTs” of the Myers-Briggs Type Indicator. Not all individuals assigned to perform analytical tasks have the ability to consistently produce effective outputs. Werther (2001) has spoken about the critical need to hire CI practitioners with desirable characteristics. These characteristics are especially important for analysts.

The brain and people’s mental capacities can only perform so well – research has indicated, for example, that the human mind can only handle seven individual chunks of information (plus or minus two) simultaneously (Miller, 1956). People rely on a limited set of mental models, have preconceptions on issues, and exhibit a wide range of cognitive biases when reviewing information. People also think differently – some in a linear way (i.e., left-brain thinking), others laterally (i.e., right-brain thinking). This is important when viewed in light of analysis being a mixture of both scientific and non-scientific techniques.

Analytical capability requires innate ability, training, development, and experience – practice helps in analysis, and the more experience one gains in performing good analysis, the better he or she becomes. The basis of analysis is thinking, and unfortunately there are few higher-education courses in business or management specifically directed toward this process of thinking. Good analysts can leverage their experience, which gives them more time to actually undertake the task.

People often don’t like to analyze. It is hard work, and as we pointed out previously in the “Tool Rut” symptom, people often seem to prefer the path of least resistance. It is often easier to pass along undigested information or to suggest that categorization of data (instead of synthesis and dis-synthesis, induction and deduction) is good enough.

Analysts are not always the “front-line” visible organizational personnel – they work in the “shadows,” at their desks, and behind their computer screens. An important aspect of analysis is the need for analysts to interact and network with others (particularly their clients and those providing them with the raw data or information) in order to understand some of the subtle nuances that may occur in the collected information or in the required output.

Everybody thinks he or she can do effective analysis; few actually can. Studies have shown that analysis is a task learned over time and through experience, and some will demonstrate that they have learned it better than others.


At this level, it is critical to describe what is difficult about the basic work process of converting data or information into analytical outputs of value. We suggest that the following five factors are those most commonly associated with ineffective analytical outcomes:

Analysis is hard to separate from the larger intelligence process of planning, collecting, and decision-making. Where does analysis start and finish? Without everything coming together well, analysis may often be seen as the culprit.

Analysis is not repetitive – what works the first time often does not work the next time as key intelligence topics and the environment change. There is a difference between breadth and depth, and analytical tasks vary from project to project.

Analysts rely upon good collection of both hard information and soft information (human intelligence). If only hard data is collected from, say, the Internet, then the analytical task will be biased and, in turn, so will the output. Remember the old GIGO adage: garbage in, garbage out. Unless analysis encompasses both hard and soft information, garbage will come out.

Analysts don’t make the decisions. Their task is to advise and/or recommend. However, they are often blamed for failures but rarely given praise for successes.

The task of analysis requires a delicate balance of art (creativity, insightfulness, resourcefulness) and science (methods, techniques, processes) that few can effectively manage.


All tasks or processes are performed within a larger context. In the case of analysis, it is vital to characterize the factors that exist in or because of the organization and that prevent analysts and their analysis from creating value. The following six factors are offered as those most commonly associated with ineffective analytical outcomes.

Decision-makers do not appreciate analysis – how can they when they think they can make good decisions without it?

Decision-makers often cannot specify the Key Intelligence Topics (KITs) or critical intelligence needs (CINs). The output of analysis needs to provide an opportunity for decision-makers to take actions. Unless the decision is clearly defined up-front, the odds of delivering actionable information at the other end are limited.

Analysis is always under-resourced, whether in time, technology, or people. One reason this occurs is that companies have not been able to answer questions about how analysis should be budgeted, what kind of return it generates, and how many people and hours should be devoted to it.

There is usually little time for analysis, but there always seems to be enough time for quick or poor decision-making. “Thinking time” is not rewarded in organizations or society today, as it is seen as an unproductive use of time. Unfortunately, in too many instances analysis comes out as the poor second cousin to data collection. Yet senior executives are willing to spend millions on bare threads of information or even a hunch, while thousands of families depend on these executives for their livelihoods. Thinking is the cornerstone of effective analysis. Without it, companies remain at risk.

Analysts must deliver their insights objectively; unfortunately, the analyst is often perceived as the deliverer of “bad” news. In a workplace environment where organizational cultures and reward systems expect people to work together in teams and not rock the boat or break the china, many analysts suffer the consequences of showing that the emperor has no clothes or that a client’s pet project is hopeless.

Good relationships are based on trust, and sufficient time needs to be allocated internally to developing trust in corporate relationships. This trust factor is an internal prerequisite of an effective framework for the delivery and acceptance of analytical insight.


It is critical that we characterize the factors that exist in or because of an organization’s environment. Since analysts operate in this environment, certain external factors may prevent analysts and their analysis from creating value. Based on our experiences, we propose the following external factors as being those most commonly associated with ineffective analytical outcomes:

With the ever-growing range of external competitive factors such as new technology, global competitors, new market entrants, and new market opportunities, the scope of analytical effort has been enlarged and has become more complicated.

Water, water everywhere, but not a drop to drink! With increasing information overload and data communications, the analytical process can become overwhelmed by the mass of information available. The problem is that most of it is useless and cannot be brought to bear on the decision-making process.

Globalization creates new complexities. The blurring of markets, industry and geographic boundaries, differing forms of competition, new competitive principles and values, etc., add to the requirements of the analytical framework.

Information technology systems for data collectors are everywhere, and organizations rush to the doors of software vendors to improve their technology systems. Yet there is a vacuum of systems developed for the analytical task. It is hard for analysts to guide systems experts in the development of effective information capture for analytical purposes when analysis is a mixture of both scientific and creative methodologies.

Education today is focused on skill building, but a key requirement for analysis is thinking. In fact, analysis and thinking are being taught less and less at both graduate and postgraduate levels, and it cannot be automatically assumed they were ever taught effectively in the first place (Werther, 2001). Analysis itself is not being taught as a skill in its own right.


As mentioned previously, four primary areas impact the quality of analysis – the analysts themselves, the analysis task, the internal organizational environment, and the external environment. To improve the quality of analysis, these four areas need to be addressed:

To decision-makers, intelligence is empowering. Without intelligence, a decision-maker cannot take responsibility; with it, he or she cannot avoid taking responsibility. Clearly, the better that organizational decision-makers are provided with insight, the better they and their organizations will perform. This is why the importance of intelligence analysis needs to be recognized in its own right – both analysts and executives need to promote the reality that analysis is critical to an organization’s competitive market success. Analysts and their clients need to realize, get comfortable with, and publicize to others that analysis is a discipline (field) of its own, with analysts who are professionals with unique knowledge, skills, training, and abilities.

Managers must come to understand the value of analysis and the empowerment it gives them in decision-making. They need to realize that effective analysis cannot be achieved through “quick fixes” or by the simple addition of new software or hardware applications. The value of analysis is in the insight it gives decision-makers, which ultimately benefits their bottom lines. Good analysis is good for the bottom line. Providing managers with case studies and examples of good and bad analytical outputs can help build this understanding. Avoiding the cost of wrong decisions (Sawyer, 1998) – the jobs saved and the market share gained by making better decisions or avoiding worse ones – significantly lowers the degree of controllable or perceived risk associated with decisions.

Universities, employers, and CI educators must develop dedicated courses and programs on analysis, not just on the larger CI process. These courses should include: coverage of the techniques and methods of strategic and competitive analysis (see Fleisher & Bensoussan, 2002); opportunities to practice analysis through case studies; processes for helping clients clarify and elucidate their KITs; exposure to a range of philosophies and thinking styles, such as those gained through the basic tenets of epistemological science; communication processes for delivering analytical outputs to the client; and methods for understanding the research on analytical failure (the pitfalls of analysis) and its psychological, social, and cognitive origins. Courses should be offered both in classroom and just-in-time (electronic) formats. Some courses should be designed to reinforce existing knowledge about the analysis process, while others should impart new knowledge of tools and techniques as it develops. Analysts should be encouraged to take these courses regularly and be given incentives for successfully upgrading their knowledge.

Measure people’s analytic proclivities – the capability of an analyst can be measured, and what can be measured can be both managed and improved. The development of capability measurement tools and metrics in order to demonstrate the improvement of the analyst’s capabilities should be strongly encouraged. There is also a need to measure analysis products against benchmarks – the best practices in the analytic field need to be identified, and the analysts’ or organization’s practice needs to be compared to these.

Executives must place analysts in organizational positions where they can make a difference and are secure. They need to be involved in the networks of collectors and clients but also be given the time needed to properly do their work.

As even the most effective analysts can provide inaccurate insight at times, decision-making clients need to give their analysts opportunities to fail and to demonstrate that they have learned from these experiences. Without the security and trust of their clients, analysts cannot help but be ineffective.

Employer organizations need to provide analysts with the proper tools (analytic applications, proper data inputs, access to sources, etc.). Analysts cannot be expected to provide insight without having access to rich sources of data, enabling technology, the open door of their organizational colleagues, and clearly articulated KITs or CINs.

In a communications-rich world, the goal of the analyst must be to create insight, not just to impart data or facts. These insights must be presented in ways so compelling and convenient that clients cannot help but make use of them. The analyst’s efforts must always be guided by the client’s KITs/CINs. The outputs must be focused to capture the client’s imagination and to provide insight into complex issues quickly, yet comprehensively. The analyst’s job is not to intimidate clients with information, but to entice them with it.

CI analysts and their decision-making clients should be careful not to overrate or overemphasize the analysis of organizations, industries, and markets provided by financial analysts, who are primarily concerned with short-term financial gains rather than long-term competitiveness.


Analysis is a critical component in aiding executives in their decision-making. This chapter has identified much of what is wrong with analysis today, but there is also much that can be done to improve it. These problems can be fixed, although doing so will require concerted effort from many people.

CI scholar Ben Gilad (1994) notes that intelligence is insight about externally motivated change and future developments and their implications for the organization. Done well, intelligence helps the organization reduce its risk level in dealing with the outside. Without analysis, there is little insight. Without good CI, a firm is increasingly vulnerable to attack in a globalized world economy.


D’Aveni, R.A. (1994). Hypercompetition: Managing the Dynamics of Strategic Maneuvering. New York: The Free Press.

Drucker, P. as quoted in Davenport, T. (1997). “A Meeting of the Minds,” CIO Magazine 10(21): 46-54.

Fahey, L. (1998). Outwitting, Outmaneuvering, and Outperforming Competitors. New York, NY: John Wiley and Sons, Inc.

Fleisher, C. S. (2001). “Analysis in Competitive Intelligence: Process, Progress, and Pitfalls,” in Managing Frontiers in Competitive Intelligence, ed. C.S. Fleisher and D.L. Blenkhorn. Westport, CT: Quorum Books.

Fleisher, C.S. and B. Bensoussan. (2000). “A FAROUT Way to Manage CI Analysis,” Competitive Intelligence Magazine 3(2): 37-40.

_______ . (2002). Strategic and Competitive Analysis: Methods and Techniques for Analyzing Business Competition. Upper Saddle River, NJ: Prentice Hall.

Ghoshal, S. and D.E. Westney. (1991). “Organizing Competitor Analysis Systems,” Strategic Management Journal 12(1): 17-31.

Gilad, B. (1994). Business Blindspots: Replacing Your Company’s Entrenched and Outdated Myths, Beliefs, and Assumptions with the Realities of Today’s Markets. Chicago, IL: Probus.

Hamel, G. (1996). “Strategy as Revolution,” Harvard Business Review 74(4): 69-82.

Hamel, G. and C.K. Prahalad. (1994). Competing for the Future. Boston, MA: Harvard Business School Press.

Hammonds, K.H. (2001). “Michael Porter’s Big Ideas,” Fast Company 44: 150-156.

Herring, J.P. (1998). “What is Intelligence Analysis?” Competitive Intelligence Magazine 1(2):13-16.

Langley, A. (1995). “Between Paralysis by Analysis and Extinction by Instinct,” Sloan Management Review 36(3): 63-76.

Miller, G.A. (1956). “Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information,” The Psychological Review 63(2): 81-97.

Porter, M.E. (1980). Competitive Strategy: Techniques for Analyzing Industries and Competitors. New York, NY: The Free Press.

Porter, M.E. (1996). “What Is Strategy?” Harvard Business Review 74(6): 61-78.