2 DESIGNING THE ANALYSIS
When you form an initial hypothesis, you are “solving the
problem at the first meeting.” If only it were that easy.
Unfortunately, although you may think you have the answer (who
knows—you actually might), you have to prove it. You will do so
through fact-based analysis.
In their first few years at the Firm, McKinsey-ites focus on
analysis as their primary task. In fact, among the criteria the Firm uses in entry-level recruiting, analytical ability stands at or near the
top. Even partners and directors are judged on their ability to make
value-added recommendations based on the analyses performed by
their teams.
There’s a saying among small-aircraft pilots, “There are two
types of pilots: those who’ve landed with their landing gear
retracted and those who will.” The same relationship holds for
decision making: sooner or later every executive has to make a
major decision based on gut instinct. In many organizations executives
make major strategic decisions based as much on gut instinct
as on fact-based analysis. Almost all the McKinsey alumni we
interviewed found this a radical change from their time at the Firm.
Not that this is necessarily bad. In many cases time and resource
constraints don’t allow for lots of analysis. Many successful managers
have developed highly accurate instincts that allow them to
reach good decisions quickly—that’s why they’re successful managers.
Still, if you are not that experienced or would just like to
have a second opinion (in addition to your gut), we recommend
that you avail yourself of as much fact-based analytical support for
your decisions as your situation allows. Who knows, sometime it
just might remind you to lower your landing gear.
Our discussion of analysis has two distinct parts. In this chapter,
we show you how to lay out the analytical tasks that you and
your team must perform to prove your initial hypothesis. In Chapter
4, we will show you how to interpret the results of those analyses
in ways that maximize their impact on your client or
organization. In between, in Chapter 3, we will discuss the fine art
of data gathering, since you need something to analyze before you
can get results.
What we call designing the analysis is referred to within
McKinsey as “work planning.” Work planning is usually the job of
the engagement manager (EM) running the day-to-day operation
of the team. Early on in the engagement, generally right after the
team has taken a stab at an initial hypothesis, the EM will determine
what analyses need to be done and who will take responsibility
for them. She will discuss with each team member what that
person’s tasks are, where to look for the data needed to complete
them, and how the likely end product should look. Then the team
members go off in their separate directions to get the job done.
For most businesses everything needs to be done yesterday if
not sooner, and for free. Unfortunately, rigorous, fact-based analysis
takes time. As any executive who has hired McKinsey will tell
you, that time is expensive. The Firm, however, realizes that its
clients can pay just so much, so it has developed many techniques
to help a team move quickly from raw facts to value-added recommendations.
These techniques work just as well outside
McKinsey’s walls. We can’t promise that you’ll be able to work
miracles by the time you finish this chapter, but if you apply the
lessons we present, you should be able to plot a course that will
speed up your analysis and decision making.
THE McKINSEY WAY
The following guidelines help McKinsey-ites plot their analytical
courses.
Find the key drivers. The success of most businesses depends
on a number of factors, but some are more important than others.
When your time and resources are limited, you don’t have the
luxury of being able to examine every single factor in detail.
Instead, when planning your analyses, figure out which factors
most affect the problem, and focus on those. Drill down to the core
of the problem instead of picking apart each and every piece.
Look at the big picture. When you are trying to solve a difficult,
complex problem, you can easily lose sight of your goal amid
the million and one demands on your time. When you’re feeling
swamped by it all, take a metaphorical step back, and figure out
what you’re trying to achieve. Ask yourself how the task you are
doing now fits into the big picture. Is it moving your team toward
its goal? If it isn’t, it’s a waste of time, and time is too precious to
waste.
Don’t boil the ocean. Work smarter, not harder. In today’s data-saturated
world, it’s easy to analyze every aspect of a problem six
ways to Sunday. But it’s a waste of time unless the analyses you’re
doing add significant value to the problem-solving process. Figure
out which analyses you need in order to prove (or disprove) your
point. Do them, then move on. Chances are you don’t have the luxury
to do more than just enough.
Sometimes you have to let the solution come to you. Every set
of rules has exceptions, and the McKinsey problem-solving process
is no different in this regard. Sometimes, for whatever reason, you
won’t be able to form an initial hypothesis. When that’s the case,
you have to rely on your analysis of the facts available to point
your way to an eventual solution.
LESSONS LEARNED AND IMPLEMENTATION
ILLUSTRATIONS
In their post-McKinsey careers, most of our alumni have a lot less
time to devote to analysis than they did at the Firm. Still, they find
that the knowledge they gained about designing analysis plans has
helped them get the factual support they need to make decisions
in their new organizations. We’ve distilled their experiences into
four lessons that will help you speed up your decision-making
cycle:
• Let your hypothesis determine your analysis.
• Get your analytical priorities straight.
• Forget about absolute precision.
• Triangulate around the tough problems.
Let your hypothesis determine your analysis. Once you start to
plan your analyses, you have to balance intuition against data. Historically,
the McKinsey problem-solving process left no place at
all for intuition, although there are indications that, in the New
Economy, even McKinsey has come to rely on gut instinct when
blazing completely new trails. In contrast, many decision makers
prefer to rely almost exclusively on their intuition, especially when
time is short. As one McKinsey alumnus noted, “People understand
that forming a hypothesis means being results oriented: figure
out where you want to go, and determine whether you’re on
the right track. Often, however, they don’t want to take the time to
do the little checkoffs to make sure they have the right solution.”
Although we understand why this is, we believe intuition and data
complement each other. You need at least some of each to have a
solid basis for your decisions.
The key to striking the balance is quality over quantity. In the
words of James G. Whelan at LG&E Energy, “Focused analysis
is more important than volume, and this stems from good initial
problem framing.” As we stated in Chapter 1, if you have correctly
designed your issue tree, then you should already know what
analyses you need to perform. You should have broken down the
problem into issues and the issues into subissues. At some point—
it may be two levels down the tree or maybe a dozen—the issues
will have resolved themselves into a set of questions that can be
answered yes or no (e.g., Will the product make money? Do we have the skills to implement the new program? Is it legal?). You
will have formed initial hypotheses as to what the answers are;
now you must support or refute those hypotheses with fact-based
analyses.
Another way to focus your analysis is, as Jeff Sakaguchi at
Accenture recommends, to start with the end in mind:
The process that we go through of issue, subissue, hypothesis,
analysis, data collection, end product makes you understand
what the end product is likely to look like. It keeps you
from doing a bunch of analysis that is interesting, intellectually
stimulating, but not very relevant. If you start doing
that, you can get beat up in a hurry.
Jeff points out a real danger for those of us who actually enjoy
analysis: getting caught up in analysis for its own sake. There’s a
lot of data out there, and it can be a lot of fun to play around with
it in all sorts of new and different ways. Unfortunately, if these
analyses aren’t working to prove or disprove your hypothesis, then
they are just that: playing around.
Get your analytical priorities straight. When you have limited
time to reach a conclusion and limited resources to attack the
problem, you have to figure out which analyses are indispensable
and which are simply gravy. As one of your first steps in designing
your analysis, you should therefore figure out what not to do.
This is the corollary of letting your hypothesis determine your
analysis: avoiding analyses that don’t relate to your hypothesis.
This holds especially true for small businesses with limited
resources. They can’t afford to boil the ocean, as Bob Buchsbaum,
CEO of art supplies retailer Dick Blick Holdings, attests in describing
his decision-making process:
Look for the path of least resistance by being hypothesis driven;
make assumptions and get answers that are “directionally correct.” We had a saying, “There is never enough data
or enough time,” which I always interpreted as, “Take
action earlier rather than later.” With a small business—$90
million in revenue—I can’t let myself or my staff violate these
lessons. Over and over, I find myself stopping people from
building the “unifying theory” of the business.
As we discussed in the previous section, analytically minded
people face a great temptation to do analyses that are interesting,
rather than relevant. In designing your analysis plan, it is your
responsibility to curtail this tendency in your team and, most especially,
in yourself.
As your next step, you should figure out which analyses are
quick wins—easy to complete and likely to make a major contribution
to proving or refuting the initial hypothesis. In other words,
as we say in Chapter 7, pluck the low-hanging fruit. As an example
of how to think about this, Chacko Sonny of Savage Entertainment
describes how his team attacks debugging, a crucial step
in the development of any software product:
Quality assurance for software in the early stages of testing
is definitely centered on this principle. While we have to be
exhaustive when searching for bugs in our software, and we
can’t afford to have 20 percent of the bugs slip through into
a released product, the 80/20 rule does apply when searching
for the cause of a bug. In many cases, the same error in
the code will cause a number of different symptoms. Rather
than tracking down every single incarnation of the error, we
will uncover 80 percent of the effects of a major bug. This
will offer clues as to the cause of the errors. We can address
a large problem in the code without having enumerated
every single effect of the bug. Early on, we try to catch the
critical bugs with widespread implications for the product.
Toward the end of the process, we catch the remaining 20
percent of issues, which allows us to tweak the product into
releasable form.
By avoiding unnecessary analyses and focusing first on the easy
wins, you put yourself in a position to get a lot done in a short
time.
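Sonny’s triage approach lends itself to a simple illustration. The following sketch is our own construction, not Savage Entertainment’s actual process: it groups bug reports by a suspected root-cause signature, so that fixing the biggest cluster clears many symptoms at once. The report fields and signature names are invented.

```python
from collections import defaultdict

# Hypothetical bug reports. In practice the "signature" might be a
# stack-trace hash or the module where the crash occurred.
reports = [
    {"id": 101, "symptom": "crash on save",  "signature": "io.write_buffer"},
    {"id": 102, "symptom": "corrupted file", "signature": "io.write_buffer"},
    {"id": 103, "symptom": "UI freeze",      "signature": "render.loop"},
    {"id": 104, "symptom": "crash on load",  "signature": "io.write_buffer"},
]

# Group symptoms by suspected root cause instead of chasing each one.
by_cause = defaultdict(list)
for r in reports:
    by_cause[r["signature"]].append(r["id"])

# Attack the causes with the most symptoms first: fixing one of these
# typically clears a whole cluster of reported issues.
for cause, ids in sorted(by_cause.items(), key=lambda kv: -len(kv[1])):
    print(f"{cause}: {len(ids)} reports -> {ids}")
```

The 80/20 payoff comes from the sort at the end: the handful of causes at the top of the list account for most of the reported symptoms.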
Forget about absolute precision. Because we stress the importance
of fact-based analyses in making business decisions, you
might think we’re contradicting ourselves to say that you don’t
need precise answers from your analyses. The truth is, however,
that business, for the most part, is not an exact discipline like
physics or math. Deciding whether to open a new factory requires
a different level of precision than discovering a new subatomic particle.
In fact, in most situations, achieving a scientific level of exactitude
for your management decisions is counterproductive. You
will spend an inordinate amount of time and effort getting from
mostly right to, in all likelihood, precisely wrong. Bear this in mind
when determining the analysis tasks for your problem.
This is especially true with forward-looking analysis. It’s one
thing to assemble historical data to answer a question such as
“How large is the widget market?” It’s quite another to answer a
question like “What is the likely return over the next 10 years if we
build a new widget plant in Upper Sandusky?” The answer to that
question depends on a great many variables, the values of which
it is impossible to know: future widget demand, arrival of new
competitors, changing consumer tastes, etc. Any number that you
can come up with will most likely be wrong. Therefore, you should
just try to get an answer that is in the “comfort zone”—directionally
correct and of the right order of magnitude. Often you can reach an answer of that level of precision very quickly, while
attaining spurious precision would take much longer.
Also, if you can achieve some sort of satisfactory answer in a
short time, then you are much more likely to attempt the analysis
than you would if you had to get an answer to four decimal places.
As one of our alumni puts it:
I find back-of-the-envelope analysis incredibly valuable
because it lets you know if you’re in the ballpark. A lot of the
time, all I want to know is whether, say, a new product idea
is going to be worth $5 million, $50 million, or $500 million.
And some people find it very difficult to get comfortable
with that. They think, “Oh, I’m going to say $50
million; what if it’s really $75 million?” I don’t care! “But it’s
50 percent off!” they say. I respond that it’s so much more
valuable than not putting together a number at all.
Just as some people want to do every analysis under the sun,
there are people who just have to get their answers correct to four
significant figures. Naras Eechambadi, founder and CEO of
Quaero, Inc., an information-based marketing consultancy, knows
all about that from the inside:
I hire a lot of Ph.D.s and advanced-degree holders, and I
almost have to force them not to look at every error pattern
in the data. All that stuff your professors taught you is great
if you’re talking about health care and you have to worry
about people dying. But this is marketing; we’re just trying
to make a buck. Let’s get the show on the road and stop worrying
about all the nuances.
You can spend a lot of time improving the precision of
your models, but eventually you reach the point of diminishing
returns or you lose time to market. We don’t need to have the perfect model. We just need to have something
that’s better than what we have today. Let’s go out and make
some money, and then we can continue to make it better
over time.
Once again, it is up to you to resist the impulse to get lost in the
data, whether in yourself or your team, because it will cost you
time and money.
Triangulate around the tough problems. In surveying and
mapmaking, triangulation is the method of determining the precise
location of an unknown point by taking measurements from two
known points. You can use an analogous technique to form a
hypothesis when you have very little information about the problem
at hand—a very common occurrence in business. At some
point you will come up against a question that appears unanswerable.
Either the data are proprietary to your fiercest competitor,
or you’re breaking entirely new ground in your industry, or for
whatever reason the question is just too tough to crack. If that’s the
case, don’t despair. Chances are you can come up with some analyses
that will at least allow you to scope out the likely limits of the
answer, even if they won’t get you particularly close. Once again, if
you’re directionally correct and in the right order of magnitude,
chances are that’s enough to make a decision.
To illustrate how this might be done, we’d like to present an
example from our alumnus at GlaxoSmithKline, Paul Kenny. He
had to determine the potential market size for a drug that had yet
to be developed and that would treat a condition most doctors don’t even
recognize. His strategy gives an insight into how you might tackle
a similar situation:
We’re looking into a condition called hypoactive sexual
desire disorder (HSDD), which is an abnormally low level of
sexual desire, primarily in women. At this point, it’s not really an accepted disease. It’s been defined by psychiatrists
but is very rarely diagnosed; GPs have probably never even
heard of it. From a pharmaceutical point of view, it opens up
the opportunity for some sort of female Viagra. At this point,
there’s no information on it.
Undaunted by this difficult scenario, Paul looked for analogous
situations that might shed light on his problem:
We’ve tried to draw some parallels with Viagra for men as an
obvious link. Mainly, however, we’re looking for analogies
both with other sexual disorders and with what one might
call lifestyle issues—obesity, say, or other diseases. We may
be able to use these analogies to justify the business case.
Once Paul found some useful analogies, he looked for insights
from them:
One of the links we’re hypothesizing is resistance—reluctance
among patients to admit they have this condition.
How many patients are actually going to talk to their doctor
about it? At the moment, none of them do, so you can’t
use their history as an example. Of course, pre-Viagra, far
fewer men talked to their doctor about ED [erectile dysfunction].
Whether women have the same attitude as men
toward this remains an open question. On the mental side
we’re looking at obesity—patients have cravings, or they eat
because it is a habit, or they think they want to, so that’s
more of a mental phenomenon—and the extent to which
people admit they have obesity as a mental disease. There are
all sorts of analogies that we’re using to triangulate what sort
of numbers we might be looking at. Even if, at the end of
the day, we’ll never know precisely, we hope to be able to
come up with something in the ballpark.
As you can see, Paul’s not in the least concerned that he will
never reach “the answer.” Rather, he’s merely trying to establish
upper and lower bounds for the size of this particular market,
because that range will be enough for him to decide whether to
pursue this project.
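In the same spirit, a triangulated estimate can be reduced to a few lines of arithmetic. The sketch below is ours, and every number in it is invented for illustration; the structure simply mirrors Paul’s approach of bounding each unknown with figures borrowed from analogous conditions.

```python
# Triangulating upper and lower bounds for a market estimate.
# ALL figures below are invented placeholders, not HSDD data.

population = 100_000_000                      # adults in target market (assumed)
prevalence_low, prevalence_high = 0.05, 0.15  # bounded by analogous disorders
consult_low, consult_high = 0.10, 0.40        # share who raise it with a doctor
                                              # (bounded by pre-/post-Viagra ED behavior)
revenue_per_patient = 300                     # annual $ per treated patient (assumed)

low = population * prevalence_low * consult_low * revenue_per_patient
high = population * prevalence_high * consult_high * revenue_per_patient

print(f"Market size: ${low/1e6:,.0f}M to ${high/1e6:,.0f}M per year")
# The point is the range, not the midpoint: if even the high end is too
# small to justify development, the decision has already been made.
```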
IMPLEMENTATION GUIDANCE
When designing your analysis, you have a specific end product in
mind: your work plan. A comprehensive work plan begins with
all the issues and subissues you identified during the framing of
your initial hypothesis. For each issue or subissue, you should list
the following elements:
• Your initial hypothesis as to the answer
• The analyses that must be done to prove or disprove that
hypothesis, in order of priority
• The data necessary to perform the analysis
• The likely sources of the data (e.g., Census data, focus
groups, interviews)
• A brief description of the likely end product of each analysis
• The person responsible for each end product (you or a
member of your team)
• The due date for each end product
Your work plan doesn’t need to be fancy or formal. Hand-drawn
is fine, as long as it’s legible.
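That said, if you prefer a structured artifact, the elements above map naturally onto a small data structure. The sketch below is a minimal rendering of our own devising (the field names are assumptions, not a McKinsey template), seeded with the first Acme Widgets subissue from the example that follows:

```python
from dataclasses import dataclass, field

@dataclass
class Analysis:
    description: str      # the analysis to be performed, in priority order
    data_needed: str      # data necessary to perform it
    sources: list[str]    # likely sources of the data
    end_product: str      # brief description of the deliverable
    owner: str            # person responsible
    due: str              # due date, e.g., "3-Jun"

@dataclass
class Issue:
    question: str         # issue or subissue, phrased as a yes/no question
    hypothesis: str       # initial hypothesis as to the answer
    analyses: list[Analysis] = field(default_factory=list)
    subissues: list["Issue"] = field(default_factory=list)

# The first Acme Widgets subissue from the example below:
facilities = Issue(
    question="Does the new process require special facilities?",
    hypothesis="No",
    analyses=[
        Analysis("Technical specifications", "Process requirements",
                 ["Articles", "Interviews"], "Chart", "Tom", "3-Jun"),
    ],
)
```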
As an example, let’s return once more to Acme Widgets. When
we left your team there in the last chapter, you had just finished
your issue tree. We spent some time expanding one of the branches
of that tree—the issue of “Can we implement the necessary
changes?”—by dividing that issue into subissues expressed as
yes/no questions. Table 2-1 shows how you could lay out the work
plan for one of those subissues.
Issue/Hypothesis | Analyses | Data Sources | End Product | Responsibility | Due Date

Can we implement the necessary changes to the production process? Yes

  Does the new process require special facilities? No
    Technical specifications | Articles, interviews | Chart | Tom | 3-Jun
    List of facilities that meet new criteria | Facilities management, interviews | List | Tom | 5-Jun

  If it does require special facilities, can we acquire them? Yes
    Map of “facilities gap” | Facilities management, thrum-mat line supervisors, interviews | Chart | Belinda | 7-Jun
    Sources of required facilities/equipment | Operations, trade publications | List | Belinda | 7-Jun
    Costs to fill gaps | Operations, contractors, interviews | Table | Belinda | 10-Jun
    Effect on project rate of return | Finance department, prior analysis | Spreadsheet | Terry | 12-Jun

Table 2-1. Work Plan for Issue in Acme Widgets Issue Tree
Following the preceding list of elements in the analysis design,
we start by noting the issue to be analyzed and our hypothesis as to
the answer. We like to append our answer directly to the question,
although you could just as easily put it in a separate column. The
top-line issue goes (no surprise here) at the top. Beneath that,
indent and list the subissues, then do the same with sub-subissues
(not to mention sub-sub-subissues). Thus, the question “If it does
require special facilities, can we acquire them?” comes underneath
the question “Does the new process require special facilities?”
Next comes the list of analyses to be performed. In this example,
there aren’t many, but there could have been. For instance, it
might be useful to have a schematic diagram to go along with the
technical requirements for the new production process. Useful, yes,
even interesting, but not ultimately necessary, and someone would
have to take the time to put it together—time they wouldn’t spend
on actually proving or disproving the hypothesis. Therefore, doing
a schematic didn’t make the final cut, nor did a number of other
analyses that you might devise.
We’ll touch only briefly on the data and their sources, since
we will be covering that topic in detail in Chapter 3. Listing data
and sources helps you and your team cover all the bases so you will
be less likely to miss a rich source of information. Speaking of rich
sources of information, have you noticed how often interviews
come up? You’ll see a lot more about them in Chapter 3.
The description of the likely end product should be brief, as
in the example. These descriptions really serve as a departure point
for discussions within the team. At McKinsey, the EM takes each
team member through her part of the work plan and discusses her
expectations as to the end product. Sometimes, the EM will sketch
out a “ghost pack,” showing templates for each end product,
which can help guide the analytical process, especially for
less-experienced consultants.
Responsibility is mostly self-evident. After all, someone has to
take charge of each analysis, or it won’t get done. We’ll cover the
question of how you assign the right people to the right tasks (and
get them on your team in the first place) in Chapter 6, “Managing
Your Team.” Usually, it makes sense to parcel out responsibility
for discrete chunks of the analysis (e.g., for each subissue) to
one person, but it’s not a requirement. Thus, in our example, Tom
is in charge of answering the question “Does the new process
require special facilities?” Belinda is on the hook for finding out
whether we can acquire any special facilities that we might need,
but one piece of that analysis goes to Terry. Why? As it happens,
Terry is our financial expert and is building an overall financial
model for the project, so it makes sense for Terry to analyze the
rate of return.
Due date, once again, is self-explanatory. Being specific about
dates helps the members of your team understand what is expected
of them and allows you to visualize the overall flow of the project
from start to finish. Some people like to track their due dates in
more detail with Gantt charts or other project management tools.
That’s up to you.
In our example, one analysis more or less dovetails neatly with
the next. Bear in mind, however, that sometimes the results of one
analysis will make a whole range of subsequent analyses redundant,
thus saving you the trouble of actually performing them. For
instance, if the analyses prove our initial hypothesis that we don’t
need special facilities, then the question of whether we can acquire
them—and all the attendant analyses—falls away. Thus, if you can,
you should schedule your analyses to let you answer these “dominant”
questions first. Of course, sometimes you don’t have the luxury
to wait for the results of one analysis before you start the next.
Still, make the most of opportunities to prune your analysis plan
aggressively.
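To make the pruning concrete, here is a minimal sketch, again of our own devising, of how answering a dominant question first collapses the plan. The analysis names and the dependency encoding are assumptions for illustration:

```python
# Each analysis either answers a dominant question or depends on a
# particular answer to one. Tuples read: (question, required_answer).
analyses = {
    "technical_specs":    {"answers": "needs_special_facilities?"},
    "facilities_gap_map": {"depends_on": ("needs_special_facilities?", True)},
    "equipment_sources":  {"depends_on": ("needs_special_facilities?", True)},
    "cost_to_fill_gaps":  {"depends_on": ("needs_special_facilities?", True)},
}

def prune(analyses, question, answer):
    """Drop every analysis that only matters for an answer we didn't get."""
    return {
        name: spec for name, spec in analyses.items()
        if spec.get("depends_on") is None
        or not (spec["depends_on"][0] == question
                and spec["depends_on"][1] != answer)
    }

# If the specs confirm our hypothesis (no special facilities needed),
# the entire "can we acquire them?" branch falls away:
remaining = prune(analyses, "needs_special_facilities?", False)
print(sorted(remaining))  # only 'technical_specs' survives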
Beyond laying out your life for the next several weeks and setting
expectations for your team, a good work plan has another feature:
it helps you structure your thinking. As you go through your
work plan, write down all the analyses, and prioritize and prune
them, you’ll quickly see whether there are holes in your initial
hypothesis that didn’t show up during the framing stage. One of
our alumni put it this way:
One of the most important things I’ve learned is that he who
puts it on paper first wins. And the corollary is that if you
can’t put it down on paper, then either you don’t have it clear
in your head or it’s not a good idea. There are a lot of people
who say, “Oh, I had this idea in my head, I just haven’t
put it down, but I really know exactly what I want to do.” I
say, put it on paper.
Sometimes, just the process of work planning will lead you to
revisit and possibly restructure your analysis. We will examine the
iterative relationship between hypothesis and analysis more in
Chapter 4. In the meantime, bear in mind that your initial hypothesis
is a living document, and it feeds off your analysis.
EXERCISES
• In Chapter 1, we laid out part of the Acme Widgets issue
tree regarding the question “Can we implement the necessary
changes to utilize the new process?” In this chapter,
we laid out a work plan for the subissue “Does it require
special facilities that we don’t have?” Do the same for the
other subissue in that discussion, “Does it require special
skills that we don’t have?” Remember that if the answer is
yes, you have to answer an additional question.
CONCLUSION
When it comes time to prove your initial hypothesis, efficient
analysis design will help you hit the ground running. You and your
team will know what you have to do, where to get the information
to do it, and when to get it done. The work-planning process
also serves as a useful reality check to the sometimes intellectualized
pursuit of the initial hypothesis. To some, it may seem a
slightly anal-retentive exercise, but we recommend it highly, and
our alumni can attest to its utility.
Once you’ve designed your work plan, it’s time to start filling
in the blanks. You can only do that with facts, so it’s time to start
gathering data. In the next chapter, we’ll take you through the
strategies and techniques you need to get the data for your
analysis.