Last week I wrote a post as the first step to help solve a problem I see most people have -- **creating accurate and reliable estimates**. The technique is called **confidence-based estimating (CBE)**. When building a product, company, or just about anything, how or why do you expect people to believe your estimate if you are really just "winging it"? In last week's post I started by saying there were 2 easy questions you need to ask to increase estimating accuracy quickly.

I said that the first question you need to ask is:

**What’s your level of confidence in your estimate?**

It looks like a simple little question, and it is. But dozens of times I have seen this "little question" be the first huge step to creating highly accurate estimates. For a detailed discussion of the question above, please read the post here.

I promised at the end of that post to "take you through the 2nd killer question to ask." So, here we go...

The 2nd Killer Question to Ask:

**What assumptions are you making?**

Again, it looks like another simple, innocuous question. But wait until you see what happens as the question starts getting answered! Here's how it works...

It's great to get a detailed list of all the assumptions someone is making when they are creating an estimate. In my first post on estimating I talked about asking the person to give you their estimate at a "**95% level of confidence**" (after they've given you their initial estimate). When creating that estimate, make sure to **capture all of the assumptions the person or team is making**.

**Categorizing Your Assumptions**

I would strongly encourage you to think about classifying / categorizing assumptions. Here is a simple (but not complete) example for a software development project:

**Example 1: Assumption Categories for a Software Development Project**

- Methodology (e.g., Agile Development)

- Development Framework (e.g., Ruby on Rails, Django)

- Development Environment
- Requirements & Scope
- APIs
- Widgets
- Interfaces
- Staffing (on shore / off shore)
- QA and Testing
- Other

You would use something like the list above to ask about assumptions by category as you go through the estimating process and try to get to a 95% level of confidence. The questions are phrased using the Assumption Categories in the following pattern: "**What &lt;Assumption Category&gt; assumptions are we making when we think through this estimate?**" So, based on the Assumption Categories listed above, you would end up with questions like: "What Requirements & Scope assumptions are we making when we think through this estimate?" "What API assumptions are we making when we think through this estimate?" And so on.
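The question pattern is mechanical enough to sketch in a few lines of code. Here is a minimal Python sketch (the category names and function name are my own illustration, not part of the original post) that applies the pattern to a list of assumption categories:

```python
# Example assumption categories for a software development project
CATEGORIES = [
    "Methodology",
    "Development Framework",
    "Requirements & Scope",
    "APIs",
    "Staffing",
    "QA and Testing",
]

def assumption_questions(categories):
    """Apply the pattern 'What <category> assumptions are we making...'
    to each assumption category, producing an interview checklist."""
    return [
        f"What {c} assumptions are we making when we think through this estimate?"
        for c in categories
    ]

for question in assumption_questions(CATEGORIES):
    print(question)
```

Walking through the generated questions one by one with the estimator is what surfaces the hidden assumptions.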

**Example 2: Assumption Categories for an Early-Stage Web 2.0 Company**

*This is just a subset of possible assumption categories and yes, there are overlaps in the list below...*

- Management team
- Marketing
- Positioning
- Competition
- Pricing
- Cost of customer acquisition
- Viral coefficient
- Customer churn
- Customer adoption
- Development environment
- Development team
- Product development
- APIs / widgets / interfaces (inbound and outbound)
- Business model
- Financial
- Funding / valuation / capital requirements
- Other

Again, use the same pattern with the assumption categories: "**What &lt;Assumption Category&gt; assumptions are we making when we think through this estimate?**" So the questions look like: "What pricing assumptions are we making when we think through this estimate?" "What competition assumptions are we making when we think through this estimate?" "What product development assumptions are we making when we think through this estimate?" And so on.

**Assumptions = Risks**

All of the assumptions that you come up with are **risks that need to be managed**. As Josh Kopelman likes to point out when he and his firm are looking to invest in early stage companies, one of the things he is trying to do is "de-risk" a company/idea at the earliest possible time and in as capital-efficient a way as possible. I urge you to spend time trying to flush out the assumptions and risks.

So, think about how you can **verify or de-risk your assumptions**. How can you really justify your assumptions? How can you be sure?
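Since every assumption is a risk to be managed, it can help to track them as a simple backlog. Here is a hypothetical sketch (the field names and sample entries are my own, not from the post) of capturing assumptions and watching which ones remain un-de-risked:

```python
from dataclasses import dataclass

@dataclass
class Assumption:
    category: str       # which assumption category it came from
    statement: str      # the assumption as stated by the estimator
    verified: bool = False  # has this assumption been de-risked yet?

# Assumptions captured during an estimating session
backlog = [
    Assumption("Financing", "Funding closed by the end of June 2008"),
    Assumption("Hardware", "Hardware will be ready for launch"),
    Assumption("Staffing", "Management team on board before launch"),
]

# As assumptions are verified, mark them off...
backlog[0].verified = True  # e.g., the round actually closed

# ...and the remaining open items are your live risks.
open_risks = [a for a in backlog if not a.verified]
print(f"{len(open_risks)} assumptions still to de-risk")
```

The point is not the tooling; it is that an assumption list doubles as a risk register you can revisit at every checkpoint.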

Recently, I was listening to a pitch and I quickly went through this confidence-based estimating process with the entrepreneur when he stated his estimated product launch date. This was a May 2008 meeting and the launch was forecast for Q1 2009 (so let’s call that March 31, 2009 – 10 months later). His assumptions came back along the lines of:

1. Software will be ready
2. Hardware will be ready
3. We’ll have the management team on board
4. We’ll have the financing closed by the end of June 2008

Those are 4 huge assumptions / risks. Expectations were being set. Were they accurate? I asked him for his confidence level in his estimate. His response: "65-70% confident." I then asked him for an estimate at a 95% level of confidence, now that he was **thinking through his assumptions**. He added 6 months, for a new projected launch date of September 2009. That's a huge change! Imagine if he had not really thought through his estimate, had raised his first round of financing, and had subsequently run out of cash after missing his originally forecast launch date. Good time to be raising more funds? Don't think so. You don't want to have a great idea but run out of cash due to estimating problems and end up here.

Please post in the comments your experiences on how confidence-based estimating works for you.

I'll do one more post on estimating with some final thoughts. Stay tuned...
