This article assumes a software development project that is several months into a multi-year endeavor. Using rolling wave planning, you, the project manager, look three to six months out and begin to decompose a significant portion of project scope. As a diligent PM you pull out the GAO Cost Estimating and Assessment Guide and print Figure 1 in preparation for the cost estimating working session scheduled on Tuesday with your team of experts.
The team has assembled. All safety communications are conveyed, the restroom code is on the white board, and pizza will arrive at 11:30. It is time to roll up your sleeves and get started. Since most of the team is familiar with the project, you state that the focus of the working session will be the green-boxed steps in Figure 1.
Define the program
Defining the program in this case involves the Project Work Breakdown Structure (PWBS) artifact, recently configured with fresh updates and deliverables. Of course you could drive a bus through the high-level summary statements, but the artifact sure is formatted pretty. You and the team determine that the Contractor Work Breakdown Structure (CWBS) dictionary will need exceptional assumption and rationale statements to substantiate the estimate.
Obtain the data
So as the experts read the scope artifacts, you ask your project controller to run a report capturing historical actuals on the previous software releases and to deliver an analysis (macro, micro, trends, FTE, S curve, etc.). You intend to use the historical data first as a means to compare against the new estimate and second as rationale to substantiate the estimate.
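The controller's report can be sketched in a few lines: total effort per past release, an average FTE figure, and the cumulative S curve mentioned above. The release names, hours, and the 160-hour working month are illustrative assumptions, not figures from the article.

```python
# Hypothetical sketch of the controller's historical-actuals report:
# summarize effort per past release and build a cumulative S curve.
# Release names, hours, and the 160 h/month FTE basis are illustrative.

past_releases = {
    "Release 1.0": [320, 480, 610, 540, 250],  # hours booked per month
    "Release 1.1": [280, 450, 590, 500, 230],
}

def s_curve(monthly_hours):
    """Cumulative % of total effort after each month (the S curve)."""
    total = sum(monthly_hours)
    running = 0
    curve = []
    for hours in monthly_hours:
        running += hours
        curve.append(round(100 * running / total, 1))
    return curve

for name, hours in past_releases.items():
    avg_fte = sum(hours) / (len(hours) * 160)  # assuming 160 h per month
    print(name, "total:", sum(hours), "avg FTE:", round(avg_fte, 1),
          "S curve:", s_curve(hours))
```

A report like this gives the team a sanity check: if the new release's estimate implies an S curve or FTE profile wildly different from past releases, that difference needs a documented rationale.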
Determine the estimating structure
The customer has indicated in the scope document that waterfall is the development methodology. The next couple of hours involve decomposing requirements into an activity list, which is entered into a spreadsheet. An extra column is added for the effort estimate, and the sheet is sent to the printer. You hear a knock on the door, and it is Sharon letting you know the pizza has arrived.
Develop the point estimate
As the team stuffs their faces with pizza, you lay out the estimation sheets on the round table and begin to write on the white board.
- Round 1:
- Initially discuss activity: Quick Q&A
- Write single point estimate (DO NOT ANNOUNCE ESTIMATE)
- Collect sheets at the end
- 20 min intermission; data will be compiled to develop the three-point estimate
- Display results
- Identify activities with large variance and standard deviation
- Conclusively discuss activities: Quick Q&A
- Adjust single point estimate if needed (DO NOT ANNOUNCE ESTIMATE)
- Collect sheets at the end
- 10 min intermission; data will be compiled to develop the three-point estimate
- Display results
- Compare the results to the historical data
- Finalize the estimate
- Document assumptions and rationale
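The intermission step above, compiling the collected single-point estimates into a three-point (PERT) estimate and flagging high-variance activities for the next round of Q&A, might look like this sketch. The activity names, hour values, and the 15% flag threshold are assumptions for illustration, not from the article.

```python
import statistics

# Hypothetical sketch of the intermission compile step: turn each
# activity's collected single-point estimates (hours) into a PERT
# three-point estimate and flag wide spreads for further discussion.
# Activity names, numbers, and the 15% threshold are illustrative.

round_1 = {
    "Design login screen": [16, 20, 24, 40],  # one expert is an outlier
    "Write audit report":  [8, 8, 10, 12],
}

def three_point(estimates):
    """PERT three-point: optimistic, most likely (median), pessimistic."""
    o, m, p = min(estimates), statistics.median(estimates), max(estimates)
    expected = (o + 4 * m + p) / 6   # PERT expected value
    std_dev = (p - o) / 6            # PERT standard deviation
    return o, m, p, expected, std_dev

for activity, ests in round_1.items():
    o, m, p, e, sd = three_point(ests)
    flag = "DISCUSS" if sd / e > 0.15 else "ok"  # arbitrary 15% threshold
    print(f"{activity}: O={o} M={m} P={p} E={e:.1f} sd={sd:.1f} [{flag}]")
```

Displaying these results to the group, without attributing any number to any expert, is what drives the second round: only the flagged activities need further Q&A before estimates are adjusted.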
Identify ground rules and assumptions
The team is done eating, and they settle in by reading the white board. You announce that individual estimates will not be shared during the process to protect against groupthink and against other psychological factors that tend to skew or anchor the estimate. The goal is to develop a pure estimate of the scope. Now let’s begin.
Estimating a project is hard work. The estimation process is not an exact science. If you execute exactly against an original plan, well, you are lucky. It is important to know the value of the estimation process and to continuously improve based on lessons learned.
Please share your experiences.
GAO Cost Estimating and Assessment Guide: http://www.gao.gov/new.items/d093sp.pdf
Fantastic article discussing the GAO Cost Estimating Guide (don’t forget OMB A-94), along with parametric estimates, analogous estimates, and the integration of the WBS by feds and their contractors for cost realism and cost-effective budget creation. It shows a great roadmap of best practices and simple yet effective tools for the Holy Grail of cost estimates: Should Cost.
These techniques should be standard, but some feds (and their low-bid contractors) do not have the sophistication for this type of analysis and detail. I have used these techniques successfully in the past, especially on DoD IT systems, and they work.
The use of function point counts is also important for software development, so don’t forget to fold those estimates into the overall analysis as well.
Fantastic article Travis! I don’t like customers forcing a methodology on their contractors, but that happens. I like the estimation rounds you laid out. Also, I’ve learned I need to add additional time/cost for the learning curve when new technologies are involved. That doesn’t seem to come out in estimates naturally.
The technique that you are describing is a modification of the Delphi method. It works best when you are limited to using “expert opinions”. It is a reasonable, but not a great, method. As you mention, it is used often in software estimating, where it seems to be almost as accurate as parametric methods without anywhere near the overhead that parametric methods entail. I would add a couple of extra rules to make it work better.

First, try to break the estimation problem into as many similarly sized activities as possible. It works best if all activities are of a reasonably close size. For example, don’t let one step be “create database using oracle” and another be “calculate external temperature.” The first can take months and the last just minutes. Second, let more time transpire between rounds than just the few minutes in a meeting. Otherwise you are adding a “get done to get out of here” bias to your estimation process.

In the actual Delphi method, the experts are allowed to know who is on the team, but they are not allowed to know who proposed what number or who is proposing a particular argument for a number. The bias that you are trying to avoid is the “heavy hand of the revered expert”. If your experts are all of about the same level, then the method you described may work well enough. However, if you start getting more “foremost experts in yadda ya” in the room, then you need to improve the anonymity of the team members. That usually means going slower and passing information through a broker.
Hey thanks all for the feedback. I agree that we must protect against estimation bias and anchoring. I am curious how you apply factors to estimates such as learning curve or complexity. Is there a standard heuristic or guidance document out there for such factors?
Learning curves are usually applied in a production setting. There are cases where a service could be a repeatable and complex process (something necessary for learning curves to apply). An example of a service that usually sees a significant learning curve is a help desk for a new product. The first few days are the most hectic and then it drops off at a steep and usually predictable level. At some point it gets close to a level amount of activity. Rarely will the calls drop off completely.
In the hardware acquisition business we have several methods. Usually you start with an analogous system for which you have a known historical trend. There are then methods for adjusting the rate and the theoretical first unit cost (T1) so that you have something to base the new costs on. As with most learning curves, it is far more important to use during the early production runs. That occurs frequently in DoD buys. It is not as common and not as useful in things that are primarily commercial buys.
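The unit learning curve described here is often modeled with Wright's formulation: the cost of unit n is T1 times n raised to a negative exponent derived from the curve's slope. A minimal sketch, where the T1 value of 1000 and the 85% slope are illustrative assumptions:

```python
import math

# Sketch of Wright's unit learning curve, as commonly used in hardware
# acquisition: cost of unit n = T1 * n**b, where b = log2(slope) and
# slope is the learning-curve percentage (0.85 means each doubling of
# quantity drops unit cost to 85%). T1 and slope values are illustrative.

def unit_cost(t1, slope, n):
    """Cost of the n-th unit under Wright's learning curve."""
    b = math.log2(slope)  # negative exponent for slope < 1
    return t1 * n ** b

t1 = 1000.0   # theoretical first unit cost (T1)
slope = 0.85  # 85% learning curve

print(round(unit_cost(t1, slope, 2), 1))  # unit 2 is 85% of unit 1
print(round(unit_cost(t1, slope, 4), 1))  # unit 4 is 85% of unit 2
```

As the comment notes, analysts typically start from an analogous system's historical slope and T1, then adjust both for the new program; the curve matters most during early production, before unit costs flatten out.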
Good information on this can be found in the GAO Cost Estimating and Assessment Guide (Mar 2009).