Parametric Cost Analysis

from the Perspective of Competitive Advantage

Edwin B. Dean


In her 28 August 1995 policy letter on Parametric Estimating, Ms. Eleanor Spector, Director of Defense Procurement, stated:
"I fully support the use of properly calibrated and validated parametric cost estimating techniques on proposals submitted to DoD, and I encourage your enthusiastic support."
Parametric estimating is the application of equations which describe relationships between cost, schedule, and measurable attributes of a product or the system to bring forth, sustain, and retire the product. Parametric analysis is the process of determining the highly predictive equations necessary for parametric estimating. Together, parametric estimating and parametric analysis constitute parametric cost analysis.

Johnston (1960), probably the first book on parametric cost analysis, provides foundational theory, methods, and results from case studies.

Klein and Tait (1971), an early example of applied parametric cost analysis, expresses the number of tool design and tool fabrication hours per part in terms of the number of drilled and reamed holes, the volume of the piece, the number of locating points, and the complexity of part orientation. Stepwise regression was used to select these statistically significant variables for a linear equation from the eleven chosen as possible cost drivers. The authors also introduce the reality of cost uncertainty through a trade-off of confidence and expected time.

Today, parametric estimating is usually applied to large systems, such as those found in the Department of Defense or NASA. Thus, parametric estimating relies on simulation models that are systems of statistically and logically supported mathematical equations. The impacts of a product's physical, performance, and programmatic characteristics on cost and schedule are defined by these equations. Tailoring parameters are used to describe the object being estimated. Output of the models is validated with data from past projects. While many parametricians use commercially available general-purpose cost models, many others have created their own models to satisfy specialized needs.

From a mathematical perspective, parametric cost analysis is the set of processes by which appropriate characteristics of systems are mapped to appropriate ranges of cost. Given these mappings, one can then estimate cost, estimate the variability of cost, or design for cost with respect to the given system characteristics.

The idea is to find a value for the m parameters p = (p1 ... pm) such that the cost y can be predicted reasonably well by the equation y = f(x,p) + e, where e is the prediction error and x = (x1 ... xm) is a set of measures of system characteristics which vary over n cases (yi, x1i ... xmi), different for each i. Given that f is linear in the parameters p, one such criterion for "predicted reasonably well" is least squares, which minimizes the Euclidean distance between the predicted values (z1 ... zn) and the case values (y1 ... yn). The equation for calculating the values of these parameters is p = (X'X)^-1 X'y, where X is a matrix with n rows, the ith row being (x1i ... xmi), X' is its transpose, y is a vector with n rows yi, and (X'X)^-1 denotes the inverse of the matrix product X'X. Given an arbitrary x, we can then predict the cost y reasonably well by z = f(x,p).
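As a concrete sketch, the least-squares fit and the resulting prediction can be computed directly in NumPy; the case data below are invented purely for illustration (five past systems, two cost drivers each).

```python
import numpy as np

# Invented case data: n = 5 past systems with m = 2 cost drivers each
# (row i of X holds the measures x1i, x2i); y holds the observed costs.
X = np.array([[1.0, 2.0],
              [2.0, 1.0],
              [3.0, 4.0],
              [4.0, 3.0],
              [5.0, 5.0]])
y = np.array([7.1, 7.9, 17.2, 18.1, 24.9])

# Normal-equation solution p = (X'X)^-1 X'y.
p = np.linalg.inv(X.T @ X) @ X.T @ y

# Predict the cost z = f(x, p) of a new system with drivers x.
x_new = np.array([3.5, 3.5])
z = x_new @ p
```

In practice `np.linalg.lstsq` is preferred over forming the inverse explicitly, since it remains numerically stable when X'X is ill-conditioned.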

Noting that response surface methodology (RSM) is the process of finding an equation of the form z = f(x,p) + e, the analysis component of parametric cost analysis can be viewed as an application of RSM with cost as the response surface. This perspective immediately suggests more powerful second- and higher-order equations which should be considered for cost analysis. A second-order equation will automatically capture the multicollinearity which often plagues the linear equations typically used.
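A minimal second-order fit in a single driver might look like the following; the quadratic cost relationship is synthetic and noise-free, chosen only so the recovered coefficients are easy to check.

```python
import numpy as np

# Synthetic, noise-free cost data following a second-order (quadratic)
# response surface in one driver x: cost = 2 + 1.5*x + 0.5*x^2.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = 2.0 + 1.5 * x + 0.5 * x**2

# Design matrix with constant, linear, and quadratic columns.
X = np.column_stack([np.ones_like(x), x, x**2])
p, *_ = np.linalg.lstsq(X, y, rcond=None)   # recovers [2.0, 1.5, 0.5]
```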

If the cases have been chosen to represent a particular class of system, say a space launch vehicle, then p, the vector of parameters, is a label for that system class. The labels show up as numerical values in equations and are typically ignored, which has led to many misconceptions about cost. A scatter diagram of the labels is very revealing since it displays the clustering of classes. The parameters of the label are usually obtained by least squares, also called regression.

The trick is to choose an appropriate set of measures (x1 ... xm) which reduces estimation error to a tolerable level. What level of estimation error is tolerable? From the typical American manager's perspective, that level is usually far smaller than reality will allow. This is a primary problem for the estimator. A second primary problem for the estimator is that the predicted value is almost always far higher than management will allow. A third primary problem is that the cost desired by the enterprise is typically a large managed distance away from the estimated value z or the actualized value u. A fourth primary problem is that the estimator is often not allowed to treat the estimate as the random variate it is, thus the very real cost of risk cannot be included.

This typically American perception by the enterprise that cost can be managed by managing the cost estimator and the estimate results in a far higher cost than originally estimated for the typical product. The increase comes from the associated budgetary stress of trying to do too much with too little.

This typically American environment arises from price competition on cost plus fee, rather than fixed price, contracts, typically for major defense and NASA systems. The reality is that, in this environment, the only thing being negotiated is the fee. The result is the typical large contract overrun predicted by parametric cost estimating prior to establishment of price by the enterprise, plus the additional cost from the budgetary stress. However, if Americans understood how to design for cost, these overruns could become underruns.

Given that the estimator and the estimate are not "managed," parametric cost analysis provides an excellent tool for estimating cost. Given an x which really represents the cost drivers, parametric cost analysis can be used to design for cost with Taguchi methods, response surface methodology, or multidisciplinary optimization. In fact, optimizing for cost illuminates a previously unrecognized form of cost driver, the constraint.

Where does one find appropriate measures x? In the past, these variables have typically been chosen to represent characteristics of the product to be estimated, such as weight for hardware, lines of code for software, or aircraft design variables for aircraft. My research indicates that more appropriate drivers come from the system to bring forth, sustain, and retire the product to be estimated. Binary variables which represent the use or nonuse of Taguchi methods, response surface methodology, or multidisciplinary optimization during conceptual design and design are appropriate. Other appropriate measures include the degree of enterprise use of: activity based costing, concurrent engineering, design for ..., hoshin kanri, kaizen, quality function deployment, or systems engineering. The appropriate use of such tools tends to reduce cost. In general, has the enterprise designed itself for competitive advantage?

Another important aspect of parametric cost analysis is the depiction of cost uncertainty. If anything is uncertain in this world, it is cost. For example, how much will your next car cost? It depends on when you buy it, from whom you buy it, what type of car you buy, the features of the car, and your skill at negotiation, as well as the manufacturer's list price. Parametric cost analysis invites the incorporation of probabilistic methods to simulate and estimate this cost uncertainty. In fact, if you use point estimates instead of probabilistic methods, you will get a low estimate because you ignored the uncertainty.
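A small Monte Carlo sketch illustrates why a point estimate runs low; the lognormal shape and its parameters are assumptions chosen only because cost distributions are typically right-skewed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed right-skewed cost uncertainty: lognormal with median 100.
mu, sigma = np.log(100.0), 0.5
samples = rng.lognormal(mu, sigma, 100_000)

point_estimate = np.exp(mu)        # the single "most representative" value
expected_cost = samples.mean()     # probabilistic estimate of mean cost
# The right skew pushes the expected cost above the point estimate,
# which is exactly the low bias of ignoring the uncertainty.
```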

Cost risk is the degree by which a project could overrun the agreed-upon price. The degree of risk is established by the location of the price on the cost axis of the cost uncertainty curve. For example, if the price has been established at the 0.2 probability point on this curve, then the odds are 4 to 1 that the cost will be more than the price. If, in this case, the price is fixed, then the project developer is at risk and will probably lose a substantial amount. The expected loss is the difference between the expected value of the cost uncertainty distribution and the price. For cost plus fee contracts, this difference is the expected cost overrun. It is, thus, the risk of the contracting organization, typically the U.S. Government.
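The arithmetic of the 0.2 probability point can be sketched with simulated costs; the normal distribution and its parameters are illustrative assumptions, not a claim about any real program.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed cost uncertainty distribution (illustrative numbers only).
costs = rng.normal(100.0, 20.0, 100_000)

# Fix the price at the 0.2 probability point of the cost distribution.
price = np.quantile(costs, 0.2)

prob_overrun = np.mean(costs > price)   # ~0.8, i.e. odds of 4 to 1
expected_loss = costs.mean() - price    # expected cost minus price
```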

In the past, parametric cost analysis has been largely based upon unit costing. That means that the cost of one unit of a tangible product is estimated. It is then adjusted by a power law learning curve to estimate the total cost of all identical units purchased. It is known today that the unit cost accounting methods of the past did not represent the cost of a unit very well. Thus, parametric cost analysis based upon the old unit cost accounting techniques also contains these errors.
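The power-law learning-curve adjustment can be sketched as follows, assuming Wright's formulation in which each doubling of quantity multiplies unit cost by the learning rate (the 80% default is illustrative).

```python
import math

def total_cost(first_unit_cost, units, learning_rate=0.80):
    """Total cost of `units` identical units under a power-law learning
    curve: unit n costs first_unit_cost * n**b with b = log2(learning_rate),
    so an 80% curve cuts unit cost to 80% at each doubling of quantity."""
    b = math.log2(learning_rate)
    return sum(first_unit_cost * n**b for n in range(1, units + 1))
```

For a first unit costing 100 on an 80% curve, unit 2 costs 80 and unit 4 costs 64, so the total for two units is 180.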

Activity based costing (ABC), while not perfect, does represent unit cost and lot cost much better than the old unit cost techniques. Thus, future parametric cost analysis should be based upon activity based costing. Here, the cost of an activity would be mapped to attributes of the system to bring forth, sustain, and retire the product and to attributes of the input and output of the activity. These attributes are called quality characteristics within quality function deployment. I call such models parametric activity based models.

Note that, since the activity exists within the dynamics of the system to bring forth, sustain, and retire the product, just using attributes of the specific activity is probably inadequate. Thus, the use of a cost driver specific only to the activity is a probable flaw in current activity based costing implementations. A second probable flaw in current ABC implementations is that they assume linearity in terms of the cost driver. This may be a good approximation for lower level activities; however, higher level activities are usually used to reduce the number of activities. Here, based upon experience, I suspect that nonlinearity is far more common than linearity. This leads to the hypothesis that parametric activity based costing would provide even better accuracy than current ABC implementations.
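As a sketch of the hypothesis, a log-log least-squares fit recovers a nonlinear (power-law) activity cost relationship that a single linear ABC rate would miss; the data are synthetic and noise-free so the fitted exponent is exact.

```python
import numpy as np

# Synthetic activity costs following a power law in the cost driver:
# cost = 5 * driver**1.4, i.e. nonlinear, unlike a constant ABC rate.
driver = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
cost = 5.0 * driver**1.4

# Fit log(cost) = log(a) + k*log(driver) by least squares.
A = np.column_stack([np.ones_like(driver), np.log(driver)])
coef, *_ = np.linalg.lstsq(A, np.log(cost), rcond=None)
a, k = np.exp(coef[0]), coef[1]     # recovers a = 5.0, k = 1.4
```

A fitted exponent k near 1 would support the linear ABC assumption; here the fit exposes the nonlinearity directly.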

It is interesting to note that LeBlanc et al. (1976) implemented parametric activity based cost estimating many years before activity based costing existed within the accounting community.

The continuing rise of the yen has made cost deployment a hot topic for research in Japan. It is exceptionally important to note that the Japanese have proposed using an elementary form of parametric cost analysis to deploy cost within the comprehensive quality function deployment process (Akao, 1994). Dean (1995b) combined that concept with the concept of parametric accounting (Dean, 1989c) to define parametric cost deployment.





Parametric Cost Analysis Bibliography
Activity Based Cost Bibliography
Least Squares Bibliography
Life Cycle Cost Bibliography
Manufacturing Cost Bibliography
Project Risk Bibliography
Theoretical Cost Analysis Bibliography
Response Surface Methodology Bibliography



International Society of Parametric Analysts
Society of Cost Estimating and Analysis
Space Systems Cost Analysis Group


Surfing the Web

Costing Tools and Models
NASA Johnson Space Center Parametric Cost Estimating Reference Manual
Parametric Cost Estimating Handbook
The information contained in this Handbook complements and enhances the guidance provided by DCAA Cost Audit Manual (CAM), FARS/DFARS, and other government regulations or public laws related to cost estimation.
What Are Function Points?

