HPMD Quotes & Sources

Steve,

Very interesting article. I prefer the experience-based estimating model, based on a knowledge-base. But contrast all this rigorous methodology with the rapid prototyping PM approach of Cambridge Tech. Good food for thought as we proceed on Telekurs.

Ed

ESTIMATING SOFTWARE DEVELOPMENT PROJECTS:
NO SILVER BULLETS, BUT VENDOR SOFTWARE DOES HELP
By Stan Zawrotny, 73266,3177

A SOFTWARE DEVELOPMENT PROBLEM

Software development projects typically suffer from estimates that no one takes seriously and schedules that look as if they were developed using darts thrown at a wall calendar.

One study reports that only 5% of systems are developed on time and within budget. Another indicates that less than 1% of commercial software projects are completed on time, within budget, and according to specifications. Moreover, 75% of the projects begun are either never completed or arrive too late to be useful. A Center for Project Management in-depth study of hundreds of behind-schedule projects showed that the average project was underestimated by a factor of at least 2.85.

One product for estimating the size and cost of software development projects advertises that it will produce estimates that are accurate to within 20% of actuals, 68% of the time. For the non-mathematically inclined, that means that about one-third of the time they are more than 2 months off on a 10-month project!

Is it any wonder that Gopal Kapur, President of the Center for Project Management, proclaims that information systems (IS) is the only industry where managers are embarrassed about the accuracy of their estimates?

In recent years, most corporations have found that few projects have more cost impact on the bottom line than those associated with information systems. $1 billion IS projects are not uncommon. For these (and smaller) projects, the ability to estimate, monitor, and control costs and schedules can mean the difference between making a profit and realizing a competitive advantage, or suffering public embarrassment and lost profitability through missed deadlines and uncontrolled cost growth. To reap the tremendous benefits that information technology promises, organizations need to learn how to implement it in a way that makes economic sense, with timetables and cost commitments that are prudent and realistic.

If you ask an IS manager, he or she will say that management of software development is different from other types of management, and there is some truth to that. Most IS departments handle numerous projects (sometimes hundreds at a time) of different types and sizes: development, maintenance, enhancement, support, problem reports; small, medium, large. The technology is changing almost daily. And most programmers would have you believe that they are artists, not technicians.

Yet it is not as esoteric as many believe it to be. The problems most software projects encounter -- cost overruns, schedule delays, failure to meet user needs -- are not technical ones; they result from poor judgment and bad management. That doesn't make them easier to solve, however -- management issues are almost always more difficult to resolve than technical ones.

WHY DO IS PROJECT ESTIMATES FAIL?

By definition, a manager must manipulate time and resources. The challenge facing managers today is to control costs, while gaining the maximum productivity from their resources, and in the shortest possible time. The means to accomplish this end are clear: the project manager must carefully plan and schedule the project, and monitor its progress so that corrective action can be taken when appropriate. By using more reliable estimates, the related tasks of planning and scheduling can be much more effective and less corrective action will be needed. The concept is simple but the application is often quite difficult.

Producing a realistic project estimate is a formidable task because so many factors must be considered:

KNOWLEDGE OF THE BUSINESS APPLICATION - It's evident that developers must understand the user's business to solve the problem; but planners must also understand the situation to effectively estimate the magnitude and complexity of the effort.

PROJECT SCOPE - IS projects are infamous for their changing requirements. These are usually the result of constantly changing business and technological environments, but also include "scope creep" where the users want a never-ending series of "minor changes."

CONSTRAINTS - Unrealistic staff, budget, hardware, quality, legal, and schedule constraints are a fact of life, and project managers must learn to factor them into their estimates and make the sponsors aware of their impacts. Unfortunately, there is a natural tendency to underestimate the time and resources needed. Realizing that management may refuse authorization for a project if realistic estimates are given, the inclination is to minimize the difficulties as a means of getting approval for the project. Project managers figure they can make up the time with overtime. But scheduled overtime leaves no buffer for contingencies.

COMPREHENSIVE TASK LISTS - In Managing Software Projects (QED, 1990), Lois Zells declares that the problem with software estimates "often stems from the fact that the bulk of the effort required to complete the job is simply overlooked during the estimating process. In other words, it's what is left out of the estimate that usually gets the estimators in trouble."

According to a Center for Project Management survey, typical IS projects are underplanned by 67%. Gopal Kapur maintains that the lack of a well-defined lifecycle methodology is a key factor in the undersizing and underplanning of projects.

TASK DEPENDENCIES - Incorrect dependencies among tasks often result in expensive resources sitting idle waiting for a predecessor task to end.

LABOR COSTS - Labor costs should include internal staff salaries as well as contractor personnel. If possible, "fully-loaded" salaries should be used, reflecting the costs of employee benefits as well as pure salaries.

FIXED COSTS - These usually include software licenses, hardware, travel, and supplies. Overhead such as administrative services should also be included.

RESOURCE SKILLS - Baseline estimates are usually based on the effort required for an expert to complete the tasks. Realistically, experts are not available for every task and an "average" resource might take as much as two-and-a-half times longer than an expert. The project manager must somehow balance the cost of the resources with the effect that the resources' skill-levels have on the project duration.

HISTORICAL DATA - Companies that are able to develop accurate estimates are usually those who maintain a data base of historical data. If such a data base is not available, a company can begin by using one of the industry data bases offered by several vendors of cost estimating packages. Your own company's experience should then be used to validate and update the generic history. At a minimum, the organization should record their plans' actual costs and durations and refer to that information when estimating a new project.

ANATOMY OF AN ESTIMATE

Most estimates of IS projects have three components: effort, duration, and cost:

EFFORT - Effort is the number of resource hours or days needed to accomplish an activity or task. Effort is a pure measure that doesn't consider elapsed time, dependencies, or resource availability. However, staff knowledge and experience should be taken into account -- if the baseline (expert resource) values are used, the project is almost assured of being late unless everyone on your staff is an expert.

DURATION - Duration is the allocation of effort across business or work days, based on the rate at which effort hours will be expended. Assuming the availability of resources, the duration can be paired with the dependencies to develop a tentative project schedule. Duration is not just the effort divided by the number of resources. Not every member of the staff can be kept busy all the time. Some tasks can only be performed by one person at a time, regardless of the resources available. The traditional example is that nine women cannot shorten a pregnancy to one month.

An often overlooked (or ignored) factor of duration is the cumulative sum of interruptions during the staff's work day. Although few managers are willing to accept such a high number, typically 15-35% of the work day is expended on meetings, gossip, coffee breaks, and administrative tasks. Our "8-hour days" really only produce about 6 hours of work. As mentioned earlier, managers who think they can make up the time with overtime find that they have no overtime left for contingencies. Furthermore, studies have shown that when people are scheduled for overtime on a regular basis, they subconsciously slow down to pace themselves for the longer day.

Because duration begins with an estimate of effort, staff skill levels and productivity rate are major factors affecting duration. Modern-day techniques, software engineering tools, programming languages, reusable code libraries, and so on have all been promoted as ways to increase productivity in the IS department. However, this new technology is often difficult to learn and use. Time estimates should allow new personnel time to become familiar with the technology. A person may be considered trained at the point when he or she spends only 5% of his or her time learning. With a typical learning curve, learning represents 32% of the total time until that 5% point is reached. This leaves only 68% of the learning period as productive time. Apply that 68% to the work-day that has been shortened by interruptions and you can expect to get about half a day of real work accomplished.
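The arithmetic above can be sketched directly. This is a rough illustration only; the interruption figure uses the midpoint of the cited 15-35% range as an assumed value:

```python
# Productive-time arithmetic from the figures above (assumed values).
workday_hours = 8.0
interruption_fraction = 0.25     # assumed midpoint of the 15-35% range cited
learning_fraction = 0.32         # share of a learning period spent learning

# Hours left after meetings, breaks, and administrative interruptions.
available = workday_hours * (1 - interruption_fraction)   # about 6 hours

# During a learning period, only the remaining 68% of that time is productive.
productive = available * (1 - learning_fraction)          # about 4 hours
```

Four productive hours out of eight is the "about half a day of real work" the text arrives at.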

Unfortunately, the project manager typically does not have the luxury of increasing the estimated time by up to 100% in order to set a realistic schedule. The user will usually not accept that and, in fact, will probably want a shorter deadline. However, as Frederick Brooks wrote in The Mythical Man-Month (Addison-Wesley, 1975), "Even though the urgency of the patron may govern the scheduled completion of the task, it cannot govern the actual completion."

COST - The dollar cost of an IS project is often described in terms of the hardware, software, travel, and other fixed costs associated with the project. Don't we wish that were true? In IS projects, labor costs have been calculated as accounting for 65-85% of total project costs. Labor is harder to estimate than fixed costs, especially in IS where the development of an update screen may take an hour or a day, depending on its complexity and who is working on it.

Each resource's rate should be stated as a fully-loaded rate, including the cost of benefits. Often, for the sake of confidentiality, an average rate for the job level is used rather than an employee's actual salary.

SIZING VS. ESTIMATING

Both the project manager and the sponsor should have a clear understanding of the difference between sizing and estimating. Unfortunately, projects are usually funded based on sizing and the first major crisis occurs when the detailed estimates for cost and duration don't match the sizing expectations.

The size of the project is expressed in terms of overall effort, cost, and time to deliver and is usually derived from a formula or statistical model. Size estimates are used strategically at the proposal stage to provide essential information for decision-making.

Because so little is known about the project at this early point, size estimates usually have a high margin of error. It is here where the horror stories of projects 100-200% over budget are born. Considering those high error rates, many project managers wish they had used the product mentioned earlier that advertises estimates accurate to within 20% of actuals, 68% of the time. Regardless of the source of the size estimates, it is important that sponsors and management be aware that these are high-level approximations so that there are no unrealistic expectations.

The detailed estimates of the project are a close reflection of the actual work to be accomplished, suitable for generating a project schedule. They are usually based on a detailed work plan created from a comprehensive knowledge base such as a system development life cycle or methodology. Because they are constructed from a work breakdown structure (WBS) and use other proven project management techniques, detailed estimates can often have an accuracy of 5-15%.

Software products that create detailed estimates usually export the resulting project plans and durations to a project management scheduling and tracking package such as Microsoft Project, Applied Business Technology's Project Workbench, or Time Line from Symantec.

Similar attempts to develop schedules based on sizing information have been less successful because sizing data tends to describe the whole project rather than its parts. Yet in a survey conducted by the Center for Project Management, 20% of the respondents said that they used formal sizing techniques to formulate high-level estimates during the pre-launch stage and 36% used "guestimates"; all of them said they routinely converted those sizings and guestimates into schedules.

METHODS OF ESTIMATING

In addition to such popular estimating techniques as "guestimates," WAG, SWAG, and GUT, there are four methods of creating project estimates for software projects:
- Effort Distribution
- Formula-Based Estimating
- Function Point Analysis
- Task-Based/Experience-Based Estimating

EFFORT DISTRIBUTION

The Effort Distribution Model is based on data collected from a large number of similar projects. From that data, effort distribution ratios are developed for each phase of the project life cycle. Once work on the project has progressed, the total effort can then be extrapolated from the effort expended to date.

For example, assume that it takes 145 hours to create a feasibility study for a proposed system. Using a model from the files of the Center for Project Management where the feasibility study represents 3% of the project, we can extrapolate the total project effort to be 4833 hours, since 145 is 3% of 4833. By multiplying the estimated total effort hours by the average billing rate, this can be further translated into a cost estimate.
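The extrapolation in this example is simple division. A minimal sketch; the $65/hour billing rate is an assumed figure for illustration, not from the model:

```python
# Effort Distribution extrapolation: if an early phase consumed some known
# effort, and that phase historically represents a fixed ratio of the whole
# project, the total effort follows by division.
def extrapolate_total(effort_so_far, phase_ratio):
    return effort_so_far / phase_ratio

# 145 hours on the feasibility study, which the model says is 3% of the project.
total_hours = extrapolate_total(145, 0.03)   # roughly 4833 hours

# Multiply by an average billing rate (assumed $65/hour) for a cost estimate.
cost_estimate = total_hours * 65
```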

The phases can be broken down, in turn, into ratios for each activity and the activities into ratios for the tasks. The temptation is then to apply the task effort estimates from these ratios to develop a schedule. In other words, using the above example, if it took 145 hours to complete the feasibility study, then it will take a programmer/analyst 5 hours to develop an inquiry screen for the stock clerk and that will be completed by October 18th.

Accuracy is improved as the project progresses -- after logical design is completed, a comfortable margin of error may be obtained.

Many of the sizing techniques such as COCOMO and Function Point Analysis use the effort distribution ratios to create task durations (and eventually schedules) by applying the ratios to the overall estimated size of the project. Templates of model projects are then developed with phases, activities, tasks, assigned generic resources, and relationship dependencies.

FORMULA-BASED ESTIMATING

There are a number of formula-based cost estimating techniques in the industry. The most well-known is COCOMO (COnstructive COst MOdel), developed by Barry Boehm and described in his book, Software Engineering Economics (Prentice-Hall, 1981). (Boehm virtually invented the field of software cost estimation.)

Like most of the formula-based models, COCOMO measures a piece of software by counting the source lines of code (SLOC) in the final program. There are many views about what should be counted as lines of code and what should not (e.g., do comments count?). COCOMO provides guidelines for counting lines of code to encourage some standardization across projects and across organizations.

Lines of code can work very well as a means to estimate programming effort, but they are less applicable to other aspects of the project. There is also the question of accuracy: if the number of lines of code is itself a guess, how accurate can the final estimate be?

Typically, you'll start with only a rough description of the software system that you'll be developing, and you'll use COCOMO to give you early estimates about the proper schedule and staffing levels. As you refine your knowledge of the problem, and as you design more of the system, you can use COCOMO to produce more and more refined estimates. [1]

COCOMO requires that a work breakdown structure be done prior to the estimation work. The number of lines of code is then estimated for each of the units of the work breakdown structure.

A series of weighting factors called "Cost Drivers" are then applied to the estimate. These factors are attributes of the end product, the computer used, the personnel staffing, and the project environment, which are believed to affect the project's productivity.

In the COCOMO model, one of the most important factors contributing to a project's duration and cost is the Development Mode. Every project is considered to be developed in one of three modes depending on such features as flexibility of constraints, degree of innovation required, and stability of the environment.

COCOMO is defined in terms of three different models: the Basic model, the Intermediate model, and the Detailed model. The more complex models account for more factors that influence software projects, and make more accurate estimates.

The Basic model makes its estimates of required effort based primarily on your estimate of the software project's size, measured in thousands of delivered source instructions (KDSI) or thousands of lines of code (KLOC). The Basic model is suitable for early, rough, estimates of a project's effort, duration, and cost. Generally, 60% of the time the estimates are within a factor of 2 of the actual results.
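As a sketch, the Basic model's effort and duration equations, using the coefficients Boehm published for the three development modes (the 32 KDSI project in the usage is an invented example):

```python
# Basic COCOMO (Boehm, 1981). Effort in person-months, duration in months.
# Per-mode coefficients: (a, b) for effort = a * KDSI**b,
# (c, d) for duration = c * effort**d.
MODES = {
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kdsi, mode="organic"):
    a, b, c, d = MODES[mode]
    effort = a * kdsi ** b        # person-months of effort
    duration = c * effort ** d    # calendar months
    staff = effort / duration     # average full-time staff level
    return effort, duration, staff

# An invented 32 KDSI organic-mode project:
effort, duration, staff = basic_cocomo(32, "organic")
```

For this hypothetical project the model suggests roughly 91 person-months over about 14 months, an average staff of six to seven people.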

The Intermediate model provides much better estimates because you supply settings for 15 Cost Drivers that determine the effort and duration of software projects. The Cost Drivers include factors such as program complexity, programmer capability, and use of software tools. According to its developer, Barry Boehm, the Intermediate model's projections are within 20% of the actual results 68% of the time.
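The Effort Adjustment Factor arithmetic is just the product of the driver multipliers applied to a nominal estimate. A minimal sketch, where the nominal effort and the three multiplier values are illustrative assumptions, not Boehm's published tables:

```python
# Intermediate COCOMO adjusts a nominal effort estimate by the product of the
# Cost Driver multipliers (the Effort Adjustment Factor, or EAF).
def adjusted_effort(nominal_effort_pm, drivers):
    eaf = 1.0
    for multiplier in drivers.values():
        eaf *= multiplier
    return nominal_effort_pm * eaf, eaf

# Assumed nominal estimate of 91.3 person-months; three of the 15 drivers
# shown with invented multiplier values (ratings above 1.0 add effort,
# below 1.0 reduce it).
effort, eaf = adjusted_effort(91.3, {
    "CPLX": 1.15,   # high product complexity pushes effort up
    "ACAP": 0.86,   # a strong analyst team pulls it down
    "TOOL": 1.10,   # weak tool support pushes it up
})
```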

The only difference between the Detailed model and the Intermediate model is that the Detailed model uses different Effort Multipliers for each phase of a project. The Programmer Capability Cost Driver is a good example of a phase-dependent cost driver: a high rating for one phase of a project but a lower rating in another indicates that good programmers save time and money on some phases of the project but have little impact on others. Even though Boehm admits that "Detailed COCOMO is not noticeably better than Intermediate COCOMO for estimating overall development effort," these phase-dependent Effort Multipliers yield better individual phase estimates than the Intermediate model.

The Intermediate and Detailed COCOMO models both require a great many calculations. Costar (Softstar Systems), and GECOMO Plus (Marconi Systems Technology), automate the process of producing COCOMO estimates and ensure their accuracy. The two products implement both the Intermediate and the Detailed COCOMO models.

Unlike other cost estimation models, COCOMO is an open model, so all of the details are published, including all equations, assumptions, and definitions.

Because COCOMO is well defined, and because it doesn't rely upon proprietary estimation algorithms, COCOMO estimates may be more objective and repeatable than estimates made by methods relying on proprietary models. COCOMO can be calibrated to reflect your software development environment, and to produce more accurate estimates. [1]

[1] Portions copyright Softstar Systems. See "Costar User's Manual," May 1986, for a further discussion of the three different models of COCOMO: the Basic, Intermediate, and Detailed models.

FUNCTION POINT ANALYSIS

The Function Point Analysis (FPA) technique, originally developed by Alan Albrecht of IBM in the late 1970's, is a method of measuring software in terms of what the system would deliver to the user rather than lines of code or other IS-related concerns. The function point count, a figure that is independent of the programming language, programming style, and the development process as a whole, is a metric that represents the relative size of a software system. This relative size is derived from a measure of the information processing size and the technical complexity of the system.

The information processing size is determined by identifying the components of the system as seen by the end user. These components include the inputs, outputs, inquiries, external interfaces to other systems, and logical internal files, and are classified as simple, average, or complex. All of these values are then scored and the total is expressed in Unadjusted Function Points.

The Unadjusted Function Points are then adjusted by the use of complexity factors which affect the size of a system. These complexity factors include 14 General Application Characteristics such as reusability, performance, complexity of processing, etc., that can be used to weigh the unadjusted function point. The result of these computations is a number that correlates to system size.
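The unadjusted count and the complexity adjustment can be sketched as follows, using the commonly published Albrecht/IFPUG component weights; the occurrence counts and GSC total in the usage are invented for illustration:

```python
# Function point calculation sketch. Weights per component type, in order
# (simple, average, complex), per the commonly published IFPUG-style tables.
WEIGHTS = {
    "inputs":     (3, 4, 6),
    "outputs":    (4, 5, 7),
    "inquiries":  (3, 4, 6),
    "files":      (7, 10, 15),   # logical internal files
    "interfaces": (5, 7, 10),    # external interfaces to other systems
}

def function_points(counts, gsc_total):
    """counts: {component: (n_simple, n_average, n_complex)} occurrence counts.
    gsc_total: sum of the 14 General System Characteristics, each rated 0-5."""
    ufp = sum(n * w
              for comp, ns in counts.items()
              for n, w in zip(ns, WEIGHTS[comp]))
    vaf = 0.65 + 0.01 * gsc_total   # Value Adjustment Factor, 0.65 to 1.35
    return ufp * vaf

# Invented counts for a small system, with a GSC total of 30:
fp = function_points({
    "inputs":     (5, 10, 2),
    "outputs":    (4, 6, 1),
    "inquiries":  (3, 5, 0),
    "files":      (2, 4, 1),
    "interfaces": (1, 2, 0),
}, gsc_total=30)
```

Here the 237 unadjusted function points are scaled by a 0.95 adjustment factor to about 225 function points.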

Although the function point metric doesn't correspond to any actual physical attribute of a software system such as lines-of-code or the number of subroutines, it is quite useful as a relative measure for comparing projects, measuring productivity, and estimating the amount of development effort and time needed for a project.

In his book, Applied Software Measurement (McGraw-Hill, 1991), Capers Jones reports that his statistics show the average number of function points per person-month in the projects he has studied is 18. Although we don't know exactly what 18 function points means, Jones provides considerably detailed reports of productivity and quality data for U.S. computing in terms of function points.

Function points offer several significant advantages over lines of code counts: (1) It is possible to estimate them early in the life cycle, about the time of the requirements definition. (2) They avoid the effects of language and other implementation differences. (3) It is easier to demonstrate to the sponsor the impact that a seemingly little change to the requirements has on the project. (4) The function point count is a good basis for a quality measure. The number of bugs per function point is more meaningful than the number of bugs per 1,000 lines of code. (5) Function points can also be more useful in measuring programmer productivity. The average number of function points produced per day is more meaningful than the average number of lines of code produced per day because the latter will depend on programming language and programming style.

Proponents maintain that an early function point analysis based on a project's initial requirements definition can give developers a good first-pass estimate of its size. However, that estimate might vary by as much as plus or minus 35% or more. Because the function point counts are based on features of the system to be developed, estimates made before the logical design are really of little value. However, the accuracy improves to about 10% by the time development reaches the design definition stage.

Counting function points is not a trivial matter. Evaluating each function point for its correct complexity and size can be quite laborious. It is easy to undercount functions, and the error can have a ripple effect on the calculations. Analysts who are expected to perform FPA require extensive training. Until they become experienced, they should be assisted by an experienced consultant or other FPA expert.

There are many excellent tools to help automate FPA. MicroMan Esti-Mate (POC-IT Management Services Inc.), firstCASE (AGS Management Systems Inc.), Size Plus (Marconi Systems Technology), and Project Bridge (Applied Business Technology) are just a few of the many software development project estimating tools on the U.S. market today that incorporate some type of FPA.

There are also a number of organizations dedicated to the promotion and use of Function Points. One of these, the International Function Point Users Group (IFPUG) holds annual meetings and publishes reports on the use and application of FPA.

TASK-BASED/EXPERIENCE-BASED ESTIMATING

Task-based or experience-based estimating is a bottom-up approach to developing project estimates. With this estimating technique, the project estimate is based on the sum of its component tasks.

Beginning with the work breakdown structure from the project plan, the baseline effort for each task is determined from a knowledge database. That baseline effort is then modified to account for the resource skill levels and application knowledge. Because this approach is based on the sum of individual tasks, more specific human resource profiles can be used, leading to more accurate effort estimates at the task level. An aggregation of the effort estimates of all the tasks will then be the project's estimated effort.

Other factors such as the interruption factor discussed earlier, full- or part-time availabilities of resources, time for walk-throughs and other QA efforts, and lag time for approvals are then applied to give an estimated duration for each task. Using dependency diagrams, PERT, and/or CPM techniques, the tasks can then be organized into a potential project plan with a resultant estimated project duration.
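A minimal sketch of the bottom-up arithmetic described above; the task names, baseline hours, and skill multipliers are invented, and the six productive hours per day is an assumption echoing the interruption discussion earlier:

```python
# Task-based (bottom-up) estimating sketch: start from the baseline expert
# effort per task, adjust for the assigned resource's skill level, then
# convert effort to duration using realistic productive hours per day.
tasks = [
    # (task, baseline expert hours, skill multiplier: 1.0 expert .. 2.5 novice)
    ("design inquiry screen", 5, 1.5),
    ("code update module",   12, 2.0),
    ("walk-through and QA",   4, 1.0),
]

# Project effort is the sum of the skill-adjusted task efforts.
effort_hours = sum(hours * skill for _, hours, skill in tasks)

# Assume about 6 productive hours in each 8-hour day.
duration_days = effort_hours / 6.0
```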

Advantages of task-based estimating include: (1) A shorter learning curve because it is based on standard project management techniques (2) Estimates based on probable team skills rather than generic resources (3) Bottom-up planning provides better understanding of the job to be done (4) Individual developers can plan their own piece of the project, leading to better commitment by participants (5) Provides mechanism to include overhead activities such as walk-throughs, QA, and management approvals (6) Results can be used directly for producing a project schedule.

One disadvantage of bottom-up estimating is the tendency to ignore economies of scale. A bottom-up estimate may have determined that five medium-complexity reports, each requiring twenty hours, must be developed, for a total of 100 hours. Looking at the five reports as a group, however, you may conclude that there is one medium-complexity report requiring twenty hours, three very slight derivatives requiring three hours each, and two more complex alterations of the basic report requiring about ten hours each. The 100-hour estimate is now a 49-hour estimate. Some products, such as ProjectBASE (KAPUR International, Inc.), allow the user to apportion iterative tasks such as designing screens or reports into groups according to their complexity.
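The report-grouping arithmetic from this example, spelled out:

```python
# Naive bottom-up estimate: five medium-complexity reports at 20 hours each.
naive_hours = 5 * 20

# Grouped estimate: one base report, three slight derivatives at 3 hours
# each, and two more complex alterations at about 10 hours each.
grouped_hours = 20 + 3 * 3 + 2 * 10
```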

Because this is a knowledge-based approach, the estimating accuracy is dependent on the quality of the knowledge base. However, the Center for Project Management reports that clients with well-defined methodologies consistently report estimating accuracy in the ±15% range.

USING PROJECT ESTIMATES

Because estimating is about prediction, we must expect some inaccuracy in our estimates. We must allow for the inherent inaccuracy of the process, and expect to revise our estimates as our knowledge of the project increases.

At the outset, when only 2% of the work is completed, it is almost impossible to come up with an accurate estimate. You don't even know what the user requirements are, yet people are asking you for estimates and schedules. At this point, the estimate for a project may be as much as 80% off because so little is known about the ultimate scope of the project.

How does a project manager maintain a successful reputation against such odds?

First and foremost, you must understand that failure is a perception, not a physical reality. Regardless of whether we feel we have succeeded, if the sponsor perceives failure, then failure has, in fact, occurred. This is because failure is unmet expectations, even if those expectations were unrealistic.

Therefore, the most important defense against failure is careful guidance of sponsors' and management's expectations. Help them to understand that estimates are not exact. One way to do this is to always state your estimates as a range of values. Discuss with them the extent to which estimates can vary early in the project from later when the design starts to take shape. As new requirements are added, update your estimates and plans and communicate their impact back to the requester to update their expectations.

Secondly, use one of the project estimating software products to develop your initial estimates and, as the project progresses, replan, re-estimate, and recommunicate the results to the sponsors. The importance of using a structured approach to estimating was emphasized by Frederick Brooks in The Mythical Man-Month: "It is very difficult to make a vigorous, plausible, and job-risking defense of an estimate that is derived by no quantitative method, supported by little data, and certified chiefly by the hunches of the managers."

While any of the products listed in the accompanying chart can improve your estimating capability, Lois Zells in Managing Software Projects, cautions, "It is not unusual to find groups that are looking for some universal estimating model or formula that will work in all companies in all projects all of the time. They are looking for a way to get the numbers without thinking."

While there are no "silver bullets" that will develop project estimates at the push of a button, estimates and schedules based on a proven methodology and calibrated to the environment of your company can mean the difference between making a profit and realizing a competitive advantage, or failing miserably through missed deadlines and uncontrolled cost growth. Information technology promises to provide tremendous benefits, but IS needs to learn how to estimate the cost of those benefits and provide timetables that are realistic.

Representative IS Project Estimating Products (This is a representative list of products available in 1994. It is not intended to be complete. Contact vendors listed for current features of their products.)

Vendor/Telephone; Product; Method; Platform:

AGS Management Systems; (800) 678-8484; firstCASE; Function Point Analysis; Windows; OS/2

Applied Business Technology; (212) 219-8945; Project Bridge; Function Point Analysis; Windows

Computer Associates International; (800) 645-3003; CA-Estimacs; Proprietary (Similar to Function Point Analysis); MS-DOS

GEC Marconi Systems Technology; (703) 263-1260; GECOMO Plus; COCOMO; SunOS, HP/UX 7.0, Apollo SR/10.2, SCO UNIX, VAX/VMS, MS-DOS

GEC Marconi Systems Technology; (703) 263-1260; SIZE Plus; Function Point Analysis; VAX/VMS, UNIX

KAPUR International, Inc.; (510) 275-8000; ProjectBASE; Task/Knowledge-Based, Formula, Effort Distribution; MS-DOS

Learmonth & Burchett Management Systems (LBMS); (800) 231-7515; Project Engineer; Multiple models including Function Point Analysis and Knowledge-Based; Windows

Lucas Management Systems; (703) 222-1111; Artemis I/CSCS; Effort Distribution with Curve Fit, Knowledge-Based (Direct Input); Windows, UNIX, SunOS, MVS/TSO, VAX/VMS

Micro-Frame Technologies; (909) 983-2711; AWARD; Direct input of estimates made by Government Cost Analysis Managers; MS-DOS

Poc-It Management Services; (310) 393-4552; Esti-Mate; Function Point Analysis; MS-DOS

Quantitative Software Management; (703) 790-0055; SLIM; Proprietary Formulae; Windows

Quantitative Software Management; (703) 790-0055; Size Planner; Function Point Analysis, Fuzzy Logic, Standard Component Sizing; Windows

Softstar Systems; (603) 672-0987; Costar; COCOMO; MS-DOS, VAX

Software Productivity Research; (617) 273-0140; Checkpoint; Knowledge-Based, Function Point Analysis; Windows, UNIX/MOTIF

Welcom Software Technology; (713) 558-0514; Cobra; Multiple Forecasting Models; MS-DOS


Short Quote:

"Software development projects typically suffer from estimates that no one takes seriously and schedules that look as if they were developed using darts thrown at a wall calendar." --Stan Zawrotny
Copyright 1995, 2002, HP Management Decisions Ltd., All Rights Reserved.


Author:Zawrotny, Stan
Title:Estimating Software Development Projects
Periodical:
Volume:
Number:
Publisher:
Place (City):
Publication Date:1995
Pages:
Source Type:CompuServe
Quote Number:14
Categories:Estimating, Technology