PROJECT SIZING AND ESTIMATING EFFECTIVELY USING FUNCTIONAL MEASUREMENT

Southern California Software Process Improvement Network

www.davidconsultinggroup.com


Topics
• The User Perspective
• Program Start-up
• Characteristics of an Effective Sizing Metric
• Use of Function Points
• Project Estimation
• Quantitative & Qualitative Assessments
• Establishing and Using Baseline Data
• Modeling Improved Performance



The User Perspective
“How do we indicate the value of sizing to our users/customers? When we say that a project represents 50 Function Points from the 'user's perspective', what does this really mean to the user? Or to the developer? How do we engage both parties in understanding size?”

• Functional ‘value’ delivered to the user/customer
• Comparative analysis (size, cost per unit of work, defects)

• Functional description and accountability (user)
• Management of delivery expectations
• Credibility in project estimation



Program Start-up
• Planning
  • How will the information be used (objectives)?
  • Who is counting?
  • What is being counted?
  • History versus industry?
• Path of least resistance
  • Easy to count (transaction based)
  • Agreeable user
• Culture
  • Internal versus external
  • Pilot/rollout versus organization-wide
• Scope (what doesn't get counted)
• Continually evaluating the program

Costs/Required Resources, Start-up and Ongoing
• Less than 1% of total budget (labor)
• Internal
  • Training (several days)
  • Mentoring
  • Frequency/skill level
  • User group knowledge sharing
• External
  • Economies of scale
  • Awareness & orientation
  • Internal resources required
• Positioned for success
  • Other measures to be utilized
  • Process improvement activities monitored and measured

When to Size
Sizing occurs at three points across the lifecycle (Define, Design, Build, Test, Implement):
1) Initial sizing during or after the Requirements phase
2) Subsequent sizing after System Design, or whenever change occurs
3) Final sizing after install

Characteristics of an Effective Sizing Metric
• Meaningful to developer and user/customer
• Defined (industry recognized)
• Consistent (methodology)
• Easy to learn and apply
• Accurate, statistically based
• Available when needed (early)
• Addresses project-level information needs

Function Points - An Effective Sizing Metric
Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.

Benefits:
• Quantitative (objective) measure
• Industry data as a basis for comparison
• Expectations (perceived customer value) managed
• Software process improvement requirements satisfied

Benefits of Using Function Points
• A vehicle to estimate cost and resources required for software development, enhancement, and/or maintenance
• A tool to quantify performance levels and to monitor progress made from software process improvement initiatives
• A tool to determine the benefit of an application to an organization by counting functions that specifically match requirements
• A tool to size or evaluate purchased application packages

Approach to Function Points
• A function point count is performed to produce a functional size measure
• The size can be used to generate project estimates
• Estimates should be based upon delivery rates
• Analysis - plan versus actual comparisons
  • How good is the information received during requirements?
  • How good (accurate) is project estimating?

Function Point Counting Process
• Review the available documentation
• Meet with subject matter experts (SMEs) to gain a thorough understanding of the functionality
• Apply the function point methodology and compute a functional size
• Generate an estimate based on available delivery rates

The Function Point Methodology
Five key components are identified based on the logical user view:
• Inputs
• Outputs
• Inquiries
• Internal Logical Files
• External Interface Files

[Diagram: inputs, outputs, and inquiries cross the application boundary; internal logical files live inside the application; an external interface file is referenced across the boundary.]

Logical View of User Requirements
[Diagram: a sample application - user inquiries (list of molds, work centers), user inputs (order parts, change bill), an output (parts listing), internal logical files (bill of materials, plant information center, plant molds), and a vendor interface (vendor information, vendor supply parts).]

The Function Point Methodology
Each identified component is assigned a Function Point size value based upon the make-up and complexity of the data.

Component                        Low       Average    High
Internal Logical File (ILF)      __ x 7    __ x 10    __ x 15
External Interface File (EIF)    __ x 5    __ x 7     __ x 10
External Input (EI)              __ x 3    __ x 4     __ x 6
External Output (EO)             __ x 4    __ x 5     __ x 7
External Inquiry (EQ)            __ x 3    __ x 4     __ x 6
                                           Total Unadjusted FPs: ___

Complexity is driven by the data relationships - record element types (for files) or file types referenced (for transactions) - against the number of data elements (# of unique data fields):

RETs/FTRs \ Data Elements    Few        Average    Many
Few                          Low        Low        Average
Average                      Low        Average    High
Many                         Average    High       High
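To make the weights concrete, a minimal sketch in Python (the weight values come from the table above; the function and its example inputs are illustrative, not DCG tooling):

```python
# Weights from the complexity table above (standard IFPUG values).
WEIGHTS = {
    "ILF": {"low": 7, "avg": 10, "high": 15},
    "EIF": {"low": 5, "avg": 7, "high": 10},
    "EI":  {"low": 3, "avg": 4, "high": 6},
    "EO":  {"low": 4, "avg": 5, "high": 7},
    "EQ":  {"low": 3, "avg": 4, "high": 6},
}

def unadjusted_fp(counts):
    """counts maps component -> {complexity: number identified}."""
    return sum(WEIGHTS[comp][cx] * n
               for comp, by_cx in counts.items()
               for cx, n in by_cx.items())

# One low-complexity external input: 1 x 3 = 3 unadjusted FPs.
print(unadjusted_fp({"EI": {"low": 1}}))  # -> 3
```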

The Function Point Methodology
14 optional General System Characteristics are evaluated and used to compute a Value Adjustment Factor (VAF):
• Data Communication
• Distributed Data Processing
• Performance Objectives
• Heavily Used Configuration
• Transaction Rate
• On-Line Data Entry
• End-User Efficiency
• On-Line Update
• Complex Processing
• Reusability
• Conversion & Install Ease
• Operational Ease
• Multiple-Site Use
• Facilitate Change

The final calculation is the Unadjusted FP count x VAF.
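The slide does not show the arithmetic, but under the standard IFPUG formula each characteristic is rated for degree of influence from 0 to 5, and VAF = 0.65 + 0.01 x (sum of ratings). A minimal sketch under that assumption:

```python
def value_adjustment_factor(ratings):
    """ratings: the 14 GSC degrees of influence, each scored 0 (none) to 5 (strong)."""
    assert len(ratings) == 14 and all(0 <= r <= 5 for r in ratings)
    return 0.65 + 0.01 * sum(ratings)

# Ratings summing to 36 yield the VAF of 1.01 used in the next slide's example.
vaf = value_adjustment_factor([3] * 8 + [2] * 6)
print(f"{vaf:.2f}")  # -> 1.01
```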

Function Point Calculation
Enhancement FPs as they relate to the existing master count:
• External Inputs (EI) (2) - Add/Change Account, change, high complexity: unadjusted FPs = 2 x 6 = 12
• External Input (EI) - Add Tax, change, high complexity: unadjusted FPs = 1 x 6 = 6
• External Input (EI) - Issue Material, change, low complexity: unadjusted FPs = 1 x 3 = 3

Total Unadjusted FPs: 21
Value Adjustment Factor: 1.01
Total Adjusted FPs: 21 (21 x 1.01 = 21.21, rounded to a whole count)
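A quick script reproducing the slide's arithmetic (component names are from the slide; rounding the adjusted total to a whole number is an assumption, since 21 x 1.01 = 21.21):

```python
# Reproduce the enhancement count above.
components = [
    ("Add/Change Account", 2, 6),  # two EIs, high complexity (weight 6)
    ("Add Tax",            1, 6),  # one EI, high complexity
    ("Issue Material",     1, 3),  # one EI, low complexity
]
unadjusted = sum(count * weight for _, count, weight in components)
adjusted = round(unadjusted * 1.01)  # VAF = 1.01; rounding assumed
print(unadjusted, adjusted)  # -> 21 21
```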

Example - Counting Accuracy
Estimate:
• Resulting size = 150 function points
• Matching profile, rate of delivery = 10 FP per effort month
• Estimated effort = 15 effort months

           Estimate     Actual              Result
Size       150 FP       175 FP              +17%
Effort     15 months    19 effort months    +27%
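The variance percentages follow directly from the figures (a trivial check):

```python
est_size, est_effort = 150, 15   # 150 FP at 10 FP per effort month
act_size, act_effort = 175, 19
print(f"size {(act_size / est_size - 1):+.0%}, "
      f"effort {(act_effort / est_effort - 1):+.0%}")  # -> size +17%, effort +27%
```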

Example - Scope Accuracy
Counting activity:
• Resulting Requirements size = 120 FP
• Resulting Design size = 144 FP
• Resulting Install size = 174 FP

Analysis:       Inputs   Outputs   Inquiries   Interfaces   Files   Total
Requirements      75       10         15           7          13     120
Design            80       25         17           7          15     144
Install           95       40         17           7          15     174
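The stage-to-stage scope growth also falls out of the counts (a quick check against the totals above):

```python
sizes = {"Requirements": 120, "Design": 144, "Install": 174}
base = sizes["Requirements"]
for stage, fp in sizes.items():
    print(f"{stage}: {fp} FP ({(fp / base - 1):+.0%} vs. requirements)")
# -> Design +20%, Install +45% relative to the requirements-phase count
```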

Project Estimation
[Diagram: a REQUIREMENT flows into DEFINITION, where Function Point Analysis establishes project size; combined with CAPABILITY (project size x project complexity x risk factors), this produces the ESTIMATE: schedule, effort, and costs.]

Capability Analysis
• Collect project data
  • Project metrics (e.g. size, effort, duration, cost, defects)
  • Project characteristics
  • Project attributes (e.g. skill levels, process, tools, etc.)
  • Project complexity variables
• Analyze data
  • Performance comparisons (identification of process strengths and weaknesses)
  • Industry averages and best practices
  • Performance modeling (identify high-impact areas)

DCG Data Base Characteristics
• Characteristics: Project Type, Platform, Data Base, Method, Language
• Complexity Variables: Logical Algorithms, Mathematical Algorithms, Data Relationships, Functional Size, Reuse, Code Structure, Performance, Memory, Security, Warranty
• Metrics: Size, Cost, Effort, Duration, Defects
• Attributes (profiled by Management, Definition, Design, Build, Test, and Environment): Process, Skill Levels, Quality Practices, Measures

Quantitative & Qualitative Assessments
Research:
• Measures: software size, level of effort, time to market, delivered defects, cost
• Characteristics: skill levels, automation, process, management, user involvement, environment

Analysis: measures yield performance levels; characteristics yield profiles.

Results:
• Correlate performance levels to characteristics
• Substantiate impact of characteristics
• Identify best practices

Estimating Using Historical Delivery Rates
[Diagram: the REQUIREMENT is sized via function point analysis (DEFINITION: project size and complexity); CAPABILITY supplies the historical rate of delivery (function points per effort month); together they produce the ESTIMATE: schedule, effort, and costs.]
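The underlying arithmetic is simply effort = functional size / delivery rate, as in the earlier counting-accuracy example. A minimal sketch:

```python
def estimate_effort_months(size_fp, fp_per_effort_month):
    """Effort = functional size divided by the historical delivery rate."""
    return size_fp / fp_per_effort_month

# The counting-accuracy example earlier: 150 FP at 10 FP per effort month.
print(estimate_effort_months(150, 10))  # -> 15.0 effort months
```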

Analysis of Results
• Analyze estimating accuracy
  • Plan vs. actual comparisons
  • Effectiveness of delivery rates
• Evaluate the system-level documentation
  • Change in scope (size) through the various stages
  • Clarity of requirements and design documents
• Recommend improvements
  • Improve the level of documentation for more accurate sizing
  • Establish a more effective estimating practice

Develop a Baseline of Data
[Diagram: for each product deliverable (A, B, C, D), project measures are collected - size (e.g. 136, 276, 435, 558, 759 FP), time to deliver (e.g. 10 months), level of effort (e.g. 35 effort months), and defects (e.g. 10 defects) - along with profile data such as platform and language. Measures and profiles combine into performance profiles: rate of delivery, time to market, and defect density.]

Establish A Baseline
A representative selection of projects is measured. Size is expressed in terms of functionality delivered to the user; rate of delivery (function points per person month) is the measure of productivity.
[Chart: software size (0-2200 FP) plotted against rate of delivery (0-36 FP per person month) for the measured projects, forming the organizational baseline.]

Compare To Industry Benchmarks
[Chart: the same plot of software size against rate of delivery, overlaid with industry baseline performance.]

Function Points Per Person Month
Average of recent projects across different platforms:

Client Server        15
Main Frame           13
Web                  22
e-business Web       12
Vendor Packages      19
Data Warehouse       11

Function Points Supported By One FTE
Average of support provided for corrective maintenance by one FTE:

Client Server        642
Main Frame           943
AS 400               597
Web                  748
e-business Web       464
Vendor Packages      760
Data Warehouse       546
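These support rates convert directly into staffing estimates: FTEs needed is roughly portfolio size divided by FP supported per FTE. A minimal sketch (the 10,000 FP portfolio is hypothetical):

```python
import math

# FP supported per FTE, from the slide's averages.
SUPPORT_RATE = {
    "Client Server": 642, "Main Frame": 943, "AS 400": 597, "Web": 748,
    "e-business Web": 464, "Vendor Packages": 760, "Data Warehouse": 546,
}

def maintenance_ftes(portfolio_fp, platform):
    return math.ceil(portfolio_fp / SUPPORT_RATE[platform])

# Hypothetical 10,000 FP mainframe portfolio:
print(maintenance_ftes(10_000, "Main Frame"))  # -> 11
```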

Analyze Results
Collection:
• Quantitative data (size, effort, duration, cost, quality) yields measured performance
• Qualitative data (process, methods, skills, tools, management) yields capability profiles

Analysis and results:
• Baseline performance
• Opportunities for improvement
• Best practices

These results then drive action.

Model Performance
• Develop parametric models that utilize historical data points for purposes of analyzing the impact of selected process improvements
• Provide a knowledge base for improved decision making
• Identify areas of high impact (e.g. productivity and quality)
• Create an atmosphere of measuring performance
• Opportunity for comparison to industry best practices

Quantitative Performance Evaluation
Quantitative Assessment (collect quantitative data - size, effort, duration, cost, quality - to establish measured performance):
• Perform functional sizing on all selected projects
• Collect data on project level of effort, cost, duration, and quality
• Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered

Results - Baseline Productivity:
Average Project Size                  133
Average FP/SM                         10.7
Average Time-To-Market (Months)        6.9
Average Cost/FP                       $939
Delivered Defects/FP                   0.0301
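For teams scripting these rollups, a minimal sketch (the project records are hypothetical; the metric definitions - FP per staff month, cost per FP, defects per FP - are the slide's):

```python
# Hypothetical project records: (size_fp, staff_months, cost, defects)
projects = [
    (120, 11, 110_000, 4),
    (175, 17, 160_000, 6),
    (104, 10,  98_000, 3),
]

fp_per_sm      = [size / sm   for size, sm, _, _   in projects]
cost_per_fp    = [cost / size for size, _, cost, _ in projects]
defects_per_fp = [d / size    for size, _, _, d    in projects]

print(f"Average FP/SM: {sum(fp_per_sm) / len(projects):.1f}")
print(f"Average Cost/FP: ${sum(cost_per_fp) / len(projects):.0f}")
print(f"Delivered Defects/FP: {sum(defects_per_fp) / len(projects):.4f}")
```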

Qualitative Performance Evaluation
Qualitative Assessment (collect qualitative data - process, methods, skills, tools, management - to build capability profiles):
• Conduct interviews with members of each project team
• Collect project profile information
• Develop performance profiles to display strengths and weaknesses among the selected projects

Results: [Table: profile scores for each of fifteen projects (Accounts Payable, Priority One, HR Enhancements, Client Accounts, ABC Release, Screen Redesign, Customer Web, Whole Life, Regional-East, Regional-West, Cashflow, Credit Automation, NISE, Help Desk Automation, Formula One Upgrade) across six categories: Management, Definition, Design, Build, Test, and Environment.]

Modeled Improvements
Process improvements modeled:
• Code Reviews and Inspections
• Requirements Management
• Defect Tracking
• Configuration Management

[Tables: baseline project profile scores alongside the modeled scores for the same fifteen projects, across Management, Definition, Design, Build, Test, and Environment.]

                                   Baseline    Modeled
Average Project Size                  133         133
Average FP/SM                         10.7        24.8
Average Time-To-Market (Months)        6.9         3.5
Average Cost/FP                       $939        $467
Delivered Defects/FP                   0.0301      0.0075

Performance improvements:
• Productivity ~ +131%
• Time to Market ~ -49%
• Defect Ratio ~ -75%
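The performance improvements follow from the two columns (a quick check):

```python
baseline = {"FP/SM": 10.7, "Time-To-Market": 6.9, "Defects/FP": 0.0301}
modeled  = {"FP/SM": 24.8, "Time-To-Market": 3.5, "Defects/FP": 0.0075}

for metric in baseline:
    change = (modeled[metric] / baseline[metric] - 1) * 100
    print(f"{metric}: {change:+.1f}%")
# -> FP/SM: +131.8%, Time-To-Market: -49.3%, Defects/FP: -75.1%
```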

Conclusions
• Project management can be successful
• Requirements can be managed
• Projects can be sized
• Performance can be successfully estimated
• Process improvement can be modeled
• Measurement can be accomplished

Contact Information
David Consulting Group web site: www.davidconsultinggroup.com
David Garmus: dg@davidconsultinggroup.com

Contact Information
International Function Point Users Group (IFPUG): www.ifpug.org
Practical Software and Systems Measurement (PSM): www.psmsc.com
Quality Assurance Institute (QAI): www.qaiusa.com
Software Engineering Institute (SEI): www.sei.cmu.edu
Software Quality Engineering (SQE): www.sqe.com
