Function Point Estimation

Published by Pavan Kumar R on Dec 03, 2013
A method to estimate function points comprehensively.
Copyright: Attribution Non-commercial
PROJECT SIZING AND ESTIMATING EFFECTIVELY USING FUNCTIONAL MEASUREMENT

Southern California Software Process Improvement Network

www.davidconsultinggroup.com


Topics
• The User Perspective
• Program Start-up
• Characteristics of an Effective Sizing Metric
• Use of Function Points
• Project Estimation
• Quantitative & Qualitative Assessments
• Establishing and Using Baseline Data
• Modeling Improved Performance

Copyright © 2005. The David Consulting Group, Inc.


The User Perspective
“How do we indicate the value of sizing to our users/customers? When we say that a project represents 50 Function Points from the ‘user's perspective’, what does this really mean to the user? Or to the developer? How do we engage both parties in understanding size?”

• Functional ‘value’ delivered to the user/customer
• Comparative analysis (size, cost per unit of work, defects)
• Functional description and accountability (user)
• Management of delivery expectations
• Credibility in project estimation


Program Start-up
• Planning
  • How will the information be used (objectives)?
  • Who is counting?
  • What is being counted?
  • History versus industry?
• Path of least resistance
  • Easy to count (transaction based)
  • Agreeable user
• Culture
  • Internal versus external
  • Pilot/rollout versus organization-wide
• Scope (what doesn’t get counted)
• Continually evaluating the program

Costs/Required Resources, Start-up and Ongoing
• Less than 1% of total budget (labor)
• Internal
  • Training (several days)
  • Mentoring
  • Frequency/skill level
  • User group knowledge sharing
• External
  • Economies of scale
  • Awareness & orientation
  • Internal resources required
• Positioned for Success
  • Other measures to be utilized
  • Process improvement activities monitored and measured

When to Size

DEFINE (1) → DESIGN (2) → BUILD → TEST → IMPLEMENT (3)

1) Initial sizing during or after Requirements Phase
2) Subsequent sizing after System Design or when Change occurs
3) Final sizing after Install

Characteristics of an Effective Sizing Metric
• Meaningful to developer and user/customer
• Defined (industry recognized)
• Consistent (methodology)
• Easy to learn and apply
• Accurate, statistically based
• Available when needed (early)
• Addresses project level information needs

Function Points - An Effective Sizing Metric
Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.
Benefits:
• Quantitative (Objective) Measure
• Industry Data as Basis for Comparison
• Expectations (Perceived Customer Value) Managed
• Software Process Improvement Requirements Satisfied

Benefits of Using Function Points
• A vehicle to estimate cost and resources required for software development, enhancements and/or maintenance
• A tool to quantify performance levels and to monitor progress made from software process improvement initiatives
• A tool to determine the benefit of an application to an organization by counting functions that specifically match requirements
• A tool to size or evaluate purchased application packages

Approach to Function Points
• A function point count is performed to produce a functional size measure
• The size can be used to generate project estimates
• Estimates should be based upon delivery rates
• Analysis - plan versus actual comparisons
  • How good is the information received during requirements?
  • How good (accurate) is project estimating?

Function Point Counting Process
• Review the available documentation
• Meet with SMEs to gain a thorough understanding of the functionality
• Apply the function point methodology, and compute a functional size
• Generate an estimate based on available delivery rates

The Function Point Methodology
Five key components are identified based on the logical user view:
• Inputs
• Outputs
• Inquiries
• Internal Logical Files
• External Interface Files

[Diagram: the Application receives Inputs and Inquiries and produces Outputs; it maintains Internal Logical Files and references External Interface Files.]

Logical View of User Requirements
[Diagram: users issue Inquiries (list of molds, work centers, vendor information), enter Inputs (order parts, change bill), and receive Outputs (parts listing); Internal Logical Files hold the bill of materials, plant information, and plant molds; vendor supply parts is accessed through an Interface.]

The Function Point Methodology
Each identified component is assigned a Function Point size value based upon the make-up and complexity of the data.

Components                        Low       Average   High      Total
Internal Logical File (ILF)       __ x 7    __ x 10   __ x 15   ___
External Interface File (EIF)     __ x 5    __ x 7    __ x 10   ___
External Input (EI)               1  x 3    __ x 4    __ x 6      3
External Output (EO)              __ x 4    __ x 5    __ x 7    ___
External Inquiry (EQ)             __ x 3    __ x 4    __ x 6    ___
                                            Total Unadjusted FPs:  3

Complexity (Low, Average, or High) is determined by data relationships (Record Element Types or File Types Referenced) and by the number of unique data elements (data fields).
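The weight table above lends itself to a small lookup. Below is an illustrative sketch (the function and dictionary names are mine, not from the deck) using the component weights shown on the slide:

```python
# Weights from the slide's complexity table (the standard IFPUG values).
WEIGHTS = {
    "ILF": {"low": 7, "avg": 10, "high": 15},
    "EIF": {"low": 5, "avg": 7, "high": 10},
    "EI":  {"low": 3, "avg": 4, "high": 6},
    "EO":  {"low": 4, "avg": 5, "high": 7},
    "EQ":  {"low": 3, "avg": 4, "high": 6},
}

def unadjusted_fp(counts):
    """counts maps (component, complexity) -> number of components identified."""
    return sum(WEIGHTS[comp][cplx] * n for (comp, cplx), n in counts.items())

# The filled-in row above: one low-complexity External Input.
print(unadjusted_fp({("EI", "low"): 1}))  # 3
```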

The Function Point Methodology
14 optional General System Characteristics are evaluated and used to compute a Value Adjustment Factor (VAF):
• Data Communication
• Distributed Data Processing
• Performance Objectives
• Heavily Used Configuration
• Transaction Rate
• On-Line Data Entry
• End-User Efficiency
• On-Line Update
• Complex Processing
• Reusability
• Conversion & Install Ease
• Operational Ease
• Multiple-Site Use
• Facilitate Change

The final calculation is based upon the Unadjusted FP count x VAF.

Function Point Calculation
Enhancement FPs as they relate to the existing master count:
• External Inputs (EI) (2) – Add/Change Account, change, high complexity: total unadjusted FPs = 2 x 6 = 12
• External Input (EI) – Issue Material, change, high complexity: total unadjusted FPs = 1 x 6 = 6
• External Input (EI) – Add Tax, change, low complexity: total unadjusted FPs = 1 x 3 = 3
Total Unadjusted FPs: 21
Value Adjustment Factor: 1.01
Total Adjusted FPs: 21
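The enhancement example above can be reproduced in a few lines. This sketch applies the slide's EI weights (high = 6, low = 3) and its VAF of 1.01; rounding the adjusted total to a whole function point is my assumption of the usual convention:

```python
# Enhancement EIs from the slide: (count, weight per EI).
ei_counts = [
    (2, 6),  # Add/Change Account: 2 EIs, high complexity
    (1, 6),  # Issue Material: 1 EI, high complexity
    (1, 3),  # Add Tax: 1 EI, low complexity
]
unadjusted = sum(n * w for n, w in ei_counts)   # 12 + 6 + 3 = 21
vaf = 1.01                                      # from the General System Characteristics
adjusted = round(unadjusted * vaf)              # 21.21 rounds to 21
print(unadjusted, adjusted)  # 21 21
```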

Example – Counting Accuracy
Estimate:
• Resulting size = 150 function points
• Matching profile, rate of delivery = 10 FP/EM
• Estimated effort = 15 effort months

         Actuals             Results
Size     175 FPs             +17%
Effort   19 effort months    +27%

Example – Scope Accuracy
Counting Activity:
• Resulting Requirements size = 120 FP
• Resulting Design size = 144 FP
• Resulting Install size = 174 FP

Analysis:       Inputs   Outputs   Inquiries   Interfaces   Files   Total
Requirements    75       10        15          7            13      120
Design          80       25        17          7            15      144
Install         95       40        17          7            15      174

Project Estimation
[Diagram: a REQUIREMENT is sized through FUNCTION POINT ANALYSIS (Definition); PROJECT SIZE x PROJECT COMPLEXITY x RISK FACTORS (Capability) yields the ESTIMATE: Schedule, Effort, Costs.]

Capability Analysis
• Collect project data
  • Project metrics (e.g. size, effort, duration, cost, defects)
  • Project characteristics
  • Project attributes (e.g. skill levels, process, tools, etc.)
  • Project complexity variables
• Analyze data
  • Performance comparisons (identification of process strengths and weaknesses)
  • Industry averages and best practices
  • Performance modeling (identify high impact areas)

DCG Data Base Characteristics
• Project Type: Platform, Data Base, Method, Language
• Complexity Variables: Logical Algorithms, Mathematical Algorithms, Data Relationships, Functional Size, Reuse, Code Structure, Performance, Memory, Security, Warranty
• Metrics: Size, Cost, Effort, Duration, Defects
• Attributes: Management, Definition, Design, Build, Test, Environment, Process, Skill Levels, Quality Practices, Measures

Quantitative & Qualitative Assessments
Research:
• MEASURES: Software Size, Level of Effort, Time to Market, Delivered Defects, Cost
• CHARACTERISTICS: Skill Levels, Automation, Process, Management, User Involvement, Environment
Analysis: PERFORMANCE LEVELS and PROFILES
Results:
• Correlate Performance Levels to Characteristics
• Substantiate Impact of Characteristics
• Identify Best Practices

Estimating Using Historical Delivery Rates
[Diagram: a REQUIREMENT is sized (Definition) to give PROJECT SIZE and COMPLEXITY as a Function Point size; combined with the historical RATE OF DELIVERY (Function Points per Effort Month, from Capability data) this yields the ESTIMATE: Schedule, Effort, Costs.]
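The estimating arithmetic sketched in the diagram above is simply size divided by the historical delivery rate. A minimal sketch (the function name and the cost-rate parameter are mine; the deck only shows the size/rate relationship):

```python
def estimate(size_fp, fp_per_effort_month, cost_per_effort_month=None):
    """Effort months = functional size / historical rate of delivery."""
    effort_months = size_fp / fp_per_effort_month
    cost = None if cost_per_effort_month is None else effort_months * cost_per_effort_month
    return effort_months, cost

# The earlier counting-accuracy example: 150 FPs at 10 FP/EM.
effort, _ = estimate(150, 10)
print(effort)  # 15.0
```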

Analysis of Results
• Analyze estimating accuracy
  • Plan vs. actual comparisons
  • Effectiveness of delivery rates
• Evaluate the system level documentation
  • Change in scope (size) through the various stages
  • Clarity of requirements and design documents
• Recommend improvements
  • Improve the level of documentation for more accurate sizing
  • Establish a more effective estimating practice

Develop a Baseline of Data
[Chart: product deliverables A through E, with sizes of 136, 276, 435, 558, and 759 function points, characterized by size, platform, and language.]

PROJECT MEASURES → PERFORMANCE PROFILES
• Duration (number of days), e.g. 10 months → Time to Market
• Level of Effort, e.g. 35 effort months → Rate of Delivery
• Defects, e.g. 10 defects → Defect Density

Establish A Baseline
Size is expressed in terms of functionality delivered to the user; rate of delivery is a measure of productivity. A representative selection of projects is measured to form the organizational baseline.
[Chart: Software Size (0–2200 FPs) plotted against Rate of Delivery (0–36 Function Points per Person Month) for the baseline projects.]

Compare To Industry Benchmarks
[Chart: the organizational baseline overlaid with industry baseline performance — Software Size (0–2200 FPs) versus Rate of Delivery (0–36 Function Points per Person Month).]

Function Points Per Person Month
Average of Recent Projects Across Different Platforms

Client Server     15
Main Frame        13
Web               22
e-business Web    12
Vendor Packages   19
Data Warehouse    11

Function Points Supported By One FTE
Average of Support Provided for Corrective Maintenance by One FTE

Client Server     642
Main Frame        943
AS 400            597
Web               748
e-business Web    464
Vendor Packages   760
Data Warehouse    546
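One way the support ratios above might be applied (my assumption; the deck presents the ratios but not this calculation) is to divide an application portfolio's functional size by the FPs one FTE can support, giving a rough corrective-maintenance staffing estimate. The portfolio size here is invented for illustration:

```python
import math

# Support ratios from the table above (FPs maintained per FTE).
FP_PER_FTE = {"Client Server": 642, "Main Frame": 943, "Web": 748}

def support_ftes(portfolio_fp, platform):
    """Round up: a fractional maintainer still requires a whole person."""
    return math.ceil(portfolio_fp / FP_PER_FTE[platform])

# A hypothetical 3,000-FP mainframe portfolio: 3000 / 943 = 3.18 -> 4 FTEs.
print(support_ftes(3000, "Main Frame"))  # 4
```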

Analyze Results
• COLLECT QUANTITATIVE DATA (Size, Effort, Duration, Cost, Quality) → Measured Performance
• COLLECT QUALITATIVE DATA (Process, Methods, Skills, Tools, Management) → Capability Profiles
• Analysis → Results → Action: Baseline Performance, Opportunities For Improvement, Best Practices

Model Performance
• Develop parametric models that utilize historical data points for purposes of analyzing the impact of selected process improvements
• Provide a knowledge base for improved decision making
• Identify areas of high impact (e.g. productivity and quality)
• Create an atmosphere of measuring performance
• Opportunity for comparison to industry best practices

Quantitative Performance Evaluation
Quantitative Assessment (COLLECT QUANTITATIVE DATA: Size, Effort, Duration, Cost, Quality → Measured Performance):
• Perform functional sizing on all selected projects.
• Collect data on project level of effort, cost, duration and quality.
• Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered.

Results — Baseline Productivity:
Average Project Size              133
Average FP/SM                     10.7
Average Time-To-Market (Months)   6.9
Average Cost/FP                   $939
Delivered Defects/FP              0.0301

Qualitative Performance Evaluation
Qualitative Assessment (COLLECT QUALITATIVE DATA: Process, Methods, Skills, Tools, Management → Capability Profiles):
• Conduct interviews with members of each project team.
• Collect Project Profile information.
• Develop Performance Profiles to display strengths and weaknesses among the selected projects.

[Table: Performance Profile scores (Profile Score, Management, Definition, Design, Build, Test, Environment) for each of the fifteen selected projects: Accounts Payable, Priority One, HR Enhancements, Client Accounts, ABC Release, Screen Redesign, Customer Web, Whole Life, Regional-East, Regional-West, Cashflow, Credit Automation, NISE, Help Desk Automation, Formula One Upgrade.]

Modeled Improvements
Process Improvements:
• Code Reviews and Inspections
• Requirements Management
• Defect Tracking / Configuration Management

Performance Improvements:
• Productivity ~ +131%
• Time to Market ~ -49%
• Defect Ratio ~ -75%

                                  Baseline   Modeled
Average Project Size              133        133
Average FP/SM                     10.7       24.8
Average Time-To-Market (Months)   6.9        3.5
Average Cost/FP                   $939       $467
Delivered Defects/FP              0.0301     0.0075

[Table: modeled Performance Profile scores (Profile Score, Management, Definition, Design, Build, Test, Environment) for the same fifteen projects, reflecting the improvements above.]
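The quoted improvement percentages follow directly from the baseline and modeled figures (productivity 10.7 → 24.8 FP/SM, time to market 6.9 → 3.5 months, defect ratio 0.0301 → 0.0075 defects/FP). A quick check of the arithmetic:

```python
def pct_change(before, after):
    """Percentage change from a baseline value, rounded to whole percent."""
    return round((after - before) / before * 100)

print(pct_change(10.7, 24.8))      # productivity: +132 (the slide quotes ~ +131%)
print(pct_change(6.9, 3.5))        # time to market: -49
print(pct_change(0.0301, 0.0075))  # defect ratio: -75
```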

Conclusions
• Project Management can be successful
• Requirements can be managed
• Projects can be sized
• Performance can be successfully estimated
• Process improvement can be modeled
• Measurement can be accomplished

36 .com David Garmus dg@davidconsultinggroup.Contact Information David Consulting Group web site: www. Inc.davidconsultinggroup.com Copyright © 2005. The David Consulting Group.

Contact Information
International Function Point Users Group (IFPUG): www.ifpug.org
Practical Software and Systems Measurement (PSM): www.psmsc.com
Quality Assurance Institute (QAI): www.qaiusa.com
Software Engineering Institute (SEI): www.sei.cmu.edu
Software Quality Engineering (SQE): www.sqe.com
