
PROJECT SIZING AND ESTIMATING EFFECTIVELY USING FUNCTIONAL MEASUREMENT

Southern California Software Process Improvement Network

www.davidconsultinggroup.com

Topics
- The User Perspective
- Program Start-up
- Characteristics of an Effective Sizing Metric
- Use of Function Points
- Project Estimation
- Quantitative & Qualitative Assessments
- Establishing and Using Baseline Data
- Modeling Improved Performance


The User Perspective


How do we convey the value of sizing to our users/customers? When we say that a project represents 50 Function Points from the user's perspective, what does this really mean to the user? Or to the developer? How do we engage both parties in understanding size?

- Functional value delivered to the user/customer
- Comparative analysis (size, cost per unit of work, defects)
- Functional description and accountability (user)
- Management of delivery expectations
- Credibility in project estimation


Program Start-up
Planning
- How will the information be used (objectives)?
- Who is counting?
- What is being counted?
- History versus industry data?

Path of least resistance
- Easy to count (transaction based)
- An agreeable user

Culture
- Internal versus external
- Pilot/rollout versus organization-wide

Scope (what doesn't get counted)

Continually evaluating the program



Costs/Required Resources, Start-up and Ongoing


Less than 1% of total budget (labor)

Internal
- Training (several days)
- Mentoring frequency/skill level
- User group knowledge sharing

External
- Economies of scale
- Awareness & orientation
- Internal resources still required

Positioned for Success
- Other measures to be utilized
- Process improvement activities monitored and measured

When to Size
[Timeline diagram: sizing occurs at three points across the lifecycle — (1) DEFINE, (2) DESIGN / BUILD / TEST, (3) IMPLEMENT]

1) Initial sizing during or after the Requirements phase
2) Subsequent sizing after System Design, or whenever change occurs
3) Final sizing after install


Characteristics of an Effective Sizing Metric


- Meaningful to developer and user/customer
- Defined (industry recognized)
- Consistent (methodology)
- Easy to learn and apply
- Accurate, statistically based
- Available when needed (early)
- Addresses project-level information needs


Function Points - An Effective Sizing Metric


Function Point Analysis is a standardized method for measuring the functionality delivered to an end user.
Benefits:
- Quantitative (objective) measure
- Industry data as a basis for comparison
- Expectations (perceived customer value)
- Managed software process improvement
- Requirements satisfied


Benefits of Using Function Points

- A vehicle to estimate the cost and resources required for software development, enhancement, and/or maintenance
- A tool to quantify performance levels and to monitor progress made from software process improvement initiatives
- A tool to determine the benefit of an application to an organization by counting functions that specifically match requirements
- A tool to size or evaluate purchased application packages

Approach to Function Points

A function point count is performed to produce a functional size measure. That size can then be used to generate project estimates; estimates should be based upon historical delivery rates.

Analysis (plan versus actual comparisons):
- How good is the information received during requirements?
- How accurate is project estimating?


Function Point Counting Process

1. Review the available documentation.
2. Meet with subject matter experts (SMEs) to gain a thorough understanding of the functionality.
3. Apply the function point methodology and compute a functional size.
4. Generate an estimate based on available delivery rates.


The Function Point Methodology


Five key components are identified based on the logical user view:
- External Inputs
- External Outputs
- External Inquiries
- Internal Logical Files
- External Interface Files

[Diagram: inputs, outputs, and inquiries cross the application boundary; internal logical files are maintained within the application; external interface files are referenced from other applications]


Logical View of User Requirements


[Diagram: logical view of a sample parts/plant application, grouped by component type]
- Inquiries: user list of molds, user work centers, vendor information
- Interface: vendor supply
- Internal Logical Files: parts, plant, molds, bill of materials, plant information center
- Outputs: parts listing, user order parts
- Inputs: user change bill

The Function Point Methodology


Each identified component is assigned a Function Point size value based upon the make-up and complexity of the data.

Complexity weights per component:

Component                     | Low | Average | High
Internal Logical File (ILF)   | x 7 | x 10    | x 15
External Interface File (EIF) | x 5 | x 7     | x 10
External Input (EI)           | x 3 | x 4     | x 6
External Output (EO)          | x 4 | x 5     | x 7
External Inquiry (EQ)         | x 3 | x 4     | x 6

The component counts in each cell are multiplied by their weights and summed to give the Total Unadjusted FPs.

Complexity is determined by data relationships (Record Element Types or File Types Referenced, rows, increasing) against data elements (the number of unique data fields, columns, increasing):

Low     | Low     | Average
Low     | Average | High
Average | High    | High
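Applying the matrix is mechanical once the components are classified. Below is a minimal sketch in Python (an illustration only, not DCG tooling; the `counts` shape and function name are hypothetical):

```python
# IFPUG complexity weights, as reproduced in the matrix above.
WEIGHTS = {
    "ILF": {"low": 7, "avg": 10, "high": 15},
    "EIF": {"low": 5, "avg": 7, "high": 10},
    "EI":  {"low": 3, "avg": 4, "high": 6},
    "EO":  {"low": 4, "avg": 5, "high": 7},
    "EQ":  {"low": 3, "avg": 4, "high": 6},
}

def unadjusted_fp(counts):
    """counts maps (component, complexity) -> number identified,
    e.g. {("EI", "high"): 3, ("EI", "low"): 1}."""
    return sum(WEIGHTS[comp][cx] * n for (comp, cx), n in counts.items())

# The enhancement counted two slides ahead: 3 high-complexity EIs
# plus 1 low-complexity EI.
print(unadjusted_fp({("EI", "high"): 3, ("EI", "low"): 1}))  # 21
```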


The Function Point Methodology


14 optional General System Characteristics are evaluated and used to compute a Value Adjustment Factor (VAF):

- Data Communications
- Distributed Data Processing
- Performance Objectives
- Heavily Used Configuration
- Transaction Rate
- On-Line Data Entry
- End-User Efficiency
- On-Line Update
- Complex Processing
- Reusability
- Conversion & Installation Ease
- Operational Ease
- Multiple-Site Use
- Facilitate Change

The final calculation is the Unadjusted FP count x VAF.
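The slide leaves the formula implicit; in the standard IFPUG method each characteristic is rated for its degree of influence from 0 to 5, and VAF = 0.65 + 0.01 x (total degree of influence), so VAF ranges from 0.65 to 1.35. A sketch:

```python
def value_adjustment_factor(gsc_ratings):
    """gsc_ratings: the 14 General System Characteristic ratings, each 0-5."""
    assert len(gsc_ratings) == 14 and all(0 <= r <= 5 for r in gsc_ratings)
    return 0.65 + 0.01 * sum(gsc_ratings)  # ranges 0.65 .. 1.35

def adjusted_fp(unadjusted, gsc_ratings):
    # Rounded to whole function points, as in the worked example that follows.
    return round(unadjusted * value_adjustment_factor(gsc_ratings))
```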



Function Point Calculation


Enhancement FPs as they relate to the existing master count:

- External Inputs (EI): (2) Add/Change Account; change; high complexity; unadjusted FPs = 2 x 6 = 12
- External Input (EI): Issue Material; change; high complexity; unadjusted FPs = 1 x 6 = 6
- External Input (EI): Add Tax; change; low complexity; unadjusted FPs = 1 x 3 = 3

Total Unadjusted FPs: 21
Value Adjustment Factor: 1.01
Total Adjusted FPs: 21 (21 x 1.01 = 21.21, which rounds to 21)


Example: Counting Accuracy


Estimate:
- Resulting size = 150 function points
- Matching profile, rate of delivery = 10 FP per effort month
- Estimated effort = 15 effort months

Actuals:
- Size: 175 FP (+17%)
- Effort: 19 effort months (+27%)
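The reported variances can be recomputed directly from those figures:

```python
def variance(actual, planned):
    """Relative plan-vs-actual variance."""
    return (actual - planned) / planned

size_var = variance(175, 150)   # sizing accuracy
effort_var = variance(19, 15)   # estimating accuracy
print(f"size {size_var:+.0%}, effort {effort_var:+.0%}")  # size +17%, effort +27%
```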


Example: Scope Accuracy


Counting activity by phase, broken down by component:

Counting Activity | Inputs | Outputs | Inquiries | Interfaces | Files | Total FP
Requirements size | 75     | 10      | 15        | 7          | 13    | 120
Design size       | 80     | 25      | 17        | 7          | 15    | 144
Install size      | 95     | 40      | 17        | 7          | 15    | 174
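Tracking each checkpoint against the requirements-phase count makes the scope growth explicit; a minimal sketch using the totals above:

```python
sizes = {"Requirements": 120, "Design": 144, "Install": 174}  # FP per checkpoint
baseline = sizes["Requirements"]
for phase, fp in sizes.items():
    print(f"{phase}: {fp} FP ({(fp - baseline) / baseline:+.0%} vs. requirements)")
# Design +20%, Install +45%: scope grew 45% between requirements and install.
```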



Project Estimation

[Estimation flow diagram: during DEFINITION, the requirement is analyzed for project size (via Function Point Analysis), project complexity, and risk factors; these feed a CAPABILITY assessment, which produces the ESTIMATE: schedule, effort, and costs]


Capability Analysis
Collect project data:
- Project metrics (e.g., effort, size, cost, duration, defects)
- Project characteristics
- Project attributes (e.g., skill levels, tools, process)
- Project complexity variables

Analyze data:
- Performance comparisons (identification of process strengths and weaknesses)
- Industry averages and best practices
- Performance modeling (identify high-impact areas)


DCG Database

Characteristics:
- Project Type
- Platform
- Database
- Method
- Language

Complexity Variables:
- Logical Algorithms
- Mathematical Algorithms
- Data Relationships
- Functional Size
- Reuse
- Code Structure
- Performance
- Memory
- Security
- Warranty

Metrics:
- Size
- Cost
- Effort
- Duration
- Defects

Attributes (profiled across Management, Definition, Design, Build, Test, and Environment):
- Process
- Skill Levels
- Quality Practices
- Measures


Quantitative & Qualitative Assessments


Research:
- Measures: software size, level of effort, time to market, delivered defects, cost
- Characteristics: skill levels, automation, process, management, user involvement, environment

Analysis:
- Performance levels (from the measures)
- Profiles (from the characteristics)

Results:
- Correlate performance levels to characteristics
- Substantiate the impact of characteristics
- Identify best practices


Estimating Using Historical Delivery Rates

[Estimation flow diagram: during DEFINITION, the requirement yields project size and complexity (the Function Point size); combined with a historical RATE OF DELIVERY (function points per effort month), these produce the ESTIMATE: schedule, effort, and costs]

A minimal sketch of the calculation follows.
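The sketch assumes a naive even-loading staffing model and an illustrative labor rate (both hypothetical; a real estimate would draw its delivery rate from the capability baseline):

```python
def estimate(size_fp, fp_per_effort_month, staff, cost_per_effort_month):
    effort_months = size_fp / fp_per_effort_month
    schedule_months = effort_months / staff   # assumes even staff loading
    cost = effort_months * cost_per_effort_month
    return effort_months, schedule_months, cost

effort, schedule, cost = estimate(150, 10, staff=3, cost_per_effort_month=12_000)
print(f"{effort:.0f} effort months, {schedule:.1f} calendar months, ${cost:,.0f}")
# -> 15 effort months, 5.0 calendar months, $180,000
```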


Analysis of Results
Analyze estimating accuracy:
- Plan vs. actual comparisons
- Effectiveness of delivery rates

Evaluate the system-level documentation:
- Change in scope (size) through the various stages
- Clarity of requirements and design documents

Recommend improvements:
- Improve the level of documentation for more accurate sizing
- Establish a more effective estimating practice


Develop a Baseline of Data


[Diagram: a baseline is built by pairing each product deliverable's project measures (time to deliver as duration and number of days, level of effort, and defects; e.g., 10 months, 35 effort months, 10 defects) with its profile (size, platform, language). Sample sizes shown: 136, 276, 435, 558, and 759 FP. The resulting performance measures are rate of delivery, time to market, and defect density]


Establish A Baseline
[Scatter chart, "Organizational Baseline": software size in function points (0 to 2200) plotted against rate of delivery in function points per person month (0 to 36)]

- Size is expressed in terms of functionality delivered to the user
- A representative selection of projects is measured
- Rate of delivery is a measure of productivity


Compare To Industry Benchmarks


[Scatter chart: industry baseline performance on the same axes, software size in function points (0 to 2200) against rate of delivery in function points per person month (0 to 36)]


Function Points Per Person Month


Average of recent projects across different platforms:

Platform        | FP per Person Month
Client Server   | 15
Main Frame      | 13
Web e-business  | 22
Web             | 12
Vendor Packages | 19
Data Warehouse  | 11


Function Points Supported By One FTE


Average of support provided for corrective maintenance by one FTE:

Platform        | FP per FTE
Client Server   | 642
Main Frame      | 943
AS 400          | 597
Web e-business  | 748
Web             | 464
Vendor Packages | 760
Data Warehouse  | 546


Analyze Results
Collection:
- Quantitative data (size, effort, duration, cost, quality) -> measured performance
- Qualitative data (process, methods, skills, tools, management) -> capability profiles

Analysis / Results / Action:
- Baseline performance
- Opportunities for improvement
- Best practices


Model Performance
- Develop parametric models that utilize historical data points to analyze the impact of selected process improvements
- Provide a knowledge base for improved decision making
- Identify areas of high impact (e.g., productivity and quality)
- Create an atmosphere of measuring performance
- Provide an opportunity for comparison to industry best practices

A minimal illustration of the modeling idea follows.
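As one illustration, a parametric model can be as simple as a least-squares fit of effort against functional size over historical projects. The sketch below uses hypothetical history points; a production model would also adjust for complexity variables and project attributes:

```python
def fit_line(points):
    """Least-squares fit of y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# (FP size, effort months) pairs; hypothetical history, not DCG data.
history = [(120, 11), (276, 26), (435, 40), (558, 53), (759, 71)]
a, b = fit_line(history)
print(f"Predicted effort for a 300 FP project: {a * 300 + b:.1f} effort months")
```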



Quantitative Performance Evaluation


Quantitative assessment (collect quantitative data: size, effort, duration, cost, quality -> measured performance):
- Perform functional sizing on all selected projects.
- Collect data on project level of effort, cost, duration, and quality.
- Calculate productivity rates for each project, including functional size delivered per staff month, cost per functional size, time to market, and defects delivered.

Results (baseline productivity):

Measure                         | Baseline
Average Project Size            | 133
Average FP/SM                   | 10.7
Average Time-To-Market (Months) | 6.9
Average Cost/FP                 | $939
Delivered Defects/FP            | 0.0301
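A sketch of how such baseline figures roll up from per-project records (the records and the aggregation conventions are assumptions; e.g., cost per FP could equally be computed as total cost over total FP):

```python
projects = [  # hypothetical records: FP size, effort, duration, cost, delivered defects
    {"fp": 150, "effort_sm": 15, "months": 7, "cost": 140_000, "defects": 5},
    {"fp": 116, "effort_sm": 10, "months": 6, "cost": 110_000, "defects": 3},
]

n = len(projects)
avg_size = sum(p["fp"] for p in projects) / n
avg_fp_per_sm = sum(p["fp"] / p["effort_sm"] for p in projects) / n
avg_ttm = sum(p["months"] for p in projects) / n
avg_cost_per_fp = sum(p["cost"] / p["fp"] for p in projects) / n
defects_per_fp = sum(p["defects"] for p in projects) / sum(p["fp"] for p in projects)
print(avg_size, round(avg_fp_per_sm, 1), avg_ttm, round(avg_cost_per_fp), round(defects_per_fp, 4))
```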


Qualitative Performance Evaluation


Qualitative assessment (collect qualitative data: process, methods, skills, tools, management -> capability profiles):
- Conduct interviews with members of each project team.
- Collect project profile information.
- Develop performance profiles to display strengths and weaknesses among the selected projects.

Results:

Project Name         | Profile Score | Management | Definition | Design | Build | Test  | Environment
Accounts Payable     | 55.3          | 47.73      | 82.05      | 50.00  | 46.15 | 43.75 | 50.00
Priority One         | 27.6          | 50.00      | 48.72      | 11.36  | 38.46 | 0.00  | 42.31
HR Enhancements      | 32.3          | 29.55      | 48.72      | 0.00   | 42.31 | 37.50 | 42.31
Client Accounts      | 29.5          | 31.82      | 43.59      | 0.00   | 30.77 | 37.50 | 42.31
ABC Release          | 44.1          | 31.82      | 53.85      | 34.09  | 38.46 | 53.13 | 42.31
Screen Redesign      | 17.0          | 22.73      | 43.59      | 0.00   | 15.38 | 0.00  | 30.77
Customer Web         | 40.2          | 45.45      | 23.08      | 38.64  | 53.85 | 50.00 | 34.62
Whole Life           | 29.2          | 56.82      | 28.21      | 22.73  | 26.92 | 18.75 | 53.85
Regional - East      | 22.7          | 36.36      | 43.59      | 0.00   | 30.77 | 9.38  | 30.77
Regional - West      | 17.6          | 43.18      | 23.08      | 0.00   | 26.92 | 9.38  | 26.92
Cashflow             | 40.6          | 56.82      | 71.79      | 0.00   | 38.46 | 43.75 | 38.46
Credit Automation    | 23.5          | 29.55      | 48.72      | 0.00   | 38.46 | 6.25  | 26.92
NISE                 | 49.0          | 38.64      | 56.41      | 52.27  | 30.77 | 53.13 | 53.85
Help Desk Automation | 49.3          | 54.55      | 74.36      | 20.45  | 53.85 | 50.00 | 38.46
Formula One Upgrade  | 22.8          | 31.82      | 38.46      | 0.00   | 11.54 | 25.00 | 46.15


Modeled Improvements
Baseline (project profiles repeated from the preceding slide; baseline productivity repeated below):

Measure                         | Baseline
Average Project Size            | 133
Average FP/SM                   | 10.7
Average Time-To-Market (Months) | 6.9
Average Cost/FP                 | $939
Delivered Defects/FP            | 0.0301

Process improvements modeled:
- Code reviews and inspections
- Requirements management
- Defect tracking
- Configuration management

Performance improvements:
- Productivity ~ +131%
- Time to market ~ -49%
- Defect ratio ~ -75%

Modeled project profiles after improvement:

Project Name         | Profile Score | Management | Definition | Design | Build | Test  | Environment
Accounts Payable     | 75.3          | 61.73      | 82.05      | 60.00  | 60.15 | 53.75 | 50.00
Priority One         | 57.6          | 57.00      | 55.72      | 18.36  | 45.46 | 22.00 | 49.31
HR Enhancements      | 52.3          | 32.55      | 51.72      | 23.00  | 42.31 | 57.50 | 49.31
Client Accounts      | 69.5          | 53.82      | 65.59      | 12.00  | 50.77 | 67.50 | 49.31
ABC Release          | 74.1          | 55.82      | 69.85      | 49.09  | 52.46 | 63.13 | 49.31
Screen Redesign      | 67.0          | 43.73      | 63.59      | 21.00  | 36.38 | 20.00 | 51.77
Customer Web         | 59.2          | 49.45      | 27.08      | 58.64  | 53.85 | 54.00 | 49.62
Whole Life           | 50.2          | 49.82      | 32.21      | 27.73  | 31.92 | 24.75 | 53.85
Regional - East      | 57.7          | 59.36      | 49.59      | 0.00   | 30.77 | 9.38  | 50.77
Regional - West      | 52.6          | 55.18      | 30.08      | 0.00   | 33.92 | 19.38 | 26.92
Cashflow             | 67.6          | 66.82      | 71.79      | 0.00   | 49.46 | 53.75 | 49.46
Credit Automation    | 60.5          | 41.55      | 78.72      | 0.00   | 50.46 | 26.25 | 46.92
NISE                 | 79.0          | 68.64      | 76.41      | 62.27  | 65.77 | 53.13 | 53.85
Help Desk Automation | 79.3          | 64.55      | 74.36      | 47.45  | 63.85 | 54.00 | 58.46
Formula One Upgrade  | 52.8          | 49.82      | 52.46      | 0.00   | 31.54 | 25.00 | 56.15

Modeled productivity after improvement:

Measure                         | Improved
Average Project Size            | 133
Average FP/SM                   | 24.8
Average Time-To-Market (Months) | 3.5
Average Cost/FP                 | $467
Delivered Defects/FP            | 0.0075
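The headline improvements follow directly from the two productivity tables; recomputing them (the productivity delta lands at +132%, which the slide rounds to roughly +131%):

```python
baseline = {"fp_per_sm": 10.7, "ttm_months": 6.9, "defects_per_fp": 0.0301}
improved = {"fp_per_sm": 24.8, "ttm_months": 3.5, "defects_per_fp": 0.0075}

for key in baseline:
    delta = (improved[key] - baseline[key]) / baseline[key]
    print(f"{key}: {delta:+.0%}")
# fp_per_sm +132%, ttm_months -49%, defects_per_fp -75%
```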


Conclusions
- Project management can be successful
- Requirements can be managed
- Projects can be sized
- Performance can be successfully estimated
- Process improvement can be modeled
- Measurement can be accomplished


Contact Information

David Consulting Group web site: www.davidconsultinggroup.com
David Garmus: dg@davidconsultinggroup.com


Contact Information
- International Function Point Users Group (IFPUG): www.ifpug.org
- Practical Software and Systems Measurement (PSM): www.psmsc.com
- Software Engineering Institute (SEI): www.sei.cmu.edu
- Software Quality Engineering (SQE): www.sqe.com
- Quality Assurance Institute (QAI): www.qaiusa.com

Copyright 2005. The David Consulting Group, Inc.

