
Greg Reiser, ThoughtWorks
Updated November 18, 2011
ISG SW Engineering - Lean Enterprise / Agile Transformation

Proposed Program Dashboard
22 November 2011

"Our highest priority is to satisfy the customer through
early and continuous delivery of valuable software.

"Working software is the primary measure of progress.

http://agilemanifesto.org/principles.html
Source - Dave Nicolette
Guiding Principles for Metrics
Measure outcomes, not activity.

Various authors
Agile Balanced Metrics (Forrester)
Operational Excellence
Project Management
Productivity
Organizational Effectiveness
Quality
User Orientation
User Satisfaction
Responsiveness to needs
Service Level Performance
IT Partnership
Business Value
Business value of projects
Alignment with strategy
Synergies across business units
Future Orientation
Development capability improvement
Use of emerging processes and methodologies
Skills for future needs
Agile Balanced Metrics (NCR)
Operational Excellence
New Functionality Ratio
Internal Code Quality
Build Hygiene
Percent Accurate and Complete
Defect Resolution Time
User Orientation
Customer Satisfaction
External Customers
Internal Customers
Business Value
Feature Lead Time
Defects
Cost per Feature Point
Future Orientation
Team Agile Maturity
People
Number of Agile Practitioners
Number of Agile Leads
Future Orientation - Agile Maturity
What it is - Measure of team agility (ability to respond to customer demand and
change) based on the ThoughtWorks Agile Maturity Model (AMM).
Measurement - Qualitative assessment along 10 dimensions of software development
Purpose - Assess how teams are progressing towards a targeted future state
Caveat - The AMM is not a compliance tool. Its intent is to define the current state of a
software development team with respect to agile principles and practices, develop a
plan for change and track progress against that plan.
Future Orientation - Agile Practitioners and Leaders
What it is - The number (and rate of increase) of qualified Agile Practitioners and
Leaders in ISG Software Engineering
Measurement - Use a variation of Net Promoter Score to assess individuals:
"On a scale of 0 to 10, this person does not require coaching support in order to be a positive
contributor on an agile team."
"On a scale of 0 to 10, this person is effective as an agile coach for one or more roles."
Initial assessments will be performed by ThoughtWorks coaches. As NCR staff achieve Practitioner
and Leader status, they will assume this responsibility.
Purpose - Monitor the rate at which NCR staff are developing agile expertise. Ensure
that NCR is on track to develop the skills required to support broad agile adoption.
[Progression diagram: Student → Practitioner → Leader, supported along the way by
Training, Project Experience, Coaching, Mentoring, Focused Coaching and Ongoing Support]
Operational Excellence - New Functionality Ratio
What it is - The ratio of effort (cost) spent on new feature-functionality vs. support
Measurement
Hours-New-Functionality : Hours-Support

Purpose - Well-run Agile teams generate higher quality and fit-for-purpose
functionality, and minimize the amount of effort spent on marginal value features. This
translates into lower failure, customer service and other support costs. This in turn
translates into increased capacity to innovate and satisfy customer demand.
Recommendation
When developing new functionality, the cost of defects discovered at the end of the development
lifecycle should be counted as Support (Appraisal or Internal Failure costs)
When developing new functionality, the cost of defects discovered earlier in the development cycle
should be counted as New Functionality costs (Prevention costs)
Defects reported by consumers of shared components (Product and Solution Teams) should be
treated as Support (External Failure costs) by Component Teams
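To make the accounting concrete, here is a minimal sketch of the ratio computed from
categorized time entries; the class, category labels and hours are hypothetical, not
drawn from any NCR data.

import java.util.List;

public class NewFunctionalityRatio {
    // A time entry categorized per the recommendation above ("NEW" or "SUPPORT")
    record TimeEntry(String category, double hours) {}

    public static void main(String[] args) {
        List<TimeEntry> entries = List.of(
                new TimeEntry("NEW", 320),      // new feature development
                new TimeEntry("NEW", 40),       // defects caught early (prevention cost)
                new TimeEntry("SUPPORT", 60),   // defects found late (appraisal/internal failure)
                new TimeEntry("SUPPORT", 30));  // consumer-reported defects (external failure)

        double newHours = entries.stream()
                .filter(e -> e.category().equals("NEW"))
                .mapToDouble(TimeEntry::hours).sum();
        double supportHours = entries.stream()
                .filter(e -> e.category().equals("SUPPORT"))
                .mapToDouble(TimeEntry::hours).sum();

        // Hours-New-Functionality : Hours-Support, normalized to "n : 1"
        System.out.printf("New:Support = %.1f : 1%n", newHours / supportHours);  // 4.0 : 1
    }
}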
Operational Excellence - Internal Code Quality
What it is - Metrics that describe the design quality of the software in a product
Measurements
The four Cs: Coverage, Complexity, Cohesion, Coupling
See Appendix for details
Purpose - Compares a codebase to generally accepted guidelines for good design.
Identifies opportunities for making software more malleable. Increasing adherence to
such guidelines plus decreasing Defect Resolution Time and shorter Feature Lead Time
are all indicators of reducing technical debt.
Tools such as Sonar can collect and report on a broad range of metrics; it's better to
focus on a small subset and adapt with caution as you learn how to use metrics to drive
desired behavior.
Operational Excellence - Internal Code Quality: Dashboard Example
Operational Excellence - Build Hygiene
What it is - Number of builds and percentage of successful builds in a given timeframe.
This can be easily monitored through a Sonar plugin.
Measurement
Number of builds (in a given timeframe)
Build Success % (in a given timeframe)

Purpose - Drive desired development behavior (frequent builds that stay green)
Example
Number of builds = 15 builds in the last 30 days
Build Success = 13 of 15 builds = 86.7% in the last 30 days
Operational Excellence - Percent Accurate and Complete
What it is - Percent of stories that do not revert to an earlier state
Measurement
(Ideal State Transitions / Total State Transitions) x 100

Purpose - Indicator of rework (waste). Measuring at the individual state transition level
identifies opportunities for continuous improvement.
Example
Story Lifecycle: Not Started → Analysis → Development → Testing → Acceptance Testing → Deployed
Size of Backlog = 100 Stories; hence, the ideal number of forward state transitions for the entire
project is 500 (5 x 100)
Actual Experience:
20 instances of stories reverting from UAT to Testing
50 instances of stories reverting from Testing to Development (may include many of the above 20)
PAC (Dev-to-Test) = (100/150) x 100 = 67%
PAC (Test-to-UAT) = (100/120) x 100 = 83%
PAC (Project) = (500/570) x 100 = 88% (570 = 500 ideal + 70 rework transitions)
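For concreteness, a minimal sketch of the same PAC arithmetic in code, using the counts
from this example (class and variable names are mine):

public class PercentAccurateAndComplete {
    // PAC = (ideal transitions / total transitions) x 100
    static double pac(int ideal, int total) {
        return 100.0 * ideal / total;
    }

    public static void main(String[] args) {
        int stories = 100;          // backlog size
        int forwardStates = 5;      // Not Started -> ... -> Deployed
        int devToTestRework = 50;   // reversions Testing -> Development
        int testToUatRework = 20;   // reversions UAT -> Testing

        System.out.printf("PAC (Dev-to-Test) = %.0f%%%n",
                pac(stories, stories + devToTestRework));   // 67%
        System.out.printf("PAC (Test-to-UAT) = %.0f%%%n",
                pac(stories, stories + testToUatRework));   // 83%
        System.out.printf("PAC (Project)     = %.0f%%%n",
                pac(stories * forwardStates,
                    stories * forwardStates + devToTestRework + testToUatRework));  // 88%
    }
}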
Operational Excellence - Defect Resolution Time
What it is - Average time between defect identification and resolution
Measurement
Defect Closed Timestamp - Defect Open Timestamp

Purpose - Demonstrates the malleability of the codebase and the extent to which the
team has adopted a zero-defect culture
Malleability of the Codebase - Disciplined agile teams strive to limit technical debt as
much as possible. Technical debt is assessed at two levels: defects, and code metrics
that indirectly describe how easy it is to maintain and enhance the software
(malleability). Rapid resolution of defects is one measure of the business benefit of low
technical debt.
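A minimal sketch of the measurement, averaging closed-minus-open timestamps; the defect
data here is hypothetical:

import java.time.Duration;
import java.time.Instant;
import java.util.List;

public class DefectResolutionTime {
    record Defect(Instant opened, Instant closed) {}

    public static void main(String[] args) {
        // Hypothetical defects with open/close timestamps
        List<Defect> closedDefects = List.of(
                new Defect(Instant.parse("2011-11-01T09:00:00Z"),
                           Instant.parse("2011-11-02T15:00:00Z")),
                new Defect(Instant.parse("2011-11-03T10:00:00Z"),
                           Instant.parse("2011-11-03T16:00:00Z")));

        // Average of (Defect Closed Timestamp - Defect Open Timestamp)
        double avgHours = closedDefects.stream()
                .mapToLong(d -> Duration.between(d.opened(), d.closed()).toHours())
                .average()
                .orElse(0);
        System.out.printf("Average resolution time: %.1f hours%n", avgHours);  // 18.0
    }
}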
Business Value - Feature Lead Time
What it is - The average time to deploy a feature from the time that the development
team begins work on it
Measurement
Date Feature Deployed - Date Analysis Begins on the first relevant story
Deployed = Feature is Shippable
Shippable (Component Teams) = Binary is fully tested and available for consumption by Product Teams
Shippable (Product Teams) = Feature is part of a fully-tested release. Customer deployment is strictly a
business decision.
Purpose - Measures the responsiveness of the team once a feature has been identified
as the next highest priority.
Why not start measurement when the feature is first identified?
It is much more important to be responsive with respect to higher-priority features. If the metric
starts at feature identification time, teams will be tempted to work on the easiest features regardless
of priority.
If consumers are not receiving high-quality components fast enough to meet their business
commitments, they don't need a metric to tell them that. If the root cause is determined to be
high-priority features queuing up to get started, this indicates an obvious capacity issue rather than
a process issue.
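A minimal sketch of the lead-time average, assuming each feature records the date
analysis began on its first story and the date it became shippable (the dates are
hypothetical):

import java.time.LocalDate;
import java.time.temporal.ChronoUnit;
import java.util.List;

public class FeatureLeadTime {
    record Feature(LocalDate analysisStarted, LocalDate deployed) {}

    public static void main(String[] args) {
        // The clock starts when analysis begins on the first relevant story,
        // not when the feature is first identified.
        List<Feature> shipped = List.of(
                new Feature(LocalDate.of(2011, 10, 3), LocalDate.of(2011, 10, 21)),
                new Feature(LocalDate.of(2011, 10, 10), LocalDate.of(2011, 11, 4)));

        double avgDays = shipped.stream()
                .mapToLong(f -> ChronoUnit.DAYS.between(f.analysisStarted(), f.deployed()))
                .average()
                .orElse(0);
        System.out.printf("Average feature lead time: %.1f days%n", avgDays);  // 21.5
    }
}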
Business Value - Defects
What it is - Number of defects reported after a story has been flagged as Done by the
testers that are embedded in the development team
Measurement
Consider the following story life cycle:
New → In Analysis → Ready for Dev → In Dev → Ready for Test → In Test → Done
Track defects reported by any downstream activities (e.g., component integration test, controlled
deployment, professional services, external customer, etc.)
Raw number of defects reported per severity level and time (activity) of detection
Report in terms of density (per KLOC) and technology stack when comparing across projects
Purpose - One indicator of the quality of software produced
Why not record defects identified earlier in the story lifecycle?
Testing within a Sprint is a defect prevention cost (up-front acceptance criteria and TDD are other
examples of defect prevention). We want to encourage defect prevention by focusing on the
expected reduction in defect appraisal, internal failure and external failure costs. Hence the focus on
defects that are indicators of those other quality costs.
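A minimal sketch of the density normalization suggested above; the counts and code
sizes are hypothetical:

public class DefectDensity {
    public static void main(String[] args) {
        // Hypothetical counts for two projects on different technology stacks
        int defectsProjectA = 12;
        int linesOfCodeA = 48_000;
        int defectsProjectB = 30;
        int linesOfCodeB = 150_000;

        // Density (defects per KLOC) makes projects of different sizes comparable
        System.out.printf("Project A: %.2f defects/KLOC%n",
                defectsProjectA / (linesOfCodeA / 1000.0));   // 0.25
        System.out.printf("Project B: %.2f defects/KLOC%n",
                defectsProjectB / (linesOfCodeB / 1000.0));   // 0.20
    }
}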

Business Value - Cost per Feature Point
What it is - Cost per deployed unit of business value, where units are "feature points"
as defined by the Product Owner
Measurement
Development-Cost / Feature-Points-Deployed
Development Cost = Development costs incurred within a specific time frame. If comprehensive
development costs are difficult to obtain, hours of effort may serve as a reasonable proxy.
Feature Point = Relative units of business value for a feature as determined by the Product Owner
(Solution Manager)
Feature-Points-Deployed = Sum of feature points for those features deployed (shippable) during the
target time frame
Purpose - The direction of change indicates if teams and the organization are becoming
more or less productive
Recommendation - Trend is more important than the raw value. If used to compare teams,
limit comparison to teams that serve the same line of business. Since feature points are
subjective values assigned by Product Owners, comparisons are only valid where there
is consistency amongst people that work together.
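A minimal sketch of the trend view, using hours as the cost proxy suggested above; the
quarterly figures are hypothetical:

public class CostPerFeaturePoint {
    public static void main(String[] args) {
        // Hypothetical quarterly figures for one team
        double[] hours         = { 2400, 2400, 2300 };
        double[] pointsShipped = {   60,   75,   80 };

        // A falling cost per point over successive periods suggests rising
        // productivity; the trend matters more than any single raw value.
        for (int q = 0; q < hours.length; q++) {
            System.out.printf("Q%d: %.1f hours per feature point%n",
                    q + 1, hours[q] / pointsShipped[q]);  // 40.0, 32.0, 28.8
        }
    }
}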
User Orientation - Customer Satisfaction
What it is - Net Promoter Score (NPS) for NCR, ISG Software Engineering and
individual project teams
Measurement
Use the Net Promoter Score methodology (http://en.wikipedia.org/wiki/Net_Promoter) with the
following customer groups:
External Customers (e.g., Kohl's, Toys R Us, Tesco, etc.)
Professional Services and ISG Solution Management - that is, the customers of ISG Software
Engineering
Consumers of Component Team products - for example, the Vision, Travel Air and SSCO teams are
consumers of components developed by the P24 team
Purpose - Simple way to collect and monitor how well ISG Software Engineering is
responding to the needs of its customers at multiple levels. Reinforces a customer-centric
culture even for teams that are several levels detached from NCR's external
customers.
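A minimal sketch of the standard NPS calculation (promoters score 9-10, detractors 0-6,
passives 7-8); the survey responses are hypothetical:

import java.util.List;

public class NetPromoterScore {
    // NPS = % promoters - % detractors
    static double nps(List<Integer> scores) {
        long promoters  = scores.stream().filter(s -> s >= 9).count();
        long detractors = scores.stream().filter(s -> s <= 6).count();
        return 100.0 * (promoters - detractors) / scores.size();
    }

    public static void main(String[] args) {
        // Hypothetical responses from one customer group
        List<Integer> responses = List.of(10, 9, 9, 8, 7, 6, 10, 4, 9, 8);
        System.out.printf("NPS = %.0f%n", nps(responses));  // 30
    }
}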
Appendix
Operational Excellence - Internal Code Quality: Coverage
What it is - A measurement of the extent to which lines and branches
are executed as part of some test.
Measurements
Line coverage = Lines of code reached by a unit test / Executable lines of code
Branch coverage = (Branches that evaluate to True at least once + Branches that
evaluate to False at least once) / (2 x Total number of branches)
Purpose - Indicates how much code isn't executed by tests
Example
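A minimal illustrative sketch (the method and test values are mine):

public class CoverageExample {
    // Three executable lines: the if condition and the two returns.
    // One branch point (the if), so 2 x 1 = 2 branch outcomes to cover.
    static String grade(int score) {
        if (score >= 60) {
            return "pass";
        }
        return "fail";
    }

    public static void main(String[] args) {
        // A lone "test" calling grade(75) executes the condition and the first
        // return: 2 of 3 executable lines = 67% line coverage. The branch only
        // evaluates to true: (1 + 0) / (2 x 1) = 50% branch coverage.
        System.out.println(grade(75));
    }
}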
Operational Excellence - Internal Code Quality: Complexity
What it is - McCabe metric or Cyclomatic Complexity Number (CCN); the
number of independent flow-paths through a method/function.
Measurement
CCN = (number of branch points in the method: if, for, &&, ||, etc.) + 1
Purpose - A quantitative indicator of the complexity of code
Example - a method with a CCN of 3
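A minimal illustrative sketch (the method is mine):

public class ComplexityExample {
    // Two branch points (the if and the &&) + 1 = CCN of 3
    static boolean canCheckout(int itemCount, boolean loggedIn) {
        if (itemCount == 0) {
            return false;
        }
        return loggedIn && itemCount <= 100;
    }

    public static void main(String[] args) {
        System.out.println(canCheckout(3, true));  // true
    }
}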
Operational Excellence - Internal Code Quality: Cohesion
What it is - A measure of how well a class follows the single responsibility principle.
High cohesion is a good thing
Measurement
Number of connected components formed by the class's methods and fields (methods
are connected when they share fields or call one another)
Purpose - Measures the number of connected components in a class; more than one
component suggests the class has more than one responsibility
Example

class AccountService {  // class name added for illustration
    // Connected component 1: encode/decode share the privKey field
    private final String privKey = readSecurelyFromSomewhere();
    private String encode(String plainText, String pubKey) { /* ... */ return null; }
    private String decode(String encrypted, String pubKey) { /* ... */ return null; }

    // Connected component 2: login/logout share no state with the component above
    public void login(String uname, String pwd) { /* ... */ }
    public void logout(String uname) { /* ... */ }
}
// Two connected components in one class: low cohesion, suggesting a split
Operational Excellence - Internal Code Quality: Coupling
What it is - A measure of dependency both of and on a class. Low
coupling is a good thing
Measurement
Afferent coupling - number of other classes that use this class
Efferent coupling - number of other classes used by this class
Purpose - Measures the number of adjacent classes in the dependency tree
Example
[Diagram: a Utility class with methods log(), toString(), hashCode(), equals() and
clone<T>(T me); many other classes use Utility (afferent coupling), while Utility itself
uses Logger, StringBuilder and EqualsBuilder (efferent coupling of 3)]
Mnemonic:
Afferent: Arrive at class
Efferent: Exit class
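A minimal compilable sketch of such a Utility class, simplified to JDK-only dependencies
(EqualsBuilder omitted); the method body is mine:

import java.util.logging.Logger;

// Efferent coupling: Utility depends on Logger and StringBuilder (2 classes here).
// Afferent coupling: every class that calls Utility.log() adds to Utility's count.
public class Utility {
    private static final Logger LOG = Logger.getLogger(Utility.class.getName());

    public static void log(Object... parts) {
        StringBuilder sb = new StringBuilder();
        for (Object p : parts) {
            sb.append(p).append(' ');
        }
        LOG.info(sb.toString().trim());
    }
}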
Remember the Future Exercise
[Bar chart of themes raised, counts ranging from 0 to 30:]
Responsive
Innovative
Efficient
Transparent
Partnership Quality
Time to market
Feedback
Predictability
Higher value / lower cost
Reduce waste (marginal value features)
Clear backlog short/long term
Competency
Less rework; shorter cycle time
Dollar value of software developed
Clarity of customer requirements
Understanding of goals
Reusing software across enterprise
Sustainable rhythms
Being innovative
Innovative/disruptive technology
New product ideas introduced into products
Metrics Template
What it is
Measurement
ABC
Purpose
Example
Scorecard Graphic
[Four quadrants: Operational Excellence, User Orientation, Business Value, Future Orientation]
