Prof. Wei T. Huang Department of Computer Science and Information Engineering National Central University 2007
1 WTH07
Prerequisites
Note: This lecture note was developed for teaching purposes and is offered only to senior and graduate students at National Central University. Any commercial use of this note is not allowed. Most of the material in this lecture note, including the text, tables, and figures, is excerpted from the sources given in the References, especially [Fenton97].
Measurement helps people understand the world. Without measurement you cannot manage anything. There are three important activities in a software development project:
Understanding what is happening during development and maintenance
Controlling what is happening on the projects
Improving the processes and products
Thus, people must control their projects and predict the product attributes, not just run them. But:
You cannot control what you cannot measure. (DeMarco, 1982)
You can neither predict nor control what you cannot measure. (DeMarco's rule)
Software Measurement
Customers measure:
whether the final product meets the requirements
whether the product is of sufficient quality
1. Measurement
Measurement is essential to our daily life. It is the process by which numbers or symbols are assigned to attributes of entities in the real world in such a way as to describe them according to clearly defined rules.
Entity: an object or an event in the real world, e.g., a person, a room, a journey, or the testing phase of a software project.
So, software engineers must measure important software attributes, such as dependability, quality, usability, and maintainability, in order to make software engineering as powerful as other engineering disciplines.
We fail to understand and quantify the component costs of software projects. We do not quantify or predict the quality of the products we produce. We adopt new technologies without a carefully controlled study to determine whether they are efficient and effective.
Engineers
Value
Cost
Data collection
The collected data are distilled into simple graphs or charts.
Reliability model Performance evaluation and models Structural and complexity metrics Management metrics
Use measurement-based charts or graphs to help customers and developers decide whether the project is on track.
Exercises
1. What are metrics? Explain why metrics are useful.
2. Specifications, design, and code are entities of software products. What are the attributes of those entities?
Hint: see next chapter.
3. Personnel are one of the resources for software development. Suppose you are a manager. What would you want to measure about such an entity?
Hint: see next chapter.
2. Classifying Measures
Products: any artifacts, deliverables or documents that result from a process activity.
External product attributes, which depend on the product's behavior and environment: reliability, usability, integrity, efficiency, testability, reusability, portability, and interoperability.
Internal product attributes: size, effort, cost, code complexity, structuredness, module coupling, and cohesiveness.
Attributes
Internal attributes: measured by examining the product, process, or resource on its own.
External attributes: measured by how the product, process, or resource relates to its environment.
[Figure: internal and external attributes of products such as the design, the code, and the test data]
The Goal-Question-Metric (GQM) paradigm: to decide what your project should measure, you may use the GQM paradigm. The following high-level goals may be identified:
Improving productivity Improving quality Reducing risk
Perspective
Example: Examine the cost from the viewpoint of the manager
Environment
Example: the maintenance staff are poorly motivated programmers who have limited access to tools
A framework of GQM
List the major goals of the development or maintenance project. Derive from each goal the questions that must be answered to determine if the goals are being met. Decide what must be measured in order to be able to answer the questions adequately.
Examples of AT&T goals, questions and metrics (Barnard and Price 1994)
Data collection
Gathering accurate and consistent measures of process and resource attributes.
Reliability model
Successful operation during a given period of time.
Capability-maturity assessment
SEI CMM
Management by metrics
Use metrics to set targets for the development projects. Example: US defense projects (NetFocus 1995)

Item                                                     | Target                   | Malpractice level
Defect removal efficiency                                | > 95%                    | < 70%
Original defect density                                  | < 4/function point       | > 7/function point
Slip or cost overrun in excess of risk reserve           | 0%                       | > 10%
Total requirements creep (function points or equivalent) | < 1%/month               | > 50% average
Total program documentation                              | < 3 pages/function point | > 6 pages/function point
Staff turnover                                           | 1-3%/year                | > 5%/year
(*) Function points measure the amount of functionality in a system as described by a specification.
Exercises
You are now taking the course Software Metrics. What are your goals in taking this course?
Hint: use the template of goal definitions.
Use the GQM approach to explain your reasons for taking the Software Metrics course.
Simple measures of size are often rejected because they do not adequately reflect:
Effort Productivity Cost
Reuse: how much of a product was copied or modified from a previous version of an existing product (including off-the-shelf products).
Software size
Specification: its length is a useful indicator of how large the design is likely to be
Design: its length is a predictor of code length
Code
An example
In the Visual Basic programming environment, you can create a sophisticated Windows program, complete with menus, icons, and graphics, with almost no code in the traditional sense. For example, you point at a scrollbar object in the programming environment, and the executable code to produce a scrollbar is constructed automatically. You need to write code only to perform the specific actions that result from a click on a specific command button. In this kind of environment, it is not clear how you would measure the length of the program: a program with just five BASIC statements, say, can easily generate an executable program of 200 KB. The same situation arises with component-based software construction techniques.
In object-oriented development, a count of objects and methods led to more accurate productivity estimates than those using lines of code (Pfleeger 1989). Ontological principles (Bunge's ontological terms) (*)
Two objects are coupled if and only if at least one of them acts upon the other. X is said to act upon Y if the history of Y is affected by X.
(*) Ontology: the common words and concepts (the meaning) used to describe and represent an area of knowledge. If you look up ontology in a dictionary, you will find definitions such as: (1) a branch of philosophy that seeks to explain the nature of being and reality; (2) speculative philosophy in general (Webster's New World Dictionary). An ontology is the specification of a conceptualization for an engineering product.
The metrics suite for OOD [Chidamber/Kemerer] uses the notions mentioned above.
Metric 1. Weighted Methods per Class (WMC): consider a class C with methods M1, ..., Mn defined in the class, and let c1, ..., cn be the complexities of those methods. Then WMC = c1 + c2 + ... + cn. If all method complexities are taken to be unity, then WMC = n, the number of methods.
Metric 2. Depth of Inheritance Tree (DIT): the length of the path from the node to the root of the inheritance tree.
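A minimal sketch of these two metrics in Python; the Shape/Polygon/Triangle hierarchy and the unit complexity values are invented for illustration.

```python
# Hypothetical sketch of the Chidamber/Kemerer WMC and DIT metrics.

def wmc(method_complexities):
    """Weighted Methods per Class: the sum of the method complexities c_i."""
    return sum(method_complexities)

def dit(cls):
    """Depth of Inheritance Tree: path length from the class to the root."""
    depth = 0
    while cls.__bases__ and cls.__bases__[0] is not object:
        cls = cls.__bases__[0]
        depth += 1
    return depth

class Shape: pass
class Polygon(Shape): pass
class Triangle(Polygon): pass

print(wmc([1, 1, 1, 1]))   # unit complexities: WMC = number of methods = 4
print(dit(Triangle))       # Triangle -> Polygon -> Shape: DIT = 2
```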
A specification or design document may consist of both text and diagrams, where the diagrams have a uniform syntax, such as DFDs, Z schemas, or class diagrams. We can define appropriate atomic objects for the different types of diagrams and symbols. For instance:
For data flow diagrams, the atomic objects are processes, external entities, data stores, and data flows.
For algebraic specifications, the atomic entities are sorts, functions, operations, and axioms, etc.
For Z schemas, the atomic entities are the various lines appearing in the specification, such as a type declaration or a predicate.
Diagram                  | Atomic objects
Data flow diagram        | Bubbles
Data dictionary          | Data elements
Entity-relation diagram  | Objects, relations
State transition diagram | States, transitions
COORD
sort Coord
imports INTEGER, BOOLEAN
Create: (Integer, Integer) -> Coord;
X: (Coord) -> Integer;
Y: (Coord) -> Integer;
Eq: (Coord, Coord) -> Boolean;
Operation signatures setting out the names and the types of the parameters to the operations defined over the sort
X (Create (x,y)) = x Y (Create (x,y)) = y Eq (Create (x1,y1), Create (x2,y2)) = ((x1 = x2) and (y1 = y2))
predicate
new state: telephones' = telephones ∪ {huang ↦ 0543}
where telephones = {lee ↦ 1234, wang ↦ 2345, ...}
Length may be predicted by considering the median expansion ratio from spec or design length to code length.
Expansion ratio (design to code) = size of code / size of design
The reuse of software (including requirements, designs, documentation, test data, scripts, code, etc.) improves productivity and quality, allowing the developer to concentrate on new problems. HP's example (Lim 1994):

Organization                 | Quality              | Productivity | Time to market
Manufacturing Productivity   | 51% defect reduction | 57% increase | NA
San Diego Technical Graphics | 24% defect reduction | 40% increase | 42% reduction
Reuse at HP

[Figure: percent reuse vs. program size (0.55 to 3.09 thousand noncomment source lines)]

Reusable LOC | Total LOC | Reuse ratio (%)
40 900       | 82 300    | 50
34 000       | 73 000    | 47
18 300       | 50 100    | 37
18 300       | 52 700    | 35
40 900       | 82 900    | 49
Albrecht's approach: function points are intended to measure the amount of functionality in a system as described by a specification. The following item types are used to compute an unadjusted function point count (UFC):
External inputs: items provided by the user that describe distinct application-oriented data (such as files), not including inquiries.
External outputs: items provided to the user that generate distinct application-oriented data (such as reports and messages).
External inquiries: interactive inputs requiring a response.
External files: machine-readable interfaces to other systems.
Internal files: logical master files in the system.
FP complexity weights (wi):

Item               | Low complexity | Medium complexity | High complexity
External inputs    | 3              | 4                 | 6
External outputs   | 4              | 5                 | 7
External inquiries | 3              | 4                 | 6
External files     | 7              | 10                | 15
Internal files     | 5              | 7                 | 10
For the example of the spelling checker, the items are identified as follows:
2 external inputs: document filename, personal dictionary-name 3 external outputs: misspelled word report, # of words processed message, # of errors message 2 external inquiries: words processed, errors 2 external files: document file, personal dictionary 1 internal file: dictionary
The complexity ratings are: simple, average, or complex. For the spelling checker example, if we assume that every item is of average complexity, then
UFC = 4A + 5B + 4C + 10D + 7E = 58
where A to E are the counts of inputs, outputs, inquiries, external files, and internal files.
If the dictionary file and the misspelled word report are complex, then
UFC = 4A + (5 x 2 + 7 x 1) + 4C + 10D + 10E = 63
Taking the technical complexity factor TCF = 0.93 (the range is 0.65 to 1.35; see next slide) for the spelling checker:
FP = 63 x 0.93 ≈ 59
What is the FP for? Suppose a developer takes an average of two person-days of effort to implement a function point. Then we may estimate the effort to complete the spelling checker as 118 person-days (59 x 2).
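The calculation can be sketched in Python, using the weight table above. Note that with those weights the all-average count works out to 58, and rating the misspelled-word report and the dictionary as complex reproduces the UFC of 63 and FP of 59; the dictionary names are just labels for this example.

```python
# Sketch of the Albrecht function-point calculation for the spelling checker.
# Weight tables vary between sources; these are this lecture's values.

WEIGHTS = {  # item type: (low, medium, high)
    "external inputs":    (3, 4, 6),
    "external outputs":   (4, 5, 7),
    "external inquiries": (3, 4, 6),
    "external files":     (7, 10, 15),
    "internal files":     (5, 7, 10),
}

def ufc(counts, level="medium"):
    idx = ("low", "medium", "high").index(level)
    return sum(WEIGHTS[item][idx] * n for item, n in counts.items())

checker = {
    "external inputs": 2, "external outputs": 3, "external inquiries": 2,
    "external files": 2, "internal files": 1,
}
u = ufc(checker)                  # every item rated average -> 58
# Rate the misspelled-word report (one output) and the dictionary
# (the internal file) as complex instead of average:
u_adj = u + (7 - 5) + (10 - 7)    # -> 63
fp = round(u_adj * 0.93)          # TCF = 0.93 -> FP = 59
print(u, u_adj, fp, fp * 2)       # effort at 2 person-days/FP = 118
```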
The Dutch method:
Indicative function point count = (35 x Internal files) + (15 x External files)
The numbers 35 and 15 are derived through calibration; however, you can come up with your own calibration for use in your environment.
Use the Dutch method of counting function points to obtain a low-cost ballpark estimate early in the project. Example: for the spelling checker, UFC = 35 x 1 + 15 x 2 = 65, close to the result obtained earlier.
TCF = 0.65 + 0.01 x (F1 + F2 + ... + F14)
where the Fi are the 14 technical complexity sub-factors, each rated 0 to 5.
For the spelling checker: F3, F5, F9, F11, F12, and F13 are 0 (the sub-factor is irrelevant); F1, F2, F6, F7, F8, and F14 are 3 (average); F4 and F10 are 5 (essential to the system being built). So TCF = 0.65 + 0.01 x (18 + 10) = 0.93.
Note: the factor varies from 0.65 (if each Fi is set to 0) to 1.35 (if each Fi is set to 5).
3.9 Complexity
We define:
Complexity of a problem: the amount of resources required for an optimal solution to the problem.
Complexity of a solution: the resources needed to implement a particular solution.
Time complexity: where the resource is computer time.
Space complexity: where the resource is computer memory.
To measure and express complexity, we measure the efficiency of a solution, that is, its algorithmic efficiency.
Example: binary search
For a list of n elements, the binary search algorithm terminates after at most ⌊log2(n)⌋ + 1 comparisons.
Big-O notation
Example: the problem of searching a sorted list for a single item can be shown to have complexity O(log n); that is, the fastest algorithm for the problem requires on the order of log(n) comparisons.
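A small instrumented binary search illustrates the logarithmic comparison count; the list size of 1024 is an arbitrary example.

```python
# Counting element probes to illustrate the O(log n) behavior of binary search.

def binary_search(sorted_list, target):
    """Return (index_or_None, number_of_element_probes)."""
    lo, hi, probes = 0, len(sorted_list) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if sorted_list[mid] == target:
            return mid, probes
        elif sorted_list[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return None, probes

data = list(range(1024))             # n = 1024, log2(n) = 10
_, worst = binary_search(data, -1)   # a missing item forces a full descent
print(worst)                         # 10 probes, i.e., about log2(n)
```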
Exercise
1. Explain very briefly the idea behind Albrecht's function points measure.
2. List the main applications of function points.
3. Compare function points with the lines of code measure.
4.1 Structure
The structure of requirements, design, and code may help the developers to understand the difficulty they sometimes have in converting one product to another, in testing a product, or in predicting external software attributes from early internal product measures, such as maintainability, testability, reusability, and reliability. The structure of a product plays a part, not only in requiring development effort but also in how the product is maintained. Types of structural measures
Control-flow structure: the sequence in which instructions are executed in a program. Data-flow structure: the trail of a data item created or handled by a program. Data structure: the organization of the data itself, independent of the program.
The cyclomatic number is a useful indicator of how difficult a program or module will be to test and maintain. When V exceeds 10 in any one module, the module may be problematic. Example: Channel Tunnel Rail System
A module should be rejected if its V exceeds 20 or if it has more than 50 statements (Bennet 1994).
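The cyclomatic number V(G) = e − n + 2 can be computed directly from a flowgraph's edge and node counts; the tiny graph below (one if-statement and one while-loop) is invented for illustration.

```python
# McCabe's cyclomatic number V(G) = e - n + 2 for a flowgraph with a single
# connected component.

def cyclomatic_number(edges, nodes):
    return len(edges) - len(nodes) + 2

nodes = ["start", "if", "then", "while", "body", "end"]
edges = [("start", "if"), ("if", "then"), ("if", "while"),   # if: 2 branches
         ("then", "while"), ("while", "body"), ("body", "while"),
         ("while", "end")]                                   # loop exit
print(cyclomatic_number(edges, nodes))  # 7 - 6 + 2 = 3 (two decisions + 1)
```

The result agrees with the shortcut "number of predicates + 1": one if plus one while gives V = 3.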
Module: a contiguous sequence of program statements, bounded by boundary elements, having an aggregate identifier (Yourdon and Constantine 1979).
A module can be any object: a program, unit, procedure, or function.
Inter-module attributes.
Example: Design charts (excerpted from [Fenton97])
Coupling is the degree of interdependence between modules (Yourdon and Constantine 1979). Classification for coupling (Ri > Rj for i > j):
R0: module x and module y have no communication. R1 Data coupling relation: x and y communicate by parameters, where each parameter is either a single data element or a homogeneous set of data items (no control element). This type of coupling is necessary for any communication between modules. R2 Stamp coupling relation: x and y accept the same record type as a parameter. R3 Control coupling relation: x passes a parameter (flag) to y with the intention of controlling its behavior. R4 Common coupling relation: x and y refer to the same global data. R5 Content coupling relation: x refers to the inside of y
M1 and M2 share two common record types: R2
M1 passes to M3 a parameter that acts as a flag in M3: R3
M2 branches into module M4 and passes two parameters that act as flags in M4: R3 and R5
Measuring coupling between x and y: c(x, y) = i + n/(n+1), where i is the number corresponding to the worst coupling relation Ri between x and y, and n is the number of interconnections between x and y (Fenton and Melton 1990).
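A sketch of the Fenton/Melton pairwise measure; the relation indices and the interconnection count below are invented example values.

```python
# Fenton and Melton's pairwise coupling measure c(x, y) = i + n/(n + 1),
# where i is the index of the worst coupling relation R_i between the two
# modules and n is the number of interconnections between them.

def coupling(relation_indices, interconnections):
    """relation_indices: the R_i indices holding between modules x and y."""
    i = max(relation_indices, default=0)
    n = interconnections
    return i + n / (n + 1)

# Hypothetical pair: content coupling (R5) and control coupling (R3),
# with 3 interconnections between the modules.
print(coupling([3, 5], 3))   # 5 + 3/4 = 5.75
```

Because the integer part dominates, the measure ranks the worst relation first and uses the interconnection count only as a tie-breaker below 1.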
Exercises
1. The following flowgraph is a truly unstructured spaghetti prime. What is its essential complexity?
2. A good design should exhibit high module cohesion and low module coupling. Briefly describe what you understand this assertion to mean.
3. McCabe's cyclomatic number is a classic example of a software metric. Which software entity and attribute do you believe it really measures?
External attributes
Software quality. Quality trade-offs: time, quality, effort.
We predict external attributes by measuring and analyzing internal attributes, because:
Internal attributes are often available for measurement early in the life cycle, whereas external attributes are measurable only when the product is complete.
Internal attributes are often easier to measure than external ones.
Functionality: the functions supplied by the product to the user.
Portability (*):
A set of attributes that bear on the capability of software to be transferred from one environment to another.
Portability = 1 - ET/ER
where ET is a measure of the resources needed to move the system to the target environment, and ER is a measure of the resources needed to create the system for the resident environment.
Reliability (*)
A set of attributes that bear on the capability of software to maintain its level of performance under stated conditions for a stated period of time.
defect density = # of known defects / product size
where product size is measured in terms of LOC, and the known defects are discovered through testing, inspection, or other techniques.
Other quality measures system spoilage = time to fix post-release defects /total system development time Hitachi example:
Efficiency.
Example: suppose we implement the Heapsort sorting algorithm in a machine environment where comparison operations are performed at the rate of 2^20 per second. If we sort a list of n = 2^25 items, we need n x log2(n) comparisons (log2(n) = 25 in this case), that is, 25 x 2^25 comparisons. The response time of the machine must be at least 800 seconds (13.3 minutes). In software, efficiency can be expressed by the response time.
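The arithmetic of the example, checked in Python:

```python
# Heapsort example: n*log2(n) comparisons at 2**20 comparisons per second.
import math

n = 2 ** 25
comparisons = n * math.log2(n)       # 25 * 2**25 comparisons
seconds = comparisons / 2 ** 20      # 25 * 2**5 = 800 seconds
print(int(seconds), round(seconds / 60, 1))   # 800 seconds, 13.3 minutes
```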
Usability:
The extent to which the software product is convenient and practical to use (Boehm 1978). Good usability includes:
Well-structured manuals
Good use of menus and graphics
Informative error messages
Help functions
Consistent interfaces
Maintainability:
Many measures of maintainability are expressed in terms of MTTR (mean time to repair). Records needed to calculate this measure:
Problem recognition time
Administrative delay time
Maintenance tools collection time
Problem analysis time
Change specification time
Change time (including testing and review)
The guideline: Cyclomatic number < 10. Thus, maintainability in a real software system is affected by a wide range of system design decisions.
Exercise
The most commonly used software quality measure in industry is the number of faults per thousand lines of product source code. Compare the usefulness of this measure for developers and users. List some possible problems with this measure.
6. Software Reliability
Software reliability is a key concern of many users and developers of software. Reliability is defined in terms of failures, so it is impossible to measure before development is complete. However, several software reliability growth models may aid estimation if data on inter-failure times are carefully collected.
The probability density function (pdf) f(t) describes the uncertainty about when the component will fail:
Probability of failure between t1 and t2 = integral from t1 to t2 of f(t) dt
A component has a maximum life span of 10 hours, i.e., it is certain to fail within 10 hours of use. Suppose that the component is equally likely to fail during any two time periods of equal length within the 10 hours; it is just as likely to fail in the first two minutes as in the last two minutes. We can illustrate this behavior with the pdf f(t) shown in the figure above. The function is defined to be 1/10 for any t between 0 and 10, and 0 for any t > 10. We say it is uniform in the interval from t = 0 to t = 10. In general, for any x we can define the uniform pdf over the interval [0, x] to be 1/x for any t in the interval and 0 elsewhere.
The distribution function: the cumulative distribution function F(t) gives the probability of failure between time 0 and a given time t:
F(t) = integral from 0 to t of f(u) du
The survival function: R(t) = 1 - F(t)
Example: distribution function and reliability function for a uniform density function. Consider the pdf that is uniform over the interval [0, 1], so f(t) = 1 for each t between 0 and 1. The cumulative distribution function F(t) and the survival function R(t) are:
F(t) = integral from 0 to t of 1 du = t
R(t) = 1 - F(t) = 1 - t
The mean time to failure (MTTF): the mean of the probability density function, also called the expected value E(T) of T:
E(T) = integral of t f(t) dt
Examples:
For the uniform pdf 1/x on [0, x] with x = 10 hours, the MTTF is 5 hours.
For the exponential pdf f(t) = λe^(-λt), the MTTF is 1/λ.
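The two MTTF examples can be checked numerically with a simple midpoint-rule integration; no external libraries are needed, and the λ = 2 value is an arbitrary example.

```python
# Numerical check of MTTF = E(T) = integral of t*f(t) dt via the midpoint rule.
import math

def mttf(pdf, upper, steps=100_000):
    h = upper / steps
    return sum((i + 0.5) * h * pdf((i + 0.5) * h) * h for i in range(steps))

uniform = lambda t: 0.1 if 0 <= t <= 10 else 0.0   # uniform pdf on [0, 10]
lam = 2.0
exponential = lambda t: lam * math.exp(-lam * t)   # f(t) = lambda * e^(-lambda*t)

print(round(mttf(uniform, 10), 3))       # 5.0 hours
print(round(mttf(exponential, 50), 3))   # ~ 1/lambda = 0.5
```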
Reliability growth: a new (corrected) component should fail less often than the old one; that is, the mean inter-failure times should grow: f(i+1) > f(i).
Mean time between failures (MTBF) = MTTF + MTTR, where MTTR is the mean time to repair.
Availability: the probability that a component is operating at a given point in time:
Availability = MTTF / (MTTF + MTTR) x 100%
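A one-line sketch of the MTBF and availability relations; the hour figures are invented.

```python
# MTBF and availability from MTTF and MTTR (hypothetical hour values).

def availability(mttf, mttr):
    return mttf / (mttf + mttr) * 100   # percent

mttf, mttr = 990.0, 10.0
mtbf = mttf + mttr                      # 1000 hours between failures
print(mtbf, availability(mttf, mttr))   # 1000.0 and 99.0 percent
```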
Exercises
1. Why is reliability an external attribute of software?
2. List three internal software product attributes that could affect reliability.
3. Suppose you can remove 50% of all faults resident in an operational software system. What corresponding improvement would you expect in the reliability of the system?
7. Resource Measurement
Productivity
Productivity equation: productivity = size (lines of code) / effort (person-months)
Difficulty of measuring effort
Measuring productivity based on function points: # of function points implemented / person-months
Function points, a review: external inputs, external outputs, external inquiries, external files, internal files
The function-based measure more accurately reflects the value of the output
It can be used to assess the productivity of software development staff at any stage in the life cycle
Measure progress by comparing completed function points with incomplete ones
[Figure: % of US software projects]
Example: For the project rated low in use of tools, COCOMO includes an 8% increase in project effort compared to the normal (value 1.00) use of tools.
Exercise
Other than personnel, which software development resources can be assessed in terms of productivity? How would you define and measure productivity for these entities?
8. Process Prediction
Good Estimates
The process predictions guide the decision-making, from before the development begins, through the development process, during the transition of product to customer, and while the software is being maintained.
What is an estimate?
An ideal case: the probability density is a normal distribution, with the estimate at the median. For example, the probability of completing the project within [8 months, 16 months] is 0.9, while the probability of completing it in less than 12 months is 0.5.
Low experience (say <8years): F = 1.3 (F is the effort adjustment factor) Medium experience (8 10 years): F = 1.0 High experience (>10 years): F = 0.7
Constructive Cost Model (COCOMO, Barry Boehm, 1970s). There are three models:
Basic model: used when little about the project is known.
Intermediate model: used after requirements are specified.
Advanced model: used when design is complete.
E = a x S^b x F
where E is effort in person-months, S is size in thousands of delivered source instructions (KDSI), and F is an adjustment factor (= 1 in the basic model).
Example: telephone switching system. E = 3.6 x (5000)^1.2 ≈ 100,000 person-months (the system is estimated to require approximately 5000 KDSI).
Process attributes:
Computer attributes:
Execution time constraints
Main storage constraints
Virtual machine volatility
Computer turnaround time
Personnel attributes:
Virtual machine experience
Analyst capability
Applications experience
Programmer capability
Language experience
Suppose the development time for a 3000 person-month embedded project:
2.5 x (3000)^0.38 ≈ 52 months
That is, the project requires about 58 (3000/52) staff working for 52 months to complete the software.
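As a sanity check, both COCOMO computations from the slides can be reproduced in Python. Note that the 52-month schedule corresponds to a schedule exponent of 0.38 (with an exponent of 0.35 the result would be about 41 months), so 0.38 is used here.

```python
# Basic COCOMO sketch: embedded-mode effort E = a * S**b (S in KDSI, F = 1),
# then the schedule equation T = 2.5 * E**d with the exponent that reproduces
# the slide's 52-month figure.

effort = 3.6 * 5000 ** 1.2       # ~99,000 person-months for 5000 KDSI
months = 2.5 * 3000 ** 0.38      # ~52 months for a 3000 person-month project
staff = 3000 / months            # ~57-58 people
print(round(effort), round(months), round(staff))
```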
Boehm and his colleagues have defined an updated COCOMO, called COCOMO II, because the original COCOMO is inflexible and inaccurate for newer techniques, such as the use of tools, reengineering, application generators, and object-oriented approaches.
Aspect | Stage 1           | Stage 3
Size   | Object points     | FP and language, or SLOC
Reuse  | Implicit in model | Equivalent SLOC as a function of other variables
Scale  | n/a               | 1.02 to 1.26, depending on conformity, precedentedness, early architecture, SEI process maturity, etc.
Cost drivers | COCOMO | COCOMO II Stage 1 | COCOMO II Stage 3
Product   | Reliability, database size, product complexity | None | Reliability, database size, documentation needs, product complexity
Platform  | Execution time constraints, main storage constraints, computer turnaround time | None | Execution time constraints, main storage constraints, platform difficulty
Personnel | Analyst capability, applications experience, programmer capability, programming language experience | None | Analyst capability, applications experience, programmer capability, language and tool experience, continuity
Project   | Use of modern programming practices, use of s/w tools, required development schedule | None | Use of software tools, required development schedule
Assuming that technical feasibility has been established, the customer really wants to know:
Product size, with a reasonable percentage variation;
A do-able schedule, with a reasonable variation;
The person power and dollar cost for development, with a reasonable variation;
A projection of the software modification and maintenance cost during the operational life of the system.
The Effort-Time Tradeoff Law
Size = Ck x K^(1/3) x td^(4/3)
where K is the life-cycle effort, td is the development time, and Ck is a state-of-technology constant that depends on the development environment; e.g., Ck = 10,040 for an environment with on-line interactive development, structured coding, less fuzzy requirements, and fairly unconstrained machine access.
Example: the SLIM software equation implies that a 10% decrease in elapsed time results in a 52% increase in total life-cycle effort.
To allow effort or duration estimation, Putnam introduced a second equation: D0 = K / td^3, where D0 is a constant called manpower acceleration.
Example: the manpower acceleration is 12.3 for new software with many interfaces and interactions with other systems, 15 for stand-alone systems, and 27 for re-implementations of existing systems. Combining the two equations (the software equation and the manpower-acceleration equation), the effort is:
K = (Size/Ck)^(9/7) x D0^(4/7)
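The algebra can be verified numerically; the size value below is illustrative only (Ck = 10,040 is the environment quoted above, D0 = 15 is the stand-alone-system value).

```python
# Putnam's SLIM relations: Size = Ck * K**(1/3) * td**(4/3) combined with
# D0 = K / td**3 gives K = (Size/Ck)**(9/7) * D0**(4/7).

size, ck, d0 = 100_000, 10_040, 15.0         # SLOC (invented), Ck, D0
k = (size / ck) ** (9 / 7) * d0 ** (4 / 7)   # effort K
td = (k / d0) ** (1 / 3)                     # development time

check = ck * k ** (1 / 3) * td ** (4 / 3)    # substitute back
print(round(check))                          # recovers the original size

# Effort-time tradeoff: with Size fixed, K scales as td**-4, so a 10%
# schedule cut raises effort by 0.9**-4 - 1, about 52%.
print(round((0.9 ** -4 - 1) * 100))          # 52
```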
The software differential equation is very useful because it can be solved step by step using the Runge-Kutta method. For example, for SIDPERS, the US Army's Standard Installation/Division Personnel System:
[Table: manpower profile at t = 0, 0.5, 1.0, 1.5, 2.0, 3.0, 3.5, 3.65, and 4.0 years]
Supplement
The overall life-cycle manpower curve can be well represented by the Norden/Rayleigh form:
dy/dt = 2 K a t e^(-a t^2)   (man-years/year)
where a = 1/(2 td^2), td is the time at which dy/dt is a maximum, and K is the area under the curve from t = 0 to infinity, representing the nominal life-cycle effort in man-years. The cumulative effort expended by the system at any time t is:
y(t) = K (1 - e^(-a t^2))
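A sketch of the Rayleigh curve with invented K and td values, confirming that roughly 39% of the life-cycle effort is expended by the staffing peak at td:

```python
# Norden/Rayleigh staffing curve: dy/dt = 2*K*a*t*exp(-a*t**2), a = 1/(2*td**2),
# with cumulative effort y(t) = K*(1 - exp(-a*t**2)). K and td are invented.
import math

K, td = 400.0, 2.0                 # life-cycle effort (man-years), peak time (years)
a = 1 / (2 * td ** 2)

def staffing_rate(t):
    return 2 * K * a * t * math.exp(-a * t ** 2)

def cumulative_effort(t):
    return K * (1 - math.exp(-a * t ** 2))

# At t = td the exponent is -1/2, so y(td)/K = 1 - e**-0.5 ~= 0.3935,
# which is the basis of the ~0.39*K development-cost fraction.
print(round(cumulative_effort(td) / K, 4))
```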
Neglecting the cost of computer test time, inflation over time, etc., the development cost is simply the average cost per man-year times the development effort:
$DEV = $COST/MY x (0.3944 K) ≈ 40% of $LC
where $LC is the life-cycle cost.
Exercises
1. According to the Rayleigh curve model, what is the effect of extending the delivery date by 20%?
2. Suppose that you are developing the software for a nuclear power plant control system. Select the most appropriate mode for the project, and use the COCOMO model to give a crude estimate of the total number of person-months required for the development, assuming that the estimated software size is 10,000 delivered source instructions.
9. Object-Oriented Metrics
An estimating process
1. Use analysis techniques, such as parts of speech and scenario scripts, to discover a majority of the key classes in the problem domain.
2. Categorize the type of user interface and assign a multiplier:
No UI: 2.0
Simple, text-based UI: 2.25
Graphic UI: 2.5
Complex, drag-and-drop GUI: 3.0
3. Multiply the number of key classes by the multiplier from step 2. This is the early estimate of the total number of classes in the final system.
4. Multiply the total number of classes from step 3 by a number between 15 and 20 (person-days per class), based on factors such as:
The ratio of experienced to novice OO personnel
The number of reusable domain objects in the reuse library
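The four-step process above can be sketched as a small function; the key-class count and the 18 person-days-per-class figure are invented examples within the stated 15-20 range.

```python
# Sketch of the class-count estimating process (steps 1-4 above).

UI_MULTIPLIER = {
    "none": 2.0, "simple text": 2.25, "graphic": 2.5, "drag-and-drop": 3.0,
}

def estimate(key_classes, ui_type, person_days_per_class=18):
    total_classes = key_classes * UI_MULTIPLIER[ui_type]      # steps 2-3
    return total_classes, total_classes * person_days_per_class  # step 4

classes, effort = estimate(30, "graphic")   # 30 key classes, GUI front end
print(classes, effort)                      # 75.0 classes, 1350.0 person-days
```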
Example: Order Processing System use case diagram (extracted from OOSE courseware Chapter 4).
(*) The research by Gustav Karner of Objectory AB, 1993. Geri Schneider and Jason P. Winters, Applying Use Cases: A Practical Guide, 2nd Ed., Addison-Wesley, Boston, 2001.
Weighting actors:

Actor type | Description                                | Factor
Simple     | Program interface                          | 1
Average    | Interactive, or protocol-driven, interface | 2
Complex    | Graphical interface                        | 3

where
A Simple Actor represents another system with a defined application programming interface.
An Average Actor is either another system that interacts through a protocol such as TCP/IP, or a person interacting through a text-based interface such as an ASCII terminal.
A Complex Actor is a person interacting through a graphical user interface.
So: 2 simple x 1 = 2; 1 average x 2 = 2; 3 complex x 3 = 9. The total actor weight for OPS = 13.
Analysis class-based weighting factors:

Use case type | Description                   | Factor
Simple        | Fewer than 5 analysis classes | 5
Average       | 5 to 10 analysis classes      | 10
Complex       | More than 10 analysis classes | 15
Fill Order: average; Shipping Order: simple; Send Email: simple.
So: 6 simple x 5 = 30; 4 average x 10 = 40; 0 complex x 15 = 0. Total use case weight for OPS = 30 + 40 + 0 = 70.
[Table: the 13 technical factors and their weights: 2, 1, 1, 1, 1, 0.5, 0.5, 2, 1, 1, 1, 1, 1]
Rating the factors for National Widgets (say): 2 + 3 + 5 + 1 + 0 + 2 + 2 + 0 + 3 + 5 + 3 + 1 + 0 = 27 TCF = 0.6 + (0.01 * 27) = 0.87
[Table: the 8 environment factors and their weights: 1.5, 0.5, 1, 0.5, 1, 2, -1, -1]
Factor number and description                       | Value (extended)
F1: Most of team (unfamiliar)                       | 1.5
F2: Most of team (no application experience)        | 0.5
F3: Most of team (no OO experience)                 | 1
F4: Lead analyst capability (good)                  | 3
F5: Motivation (really eager)                       | 5
F6: Stable requirements (not enough)                | 5
F7: Part-time workers (none)                        | 0
F8: Difficult programming language, Java (looking for) | -1
The rating for OPS: Efactor = 1.5 + 0.5 + 1 + 3 + 5 + 5 + 0 - 1 = 15; EF = 1.4 + (-0.03 x 15) = 0.95.
Use case points: UCP = UUCP x TCF x EF.
The use case points for National Widgets, and the basis of the final estimate of time to complete the project, are UCP = 83 x 0.87 x 0.95 ≈ 68.6.
Project estimate.
Suppose we use a factor of 28 person-hours per UCP. Then 68.6 x 28 ≈ 1921 hours, or about 46 weeks at 42 hours a week for one person.
Note that the factor of 28 person-hours/UCP is used because there are two negative factors in the environment factors for team and weights. With 4 people in a team there are no great problems of communication or synchronization of effort, and you may assume that all team members work full-time: about 11 months of effort plus 2 weeks for working out any team issues. The values of TCF, EF, and the person-hours-per-UCP factor differ between organizations and are estimated from experience.
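The whole use-case-point computation for the Order Processing System example can be reproduced in Python (the function and variable names are for illustration only):

```python
# End-to-end use-case-point calculation, reproducing the slides' numbers.

def ucp(actors, use_cases, tcf, ef):
    actor_weights = {"simple": 1, "average": 2, "complex": 3}
    case_weights = {"simple": 5, "average": 10, "complex": 15}
    uaw = sum(actor_weights[k] * n for k, n in actors.items())
    uucw = sum(case_weights[k] * n for k, n in use_cases.items())
    uucp = uaw + uucw                    # unadjusted use case points
    return uucp, uucp * tcf * ef

actors = {"simple": 2, "average": 1, "complex": 3}   # actor weight 13
cases = {"simple": 6, "average": 4, "complex": 0}    # use case weight 70
uucp, points = ucp(actors, cases, tcf=0.87, ef=0.95)
hours = points * 28                      # 28 person-hours per UCP
print(uucp, round(points, 1), round(hours))   # 83, 68.6, 1921
```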
151 WTH07
152 WTH07
How and Who: the identification of tools, techniques and staff available.
153 WTH07
154 WTH07
Example: the importance of understanding the effects of tool use on productivity. Suppose that evaluating tool use is one of the major goals for a project. Several questions derive from this goal, including:

Which tools are used? Who is using the tool? How much of the project is affected by each tool? How much experience do developers have with the tool? What is productivity with tool use? Without tool use? What is product quality with tool use? Without tool use?
155 WTH07
Example: Suppose that a project involves developing a complex piece of software, including a large database of sensor data. The sensor data capture routines are being written in C, while the data storage and manipulation routines are being written in Smalltalk. The project manager wants to track code productivity. His/her metrics plan includes a GQM-derived question: Is the productivity for C development the same as for Smalltalk development? Productivity will be measured as size per person-day, where size can be measured in three ways: a count of objects and methods (operations), a count of lines of code, or a count of function points. Thus, the goals tell us why we are measuring, and the questions, metrics, and models tell us what to measure.
156 WTH07
157 WTH07
158 WTH07
Managed:
Quantitative process management Software quality management
Optimizing:
Defect prevention Technology change management Process change management
159 WTH07
160 WTH07
Level | Focus | Process Areas
5 Optimizing | Continuous process improvement | Organizational innovation and deployment; Causal analysis and resolution
4 Quantitatively Managed | Quantitative management | Organizational process performance; Quantitative project management
3 Defined | Process standardization | Requirements development; Technical solution; Product integration; Verification; Validation; Organizational process focus; Organizational process definition; Organizational training; Integrated project management; Integrated supplier management; Risk management; Decision analysis and resolution; Organizational environment for integration; Integrated teaming
2 Managed | Basic project management | Requirements management; Project planning; Project monitoring and control; Supplier agreement management; Measurement and analysis; Process and product quality assurance; Configuration management
1 Performed | n/a | None
161 WTH07
XP's target is small to medium-sized teams (fewer than 10 people) building software with vague or rapidly changing requirements. The XP life cycle has four basic activities:

Continual communication with the customer and within the team; simplicity, achieved by a constant focus on minimalist solutions; rapid feedback through mechanisms such as unit and functional tests; and the courage to deal with problems proactively.
162 WTH07
163 WTH07
Pair programming: All production code is written by two programmers at one machine.
Collective ownership: Anyone can improve any system code anywhere at any time.
Continuous integration: Integrate and build the system many times a day (every time a task is finished); run continual regression tests when requirements change.
40-hour weeks: Work no more than 40 hours per week whenever possible; never work overtime two weeks in a row.
On-site customer: Have an actual user on the team full-time to answer questions.
Coding standards: Have rules that emphasize communication throughout the code.
164 WTH07
165 WTH07
Level | KPA | XP
3 | Software product engineering | ++
3 | Intergroup coordination | ++
3 | Peer review | ++
4 | Quantitative process management | --
4 | Software quality management | +
5 | Defect prevention | +
5 | Technology change management | --
5 | Process change management | --

Note: ++ largely addressed in XP; + partially addressed in XP; -- not addressed in XP.
166 WTH07
Note that the CMM supports a range of implementations through 18 key process areas (KPAs) and 52 goals that comprise the requirements for a fully mature software process. As systems grow, some XP practices become more difficult to implement; XP is targeted toward small teams working on small to medium-sized projects.
167 WTH07
168 WTH07
Siemens metrics

Quality metrics:
- Number of defects counted during code review, quality control, pilot test, and the first year of customer installation, per KLOC.
- Total number of defects received per fiscal year.
- Total number of field problem reports received.

Productivity metrics:
- Development cost per KLOC; maintenance cost per defect.
- Total gross lines of code delivered to customers per total staff-month.
- KLOC per staff-month of development effort.
- KLOC per month of development time.

Profitability metrics:
- Sales in DM per software development cost, for fiscal year and product line.
- Sales in DM per total cost of development, maintenance, and marketing, for fiscal year and product line.
170 WTH07
Hitachi Software Engineering (HSE): 98% of the projects were completed on time, and 99% of the projects cost between 90 and 110% of the original estimate.
171 WTH07
172 WTH07
173 WTH07
People
174 WTH07
Implementation
175 WTH07
176 WTH07
As software becomes more pervasive and software quality more critical, measurement programs will become more necessary. Rubin's report (1990): among 300 major US IT companies (with no fewer than 100 IT staff), sixty succeeded in implementing measurement programs. Success means:
The measurement program results were actively used in decision making; The results were communicated and accepted outside of the IT department; The program lasted longer than two years.
177 WTH07
178 WTH07
179 WTH07
Lipow used Halstead's theory to define a relationship between fault density and size:

d/L = A0 + A1 ln L + A2 (ln L)^2, where d is the number of faults, L is the size in LOC, and each Ai depends on the average number of uses of operators and operands per line of code for a particular language. For instance, for Fortran A0 = 0.0047, A1 = 0.0023, and A2 = 0.000043; for assembly language, 0.0012, 0.0001, and 0.000002, respectively.

Gaffney argued that the relationship between d and L was not language dependent, thus d = 4.2 + 0.0015 L^(4/3)
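The two fault-prediction models can be sketched directly from the formulas above. The coefficient tuples are the (A0, A1, A2) values quoted for Fortran and assembly; the function names and the 10,000-LOC example are our own assumptions.

```python
import math

# Sketch of the Lipow and Gaffney fault-prediction models quoted above.

FORTRAN = (0.0047, 0.0023, 0.000043)
ASSEMBLY = (0.0012, 0.0001, 0.000002)

def lipow_faults(loc, coeffs):
    """Lipow: d/L = A0 + A1*ln(L) + A2*(ln L)^2, so d = L * (...)."""
    a0, a1, a2 = coeffs
    ln_l = math.log(loc)
    return loc * (a0 + a1 * ln_l + a2 * ln_l ** 2)

def gaffney_faults(loc):
    """Gaffney: d = 4.2 + 0.0015 * L^(4/3), language independent."""
    return 4.2 + 0.0015 * loc ** (4.0 / 3.0)

# Predicted faults for a hypothetical 10,000-LOC program:
print(round(lipow_faults(10_000, FORTRAN), 1))
print(round(gaffney_faults(10_000), 1))
```

For the same size, the assembly coefficients predict far fewer faults than the Fortran ones, which is one way to see how language-dependent Lipow's model is.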
180 WTH07
181 WTH07
11 Software Estimation
Note: The materials in this chapter are excerpted from [McConnell06]. Readers interested in software estimation are strongly recommended to read McConnell's book to obtain more solid knowledge of the subject.
182 WTH07
183 WTH07
Project Size.
easily determining effort, cost, and schedule
Relationship between project size and productivity. (*)

Project Size (in LOC) | LOC per Staff-Year (COCOMO II nominal in parentheses)
10K | 2,000-25,000 (3,200)
100K | 1,000-20,000 (2,600)
1M | 700-10,000 (2,000)
10M | 300-5,000 (1,600)
184 WTH07
Personnel Factors.
Personnel factors exert significant influence on project outcomes. According to COCOMO II: a 100,000 LOC project:
Factor | Best Rank | Worst Rank
Requirements analysis capability | -29% | +42%
Programmer capability (general) | -24% | +34%
Personnel continuity (turnover) | -19% | +29%
Applications (business area) experience | -19% | +22%
Language and tools experience | -16% | +20%
Platform experience | -15% | +19%
Team cohesion | -14% | +11%
Example: a project with the worst requirements analysis capability would require 42% more effort than nominal.
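Applying a personnel factor to a nominal estimate is simple scaling. In this sketch, the percentages come from the table above, while the 1,000 staff-month nominal effort is a made-up input.

```python
# Illustrative use of the COCOMO II personnel-factor table: scale a
# nominal effort estimate by a factor's percentage swing.

def adjusted_effort(nominal, pct_change):
    """pct_change is in percent: +42 (worst) or -29 (best) for
    requirements analysis capability, per the table above."""
    return nominal * (1 + pct_change / 100.0)

nominal = 1000  # staff-months (hypothetical)
print(adjusted_effort(nominal, 42))   # 1420.0 -- worst-case analysts
print(adjusted_effort(nominal, -29))  # 710.0  -- best-case analysts
```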
185 WTH07
Note: If you don't have a choice about the programming language, this point is not relevant to your estimate. Otherwise, using Java, C#, or VB would tend to be more productive than using C, Cobol, or macro assembler. 186 WTH07
Estimation on software projects interplays with business targets, commitments, and control.
Estimate: a prediction of how long a project will take and how much it will cost. Target: a statement of a desirable business objective.
Example: We must limit the cost of the next release to $2 million, because that is the maximum budget we have for the release.
Commitment: a promise to deliver defined functionality at a specific level of quality by a promised date. Control: typical activities are to remove non-critical requirements, to redefine requirements, and to replace less-experienced staff with more-experienced staff.
187 WTH07
188 WTH07
189 WTH07
Estimation error by software development activity (according to the figure in the last slide) [Boehm00].

Phase | Possible Error on Low Side | Possible Error on High Side | Range of High to Low Estimates
Initial Concept | 0.25x (-75%) | 4.0x (+300%) | 16x
Approved Product Definition | 0.50x (-50%) | 2.0x (+100%) | 4x
Requirements Complete | 0.67x (-33%) | 1.5x (+50%) | 2.25x
User Interface Design Complete | 0.80x (-20%) | 1.25x (+25%) | 1.6x
Detailed Design Complete (for sequential projects) | 0.90x (-10%) | 1.10x (+10%) | 1.2x
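The multipliers above turn a single-point estimate into a low/high range for each phase. A minimal sketch, where the 12-staff-month point estimate is hypothetical:

```python
# Applying the cone-of-uncertainty multipliers to a point estimate.

CONE = {  # phase: (low multiplier, high multiplier)
    "Initial Concept": (0.25, 4.0),
    "Approved Product Definition": (0.50, 2.0),
    "Requirements Complete": (0.67, 1.5),
    "User Interface Design Complete": (0.80, 1.25),
    "Detailed Design Complete": (0.90, 1.10),
}

def estimate_range(point_estimate, phase):
    low, high = CONE[phase]
    return point_estimate * low, point_estimate * high

# A 12-staff-month guess made at Initial Concept could really mean:
print(estimate_range(12, "Initial Concept"))  # (3.0, 48.0)
```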
190 WTH07
191 WTH07
Unstable Requirements.
The challenges of unstable requirements.
If requirements cannot be stabilized, estimate variability will remain high through the end of the project. Requirements changes are often not tracked and the project is often not reestimated.
So, in those cases, consider the development approaches that are designed to work in short iterations (agile methods), such as Scrum, Extreme Programming, time box development, etc.
Scrum: an agile method with strong promotion of self-organizing teams, daily team measurement, and avoidance of following predefined steps. Key practices: a daily stand-up meeting with special questions, 30-calendar-day iterations, and a demo to external stakeholders at the end of each iteration.
192 WTH07
12.1 Agility
Agility is dynamic, context-specific, aggressively change-embracing, and growth-oriented. (Goldman 1997). An agile process is both light and sufficient.
Lightness: staying maneuverable Sufficient: a matter of staying in the game, i.e., delivering software
194 WTH07
User stories.
A user story is a description of functionality that will be valuable to either a user or purchaser of a system. Examples: buying books through the Internet (*):

- A user can search for books by author, title, or ISBN number.
- A user can view detailed information on a book: for example, number of pages, publication date, and contents.
- A user can put books into a shopping cart and buy them when she is done shopping.
- A user can remove books from her cart before completing an order.
- A user enters her billing address, the shipping address, and credit card information.
- A user can establish an account that remembers shipping and billing information.
- A user can edit her account information (credit card, shipping address, billing address, and so on).
(*) Mike Cohn, User Stories Applied for Agile Software Development, Addison-Wesley, Boston, 2004. 195 WTH07
197 WTH07
Story points.
A story point as an ideal day of work (that is, a day without any interruptions whatsoever). A story point as an ideal week of work. A story point as a measure of the complexity of the story.
198 WTH07
Estimate as a team
Gather together the customer and the developers who will participate in creating the estimates. Estimate, and converge on a single estimate that can be used for the story.
199 WTH07
200 WTH07
201 WTH07
Question: Assuming one-week iterations and a team of four developers, how many iterations will it take the team to complete a project with 27 story points if they have a velocity of 4?
Answer: With a velocity of 4 and 27 story points in the project, it will take the team 7 iterations to finish. (Note that the number of iterations is an integer: 27/4 = 6.75 rounds up to 7.)
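The iteration count is just story points divided by velocity, rounded up to whole iterations, as a one-line sketch:

```python
import math

# Iterations needed = ceil(story points / velocity).

def iterations_needed(story_points, velocity):
    return math.ceil(story_points / velocity)

print(iterations_needed(27, 4))  # 7
```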
202 WTH07
Responsibilities
Developers
Defining story points Giving honest estimates Estimating as a team All two-point stories should be similar.
Customer
Participating in estimation meetings. Playing the role of answering questions and clarifying stories. Don't estimate stories yourself.
Question: If three programmers individually estimate the story at 2, 4, and 5 story points, which estimate should they use?

Answer: They should continue discussing the story until their estimates converge.
203 WTH07
204 WTH07
205 WTH07
The answer: the figure on the last page plus the cumulative story point chart:
The cumulative story point chart (above) shows the total number of story points completed through the end of each iteration.
206 WTH07
207 WTH07
The team actually completed 45 - 10 - 18 = 17, so 113 story points still remain. 208 WTH07
From the slope of the burndown line after the 1st iteration, the project would not be finished after 3 iterations.
209 WTH07
210 WTH07
Question: What conclusions should you draw from the following figure? Does the project look like it will finish ahead, behind, or on schedule?
Answer: The team started out a little better than anticipated in the first iteration. They expect velocity to improve in the second and third iterations and then stabilize. After two iterations they have already achieved the velocity they expected after three iterations. At this point they are ahead of schedule but you should be reluctant to draw too many firm conclusions after only two iterations.
211 WTH07
Question: What is the velocity of the team that finished the iteration shown in the following table?
Story | Story Points | Status
Story 1 | 4 | Finished
Story 2 | 3 | Finished
Story 3 | 5 | Finished
Story 4 | 3 | Half finished
Story 5 | 2 | Finished
Story 6 | 4 | Not started
Story 7 | 2 | Finished
Total | 23 |

Answer: Velocity counts only finished stories, so velocity = 4 + 3 + 5 + 2 + 2 = 16.
212 WTH07
Question: Complete the following table by writing the missing values into the table.

 | Iter-1 | Iter-2 | Iter-3
Story points at start of iteration | 100 | ? | ?
Completed during iteration | 35 | 40 | 36
Change in estimate | 5 | -5 | 0
Story points from new stories | 6 | 3 | 2
Story points at end of iteration | 76 | ? | ?
213 WTH07
Answer:

 | Iter-1 | Iter-2 | Iter-3
Story points at start of iteration | 100 | 76 | 34
Completed during iteration | 35 | 40 | 36
Change in estimate | 5 | -5 | 0
Story points from new stories | 6 | 3 | 2
Story points at end of iteration | 76 | 34 | 0
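The table's bookkeeping can be sketched directly: story points at the end of an iteration = points at start - completed + change in estimates + new stories.

```python
# Story-point bookkeeping across iterations, per the exercise above.

def end_of_iteration(start, completed, change, new_stories):
    return start - completed + change + new_stories

remaining = 100
for completed, change, new in [(35, 5, 6), (40, -5, 3), (36, 0, 2)]:
    remaining = end_of_iteration(remaining, completed, change, new)
    print(remaining)  # 76, then 34, then 0
```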
214 WTH07
Estimate the user stories: estimate each new feature that has some reasonable possibility of being selected for inclusion in the upcoming release. Then, in any sequence:

Select an iteration length: most agile teams work in two- or four-week iterations. Estimate velocity: make an informed estimate of velocity based on past results. Prioritize user stories: prioritize the features the product owner wants to develop.

Select stories and a release date: estimate the team's velocity per iteration and derive the number of iterations.
Iterate until the conditions of satisfaction for the release can best be met.
215 WTH07
User stories are different from IEEE 830 software requirements specifications.
Documenting a system's requirements following IEEE 830 is tedious, error-prone, and very time-consuming.
User stories
- emphasize verbal communication.
- are comprehensible to everyone.
- are the right size for planning.
- work for iterative development.
- encourage deferring detail.
- support opportunistic design.
- encourage participatory design.
- build up tacit knowledge.
Some Comments
Martin Fowler and Kent Beck: asking a developer for a percentage of completeness for a task generates a nearly meaningless answer.

Developers are often 90% complete in a matter of days, 95% complete in a month, 99% complete in six months, and leave the work 99.9% complete. As a manager, what can you do? Don't ask teams for a percentage complete.

Instead, ask the teams what percentage of the features or user stories are complete.

Feature Driven Development (FDD) (*) uses the percentage of completeness of each feature to produce summary progress reports.
218 WTH07
- A user can search for books by author, title, or ISBN number.
- A user can view detailed information on a book: for example, number of pages, publication date, and a brief description.
- A user can put books into a shopping cart and buy them when she is done shopping.
- A user can remove books from her cart before completing an order.
- To buy a book the user enters her billing address, the shipping address, and credit card information.
- A user can rate and review books.
- A user can establish an account that remembers shipping and billing information.
- A user can edit her account information (credit card, shipping address, billing address, and so on).
(*) [Cohn06]. 219 WTH07
- A user can put books into a wish list that is visible to other site visitors.
- A user can place an item from a wish list (even someone else's) into his or her shopping cart.
- A repeat customer must be able to find one book and complete an order in less than 90 seconds. (Constraint)
- A user can view a history of all of his past orders.
- A user can easily re-purchase items when viewing past orders.
- The site always tells a shopper what the last 3 items she viewed are and provides links back to them.
- A user can see what books we recommend on a variety of topics.
- A user, especially a Non-Sailing Gift Buyer, can easily find the wish lists of other users.
220 WTH07
- A user can choose to have items gift wrapped.
- A Report Viewer can see reports of daily purchases broken down by book category, traffic, best- and worst-selling books, and so on.
- A user must be properly authenticated before viewing reports.
- Orders made on the website have to end up in the same order database as telephone orders. (Constraint)
- An administrator can add new books to the site.
- An administrator needs to approve or reject reviews before they are available on the site.
- An administrator can delete a book.
- An administrator can edit the information about an existing book.
221 WTH07
- A user can check the status of her recent orders. If an order hasn't shipped, she can add or remove books, change the shipping method, the delivery address, and the credit card.
- The system must support peak usage of up to 50 concurrent users. (Constraint)
222 WTH07
Plan by Feature -> Design by Feature -> Build by Feature

Design by Feature produces a design package (sequence diagrams, adding more content to the object model); Build by Feature produces a completed client-valued function.
223 WTH07
Feature.
A feature is a very specific, small, client-valued function expressed in the form: <action> the <result> <by|for|of|to> (an) <object>

Small: 1-10 days of effort are required to complete the feature; most are 1-3 days. Client-valued: the feature is relevant and has meaning to the business.
Examples:
- Calculate the total of a sale. (a calculateTotal() operation in a Sale class)
- Validate the PIN number for a bank account. (a validate() operation on a BankAccount class)
- Authorize a loan for a customer. (an authorize() operation in a Customer class)
224 WTH07
Review Articles
R1: GQM Trees R2: Software Cost Estimation R3: Function Points R4: COCOMO Model R5: Putnam Model R6: Software Science Measurements
225 WTH07
226 WTH07
227 WTH07
228 WTH07
Mathematical models. The well-known models are the COCOMO effort model, Rayleigh curve models, and Albrecht's function point model.
229 WTH07
230 WTH07
Levels of complexity.
Item | Simple | Average | Complex
External input | 3 | 4 | 6
External output | 4 | 5 | 7
User inquiry | 3 | 4 | 6
External file | 7 | 10 | 15
Internal file | 5 | 7 | 10
231 WTH07
The computation of TCF is completed using the experimentally derived formula:

TCF = 0.65 + 0.01 * sum(fi)

where the fi are the 14 detailed factors contributing to the overall notion of complexity. Each fi ranges from 0 to 5, with 0 being irrelevant and 5 standing for essential. So, TCF = 0.65 means all factors are rated as irrelevant; TCF = 1.35 means all factors are essential.
233 WTH07
Weather Information
External inputs: none. External outputs: 1 (update display). User inquiries: 1 (update request). External files: 2 (weather sensor, weather data). Internal files: 1 (logs). Rating every item as complex: UFC = 1*7 + 1*6 + 2*15 + 1*10 = 53
If we consider adjusted point count FP, the range of possible values spreads from 34.45 (0.65*53) to 71.55 (1.35*53).
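The weather example's unadjusted function count and adjusted-FP range can be sketched with the table's weights. The dictionary layout and function names are our own; the TCF bounds 0.65 and 1.35 come from the formula above.

```python
# Unadjusted function count (UFC) for the weather example, with every
# item rated at the 'complex' level, plus the adjusted-FP range.

WEIGHTS = {  # item: (simple, average, complex)
    "external input": (3, 4, 6),
    "external output": (4, 5, 7),
    "user inquiry": (3, 4, 6),
    "external file": (7, 10, 15),
    "internal file": (5, 7, 10),
}
COMPLEX = 2  # column index of the 'complex' rating

def ufc(counts, level=COMPLEX):
    """counts maps item type to number of occurrences; here every item
    is rated at the same complexity level for simplicity."""
    return sum(n * WEIGHTS[item][level] for item, n in counts.items())

weather = {"external output": 1, "user inquiry": 1,
           "external file": 2, "internal file": 1}
u = ufc(weather)
print(u)                                       # 53
print(round(0.65 * u, 2), round(1.35 * u, 2))  # 34.45 71.55
```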
234 WTH07
235 WTH07
The COCOMO model is the most complete and thoroughly documented model used in effort estimation. It is based on Boehm's analysis of a database of 63 software projects. There are 3 classes of systems:

Embedded. This class of systems is characterized by tight constraints, a changing environment, and unfamiliar surroundings (such as real-time software systems, e.g., avionics, aerospace, medicine). Organic. This category has a stable environment, familiar surroundings, and relaxed interfaces (such as simple business systems, data processing, small software libraries). Semidetached. The software systems falling under this category are a mix of organic and embedded nature (such as operating systems, database management systems, and inventory management systems).
236 WTH07
237 WTH07
Maintenance effort
Effortmaintenance = ACT*Effort
ACT (annual change traffic) is a fraction of KDLOC undergoing change during the year.
238 WTH07
The Intermediate COCOMO Model: a refinement of the basic model. The improvement comes in the form of 15 attributes of the product. Each attribute is rated on the following six-point scale:
VL (very low) LO (low) NM (nominal) HI (high) VH (very high) XH (extra high)
239 WTH07
Personnel attributes
Analysis capability (ACAP). Application experience (AEXP), language experience (LEXP), and virtual machine experience (VEXP).
Computer attributes
Execution time (TIME) and memory (STOR) constraints. Virtual machine volatility (VIRT). Development turnaround time (TURN).
Project attributes
Modern development practices (MODP). Use of software tool (TOOL). Schedule effects (SCED)
240 WTH07
Attribute | LO | NM | HI | VH
RELY (required reliability) | 0.88 | 1.00 | 1.15 | 1.40
DATA (database size) | 0.94 | 1.00 | 1.08 | 1.16
CPLX (product complexity) | 0.85 | 1.00 | 1.15 | 1.30
TIME (execution time constraint) | n/a | 1.00 | 1.11 | 1.30
STOR (memory constraint) | n/a | 1.00 | 1.06 | 1.21
VIRT (virtual machine volatility) | 0.87 | 1.00 | 1.15 | 1.30
TURN (turnaround time) | 0.87 | 1.00 | 1.07 | 1.15
ACAP (analyst capability) | 1.19 | 1.00 | 0.86 | 0.71
AEXP (application experience) | 1.13 | 1.00 | 0.91 | 0.82
PCAP (programmer capability) | 1.17 | 1.00 | 0.86 | 0.70
LEXP (language experience) | 1.07 | 1.00 | 0.95 | n/a
VEXP (virtual machine experience) | 1.10 | 1.00 | 0.90 | n/a
MODP (modern practices) | 1.10 | 1.00 | 0.91 | 0.82
TOOL (use of software tools) | 1.10 | 1.00 | 0.91 | 0.83
SCED (schedule constraint) | 1.08 | 1.00 | 1.04 | 1.10
Depending upon the product, each attribute is rated, and these partial results are multiplied to give the final product multiplier (P). The effort formula is expressed as follows: Effort = Effortnom * P, where Effortnom arises in the following form: Effortnom = 2.8 * KDLOC^1.20 for embedded systems; Effortnom = 3.2 * KDLOC^1.05 for organic systems; Effortnom = 3.0 * KDLOC^1.12 for semidetached systems. The support effort is calculated using the following formula: Effortmaintenance = ACT * Effortnom * P
242 WTH07
Example
Suppose a software system with an estimated size of 300 KDLOC. The software is part of the control system of a smart vehicle initiative. The system collects the readings from various sensors, processes them, and develops a schedule of pertinent control actions. This is an embedded system. The basic form of the cost estimation model leads to a person-month effort of Effort = 3.6 * 300^1.20 = 3379 person-months, and a development time of M = 2.5 * 3379^0.32 = 33.66 months.
243 WTH07
The scaling factor P = 1.6095. The nominal effort equals 2.8 * 300^1.20 = 2628 person-months. The modified result is 2628 * 1.6095 = 4229 person-months.
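The basic and intermediate COCOMO formulas used in the example can be sketched as below (embedded mode, 300 KDLOC, cost-driver product P = 1.6095); results differ from the slide figures only by rounding.

```python
# Sketch of the basic and intermediate COCOMO effort formulas
# for the embedded-mode example above.

def basic_effort(kdloc, a=3.6, b=1.20):
    """Basic COCOMO, embedded: Effort = 3.6 * KDLOC^1.20 person-months."""
    return a * kdloc ** b

def schedule_months(effort, c=2.5, d=0.32):
    """Development time, embedded: M = 2.5 * Effort^0.32 months."""
    return c * effort ** d

def intermediate_effort(kdloc, p, a=2.8, b=1.20):
    """Intermediate COCOMO, embedded: Effort = 2.8 * KDLOC^1.20 * P."""
    return a * kdloc ** b * p

print(round(basic_effort(300)))                 # ~3379 person-months
print(round(schedule_months(3379), 1))          # ~33.7 months
print(round(intermediate_effort(300, 1.6095)))  # ~4230 person-months
```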
244 WTH07
Manpower Loading
245 WTH07
a = 1/(2td^2) is a shape parameter for the distribution; dy/dt is a maximum when t = td, where td is the time at which the average team size is maximum; k is the area under the curve and has the dimensions of effort. On the curve, the area to the left of td is the effort (40%) for software specification and development, while the area to the right of td is the maintenance effort (60%) required after delivery of the software.
246 WTH07
Software Equation:
Ss = ck * k^x * t^y, where Ss is the software size in source code statements; ck is a constant of proportionality that can be correlated with the adequacy of the technical environment for the type of effort concerned, i.e., on-line interactive development, structured coding, less fuzzy requirements, machine access constraints, etc.; k is the area under the Rayleigh curve, representing effort, and its exponent here is x = 1/3; t is the time, and its exponent is y = 4/3.
247 WTH07
Given that, for a particular task and environment, Ss and ck can be regarded as properties of that task, and therefore constants:

E * T^4 = constant

This equation expresses the underlying relationship between effort and time-scale for software development. So, small incremental or decremental changes to time will result in rather large concomitant changes in effort.
248 WTH07
Example. Suppose a project with an effort estimate of about 25 person-years over 2 elapsed years of time. If the duration must be reduced to 18 months, the predictable effects of this, according to Putnam's derivation:

The intrinsic property of this estimate is given by 25 * 2^4 = constant = 400, say. In the new circumstance, effort * (1.5)^4 = constant = 400. It follows that a new value of effort is required, and this may be computed from E = 400/(1.5)^4 = 79 person-years. In other words, a 25% decrease in time-scale has led to an increase of 216% in the effort required.
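The trade-off above follows mechanically from E * T^4 = constant, as this sketch shows:

```python
# Putnam's time/effort trade-off: E * T^4 stays constant for a given
# task, so compressing the schedule inflates the effort dramatically.

def effort_for_new_duration(effort0, time0, time1):
    """Given effort E0 at duration T0, return E1 with E*T^4 constant."""
    return effort0 * time0 ** 4 / time1 ** 4

# 25 person-years over 2 years, compressed to 18 months (1.5 years):
e = effort_for_new_duration(25, 2.0, 1.5)
print(round(e, 1))  # 79.0 person-years, i.e. about 3.16x the effort
```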
249 WTH07
The measured program:

    public void paint(Graphics g)
    {
        print(g, "Sequence in original order", a, 30, 30);
        sort();
        print(g, "results", a, 30, 60);
    }

    public void sort()
    {
        for (int pass = 1; pass < a.length; pass++)
            for (int i = 0; i < a.length; i++)
                if (a[i] > a[i+1])
                {
                    hold = a[i];
                    a[i] = a[i+1];
                    a[i+1] = hold;
                }
    }

Token counts:

Operators (eta1 = 12): public 2, void 2, () 2, ; 6, int 3, ++ 2, [] 3, {} 4, for 2, = 2, + 2, > 1. Total N1 = 31.

Operands (eta2 = 14): paint 1, Graphics 1, sort 1, 30 3, 60 1, a 5, 1 4, pass 3, a.length 2, i 7, hold 2, g 2, 0 1, print 2. Total N2 = 35.
250 WTH07
Program length: N^ = eta1 log2 eta1 + eta2 log2 eta2 = 12 log2 12 + 14 log2 14 = 96.32
Program volume: V = N log2 eta = 456 bits
Potential volume: V* = (2 + eta2*) log2 (2 + eta2*) = (2 + 3) log2 (2 + 3) = 11.61
Program level: L = V*/V = 0.025 (if V = V*, then L = 1; in general V > V*)
Difficulty: D = 1/L = 40
Estimated level: L^ = (2/eta1) * (eta2/N2) = 0.068
Effort: E = V/L^ = 6706
Time: T = E/beta = 373 sec, about 7 min, where beta = 18 is the Stroud number (in the range [5, 20])
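Halstead's measures for the example program can be sketched as below. Here eta1/eta2 are the distinct operator/operand counts, n2 the total operand occurrences, eta2_star the input/output parameter count (3 in the example), and 18 is one conventional choice of Stroud number; small rounding differences from the slide's figures are expected because the slides round intermediate values.

```python
import math

# Sketch of Halstead's software-science measures for the example above.

def halstead(eta1, eta2, n2, eta2_star, stroud=18):
    eta = eta1 + eta2
    n_hat = eta1 * math.log2(eta1) + eta2 * math.log2(eta2)  # est. length
    volume = n_hat * math.log2(eta)                          # in bits
    v_star = (2 + eta2_star) * math.log2(2 + eta2_star)      # potential volume
    level_hat = (2 / eta1) * (eta2 / n2)                     # est. program level
    effort = volume / level_hat                              # discriminations
    time_s = effort / stroud                                 # seconds
    return n_hat, volume, v_star, level_hat, effort, time_s

n_hat, v, v_star, l_hat, e, t = halstead(12, 14, 35, 3)
print(round(n_hat, 2), round(v), round(v_star, 2))  # 96.32 453 11.61
print(round(l_hat, 3), round(e), round(t))          # 0.067 6791 377
```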
251 WTH07
References
[Boehm00] Boehm, Barry, et al., Software Cost Estimation with COCOMO II, Addison-Wesley, Reading, MA, 2000.
[Cohn06] Cohn, Mike, Agile Estimating and Planning, Prentice Hall Professional, Upper Saddle River, NJ, 2006.
[Fenton97] Fenton, Norman E., and Shari Lawrence Pfleeger, Software Metrics: A Rigorous & Practical Approach, PWS, Boston, 1997.
[Lorenz94] Lorenz, Mark, and Jeff Kidd, Object-Oriented Software Metrics, PTR Prentice Hall, Englewood Cliffs, NJ, 1994.
[McConnell06] McConnell, Steve, Software Estimation, Microsoft Press, Redmond, WA, 2006.
[Müller93] Müller, K. H., and D. J. Paulish, Software Metrics: A Practitioner's Guide to Improved Product Development, IEEE Press, London, 1993.
[Palmer02] Palmer, Stephen, and John M. Felsing, A Practical Guide to Feature-Driven Development, Prentice Hall PTR, Upper Saddle River, NJ, 2002.
[Putnam78] Putnam, Lawrence R., "A General Empirical Solution to the Macro Software Sizing and Estimating Problem," IEEE Trans. on Software Engineering, Vol. SE-4, No. 4, 1978, pp. 345-361.
252 WTH07
[Schneider01] Schneider, Geri, and Jason P. Winters, Applying Use Cases: A Practical Guide, 2nd Edition, Addison-Wesley, Boston, 2001.
[Papers] Papers from IEEE Transactions on Software Engineering, IEEE Software, IEEE Computer, CACM, and JOOP.
253 WTH07