
Software Development Process Enhancement: a Framework for Decision-Making Support


Department of Computer Science and Civil Engineering

Ph.D. in Computer Science, Control and Geoinformation
Ph.D. Program Coordinator: Giovanni Schiavon

2010-2013

Advisor: Giovanni Cantone
Co-Advisor: Forrest Shull
Ph.D. Candidate: Manuel Mastrofini
2 | Software Development Process Enhancement: a Framework for Measurement-Based Decision-Making
























Page intentionally left blank.

Acknowledgements
At the very beginning, I thought it would have been fun writing this part of the thesis. Now that I am doing it, I actually believe it is a good thing that everyone who collaborated in this achievement be aware of it and receive the due recognition. This way, everyone will share his or her own responsibilities and will not be allowed to complain that it does not work, it is not clear, it is boring and blablabla. However, do not try to get too much credit: I did it! It is my treasure! You just somehow contributed!

Also, you may remember that in my Master thesis, other than a few typos, I mentioned everyone, name by name. This time I decided to be slightly less explicit and much more concise, so let's start!

Everything I have done, I dedicate to my family: my extended family, the ones I love. I hope they are proud of me, as this would make me proud of myself. And happy!
I owe profound and sincere gratitude to Prof. Giovanni Cantone, carrot and stick and continuous encouragement, as well as to Dr. Forrest Shull, illuminating guide.
I still strongly believe that, without friends, life would be very sad and ugly. I am lucky, because I have some of these friends who still walk at my side, as I walk at theirs, so that we are not alone when it is stormy where we are.

As of today, challenges keep coming, but I am steady. So, please, stay with me, do not leave me alone, and I will keep being steady and will keep walking at your side. As a reward, I will thank you also in my next acknowledgment section (which is going to be something that you really do not expect) and I might even offer you some good cake or a beer, as soon as I manage to get some free time.

Thanks!



Science demands nothing less than the fervent and unconditional dedication of our entire lives.

Table of Contents
Abstract .............................................................................................................................................. 7
Structure of the thesis ....................................................................................................................... 8

SECTION I: BEHIND THE CURTAIN
CHAPTER I. DECISION MAKERS IN THE FIELD OF SOFTWARE ENGINEERING: THE GPS METAPHOR ............... 10
CHAPTER II. THE ROLE OF EMPIRICAL SOFTWARE ENGINEERING ................................................................ 11
CHAPTER III. CONTRIBUTION OF THIS WORK: THE IMPACT VECTOR MODEL ................................................. 13
CHAPTER IV. CONCEPTUAL ARCHITECTURE OF THE MODEL ......................................................................... 14

SECTION II: IMPACT VECTOR MODEL
CHAPTER V. SHAPING THE MODEL ............................................................................................................ 16
V.I. ASSUMPTIONS .................................................................................................................................... 18
V.II. CHARACTERISTICS OF THE MODEL ..................................................................................................... 19
V.III. TAXONOMY ...................................................................................................................................... 19
CHAPTER VI. MATHEMATICAL FOUNDATION .............................................................................................. 20
VI.I. FORMULATION ................................................................................................................................... 20
VI.II. USING THE MODEL ........................................................................................................................... 22
CHAPTER VII. ADDRESSING THE OPEN PROBLEMS OF THE IMPACT VECTOR MODEL ...................................... 25
VII.I. ACCOUNTING FOR PARALLEL PROCESS ACTIVITIES ............................................................................. 26
VII.II. IMPACT VECTOR CONSTRAINTS ........................................................................................................ 27
VII.III. EMPIRICAL DEFINITION OF THE ACTIVITY PERFORMANCE FUNCTION AND COMBINATION FUNCTION .... 28

SECTION III: IMPACT VECTOR MODEL:
LEVERAGING THE EXCELLENCE
CHAPTER VIII. WHERE DO IMPACT VECTORS COME FROM? .......................................................................... 32
CHAPTER IX. ENABLING VALUE-BASED DECISIONS VIA THE IMPACT VECTOR MODEL ................................... 32
CHAPTER X. QIP AND GQM: CONTINUOUS PROCESS IMPROVEMENT VIA IMPACT VECTORS ........................ 32
X.I. QIP: QUALITY IMPROVEMENT PARADIGM ............................................................................................ 36
X.II. GQM+S: GOAL, QUESTION, METRIC + STRATEGIES ............................................................................. 37
X.III. MODELING CHOICES AND ALTERNATIVES WITH IMPACT VECTORS ...................................................... 39
CHAPTER XI. IMPACT VECTOR MODEL: POINTS OF CONTACT WITH CMMI, ISO/IEC 15504 AND SIX SIGMA .... 41

SECTION IV: IMPACT VECTOR FRAMEWORK
CHAPTER XII. IMPACT VECTOR SCHEMA: ORGANIZE AND STORE IMPACT VECTORS ...................................... 46
XII.I. IMPACT VECTOR SCHEMA: MAIN REQUIREMENTS ............................................................................... 49
XII.II. IMPACT VECTOR SCHEMA: ARCHITECTURE AND TECHNOLOGY ........................................................... 50

CHAPTER XIII. USING IBM COGNOS INSIGHT TO VISUALIZE, EXPLORE, AND ANALYZE IMPACT VECTORS ....... 52
CHAPTER XIV. IBM METHOD COMPOSER & IBM TEAM CONCERT: INFUSE, ESTIMATE, MANAGE, MEASURE,
AND IMPROVE WITH IMPACT VECTORS .......................................................................................................... 54

SECTION V: IMPACT VECTOR MODEL:
EXPERIMENTATION
CHAPTER XV. IMPACT-ONE: VIRTUOUS COMPANY LEVERAGING THE IMPACT VECTOR FRAMEWORK ............ 56
XV.I. FIRST STEPS FOR APPLYING THE IMPACT VECTOR MODEL: CONTEXT .................................................... 58
XV.II. FIRST STEPS: ACTIVITY IDENTIFICATION, DEFINITION, AND CHARACTERIZATION ................................. 61
CHAPTER XVI. TAILORING THE IBM RATIONAL HARMONY FOR EMBEDDED SOFTWARE .................................. 63
XVI.I. PROCESS DESCRIPTION ..................................................................................................................... 63
XVI.i.1. Process Description: an Example ....................................................................................................... 67
XVI.II. ROLES IN OUR TAILORED HARMONY PROCESS .................................................................................. 68
XVI.ii.1. Mapping between Roles and Activities ............................................................................................. 70
XVI.III. DESCRIPTION OF PROCESS ACTIVITIES ............................................................................................ 70
XVI.iii.1. Analyze Stakeholder Requests ........................................................................................................... 70
XVI.iii.2. Define System Validation Plan ........................................................................................................... 72
XVI.iii.3. Specify Requirements and Traceability to Stakeholder Requests ...................................................... 73
XVI.iii.4. Reviews and Risk Analysis ................................................................................................................. 74
XVI.iii.5. Review of Requirements .................................................................................................................... 75
XVI.iii.6. Review of Black Box Use Case Design, System Test Plan, and Verification Reports .......................... 75
XVI.iii.7. Review of Architecture, White Box Use Case Design, System Integration Test Plan, and Reports .... 76
XVI.iii.8. Review Traceability Matrices ............................................................................................................. 77
XVI.iii.9. Analysis of Risks ................................................................................................................................. 78
XVI.iii.10. Identify and Trace Use-cases to Requirements ............................................................................. 78
XVI.iii.11. Black Box Use Case Design ............................................................................................................ 80
XVI.iii.12. Define and Trace System, System Integration, and Unit Test Plans .............................................. 84
XVI.iii.13. Verify Black-Box and White Box Use Case Design through Execution ........................................... 85
XVI.iii.14. Define System Architecture ........................................................................................................... 87
XVI.iii.15. White Box Use Case Design ........................................................................................................... 90
XVI.iii.16. Develop or Generate Units and Trace to White Box Use Case Design .......................................... 92
XVI.iii.17. Perform Unit Test, System Integration Test, and System Test ...................................................... 93
XVI.iii.18. Perform System Validation ............................................................................................................ 93
XVI.iii.19. Further Activities ........................................................................................................................... 94
XVI.IV. ARTIFACTS AND TOOLS ................................................................................................................... 94
XVI.iv.1. Checklists: a Support Tool for Reviews .............................................................................................. 96
XVI.iv.2. Contribution of Agile Models to the Proposed Development Process ............................................... 99
XVI.V. REQUIREMENTS ENGINEERING DISCIPLINE PROPOSAL ..................................................................... 99
XVI.VI. ANALYSIS AND PACKAGING OF RESULTS .......................................................................................... 102
CHAPTER XVII. TESTING ACTIVITIES: A COMPLETE EXAMPLE OF ACTIVITY CHARACTERIZATION ...................... 109
CHAPTER XVIII. SEQUENCING VERIFICATION TECHNIQUES: AN EXPERIMENT ................................................. 115
XVIII.i.1. Introduction ................................................................................................................................. 116
XVIII.i.2. Related Work ............................................................................................................................... 117

XVIII.i.3. Experiment Description and Design ............................................................................................ 118
XVIII.i.4. Result Analysis ............................................................................................................................. 123
XVIII.i.5. Results Interpretation .................................................................................................................. 126
XVIII.i.6. Conclusion ................................................................................................................................... 127
CHAPTER XIX. TEST PERFORMANCE EVALUATION ....................................................................................... 129
XIX.I. ANALYSIS OF RESULTS ...................................................................................................................... 129
CHAPTER XX. SELECTING TECHNOLOGIES FOR INCREASING PRODUCTIVITY: AN EXAMPLE WITH MOBILE
TECHNOLOGIES ........................................................................................................................................... 132
XX.I. INTRODUCTION ................................................................................................................................. 136
XX.II. PURPOSES OF THE STUDY AND RESEARCH METHODS ....................................................................... 137
XX.III. CHARACTERIZING THE TECHNOLOGIES TO STUDY ............................................................................ 138
XX.iii.1. HTML5 + CSS + JavaScript (DT1) .................................................................................................... 140
XX.iii.2. Platform Specific Software Development Kit (DT2, 3) ...................................................................... 140
XX.iii.3. Appcelerator Titanium Mobile (1.8.1) (DT4) .................................................................................. 140
XX.iii.4. Adobe Flex Mobile (4.5) + Adobe Air (DT5) .................................................................................... 140
XX.iii.5. PhoneGap (1.7.0) (DT6, 7, 8) ........................................................................................................... 140
XX.IV. DRIVERS FOR TECHNOLOGY SELECTION ........................................................................................... 141
XX.V. A MODEL FOR EVALUATING MOBILE TECHNOLOGY ........................................................................... 142
XX.v.1. Implementation and Graphical View of the Model ......................................................................... 143
XX.VI. THE PROCESS OF TECHNOLOGY DECISION MAKING FOR MOBILE ..................................................... 146
XX.VII. CASE STUDY ................................................................................................................................... 147
XX.VIII. VALIDITY ISSUES, SOLUTION LIMITATIONS, AND LESSONS LEARNED ............................................... 149
CHAPTER XXI. EXPLOITING SOFTWARE INSPECTION DATA FOR EARLY DEFECT REMOVAL ............................. 150
XXI.I. INSPECTIONS AT IMPACT-ONE: STATE OF THE PRACTICE ................................................................... 152
XXI.II. DECISION MODEL: STATISTICAL ANALYSIS ........................................................................................ 156
XXI.ii.1. Metrics ............................................................................................................................................. 156
XXI.ii.2. Statistical Tests ................................................................................................................................ 156
XXI.III. DECISION MODEL: INTERPRETATION OF RESULTS ........................................................................... 158
XXI.IV. THREATS ......................................................................................................................................... 160
CHAPTER XXII. OVERALL EXPERIMENTATION THREATS ................................................................................ 161

SECTION VI: FINAL THOUGHTS AND
FUTURE WORK
CHAPTER XXIII. FINAL THOUGHTS ................................................................................................................ 164
CHAPTER XXIV. FUTURE WORK .................................................................................................................... 164
XXIV.i.1. Consolidating the Impact Vector Framework .............................................................................. 164
XXIV.i.2. Impact Vector Model and Empirical Software Engineering 2.0 .................................................. 165
XXIV.i.3. Business Analytics and the Impact Vector Framework ............................................................... 165
XXIV.i.4. Further Contexts of Application of the Impact Vector Framework ............................................. 166


Abstract
BACKGROUND. Software development is a mixture of technical, human and contextual factors, which makes it complex and hard to manage. As in physics, medicine and the social sciences, experimentation is the fundamental way for Software Engineering to understand, predict and cost-effectively control the software development process. In this thesis we deal with the improvement of the software development process, which may depend on a remarkable number of variables, even dozens; however, the currently available improvement models consider only small numbers of variables.
CONTEXT. The present work is a PhD thesis conceived and realized between 2009 and 2013 and developed partly in Italy and partly in the United States, with the contributions of the University of Rome Tor Vergata, the Fraunhofer Center for Experimental Software Engineering - Maryland, the National Aeronautics and Space Administration (NASA) and MBDA Italia.
GOAL. The purpose of this study is to understand, predict and control the software development process with a focus on performance - modeled as a set of variables of the necessary cardinality - from the point of view of decision makers, in the context of companies and agencies willing to achieve their business goals via a rigorous, predictable and engineered approach to software development. In other words, we aim to provide decision makers with a coherent and encompassing set of models and tools to support their Software Engineering job.
METHOD. In order to achieve this goal, we first need to define a theoretical foundation for our approach; then we need to validate such a theoretical model, so as to refine and complete its formulation. Subsequently, we need to enable decision makers to use and leverage the validated model in practice.
RESULTS. The main result of this thesis is a framework, tailored to both researchers' and practitioners' needs, to support the experimental investigation, and consequent improvement, of the software development process of organizations by detecting, modeling, and aggregating as many variables as necessary (up to ninety-four in our experience), and eventually using them to build measurement models. In this work, we report on the theoretical foundation of our framework and give evidence of its successful preliminary validation in the field, with different organizations, including three major international companies, each having diverse goals and constraints in place. Specifically, the outputs of this study consist of a set of tailorable and usable methods to help experts make decisions concerning software development, along with a significant series of experiences to show its functionality and some recommendations on how to turn the framework into practice.


Structure of the Thesis
In order to detail the roadmap to achieve the stated goal, this work has been structured as follows:
1. Section I tells the Global Positioning System (GPS) metaphor, i.e., a soft introduction to the standpoint the reader is expected to assume while approaching this work.
2. Section II provides the mathematical formulation of the Impact Vector Model, i.e. the model that we propose to enable meeting the stated goal. This mathematical formulation also includes definitions and assumptions which contextualize and scope the model. Finally, some open problems of the model are pointed out and addressed.
3. Section III consolidates the model foundation by linking the Impact Vector Model to some pre-existing works which represent bright achievements in the history of Software Engineering. Such foundation includes: the concepts of Value-Based Software Engineering; the Goal, Question, Metric plus Strategy paradigm; the Quality Improvement Paradigm; CMMI; ISO standards. Furthermore, a quick but broad analysis of software (and system) development process models is provided.
4. Section IV packages the Impact Vector Model into a framework by providing a tool chain for supporting the model infusion in practice.
5. Section V focuses on the validation of the Impact Vector Model. We propose the application of impact vectors to a hypothetical "synthetic" company, which abstracts on and consolidates experiences that we gained by working with some real organizations worldwide across several years. In fact, data from the trenches are precious and subject to restriction; so, we merge fiction and reality in such a synthetic company: while telling true stories of application of the Impact Vector Model, we subvert the outcomes of our studies. This way, we can illustrate quite real examples of application of our model, along with related limits and validity threats, while assuring the confidentiality of results and preserving sensitive data from being published.
6. Section VI reports some final thoughts and anticipates some future developments.
Appendices and References conclude the dissertation.

Since the structure of the thesis is top-down (first the context, second the theoretical model, and last the model validation), a recommendation for the reader is not to linger on the formalisms of Section II, because a shallow understanding is enough to be able to easily follow and enjoy the next Sections. Eventually, a second reading of the theory will be faster and easier.



"As our circle of knowledge expands, so does
the circumference of darkness surrounding it."
Albert Einstein

Section I:
Behind the Curtain

Chapter I. Decision Makers in the Field of Software
Engineering: the GPS Metaphor
In our daily lives, we use many tools and devices which support us in making choices. When such instruments are not available, we have to rely on our sensibility and intuition to make our decisions. An example of such instruments is a GPS (Global Positioning System) device with navigation software: given our current position and our destination, it can figure out the way to get there.
An interesting aspect of GPS devices is that, at the very beginning of the trip, they provide the driver with an estimation of distance and drive time. The distance is often very precise, as its calculation is based pretty much on certain data: satellite triangulation has a high resolution and, if the road model is accurate enough, then the computation of the distance is accurate as well. However, the time estimation is harder to provide. In fact, many variation factors can affect it, including: traffic, traffic lights, car characteristics, road conditions, driving style, stops, and unexpected events. Therefore, as you proceed in your trip, the time estimation may need to be adjusted.
Let us go a bit more into detail and give you a couple of examples. For a trip of 30 miles on the US-95 N/Veterans Memorial Highway, east of Death Valley Park and 140 miles north of Las Vegas - basically, in the middle of nowhere - the time estimation can be very accurate: factors like traffic, traffic lights and stops can be easily and reasonably ignored; therefore, assuming an average speed and that everything will be alright, the GPS makes life easy for a father answering his kid's question: "are we there yet?". The estimation of the GPS is very likely to be satisfying and appropriate: "30 minutes".
I challenge you to use the same GPS, which worked perfectly in Nevada, for a trip of 10 miles on the crowded Great Ring Road around Rome - or any of your "favorite" roads you drive on every day to go to work - at around 8:30 PM. We do not want to put salt in your (and our) wound, but just point out that the estimation of the GPS can be highly ineffective in some conditions, if it accounts only for distance and average speed. In this situation, GPS devices should account for additional aspects, in particular real-time traffic conditions, accidents and road work; and some GPS devices do, so that drivers can keep monitoring the trend of their trips and, maybe, notify their boss that they will be late for the planned meeting. So, the GPS may not fix the problem, but it can still be very useful to understand what is going on and, possibly, to take some precautions and countermeasures. Also, some GPS devices are adaptive and change the route according to some conditions, so they can get even smarter and try to mitigate the problem, but we do not want to discuss this example any deeper.
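
The contrast between the two trips can be made concrete with a toy calculation: an ETA computed from distance and average speed alone, versus one adjusted by multiplicative context factors. This is purely illustrative; all numbers and factor names are invented and do not describe any real navigation device.

```python
# Toy illustration of the GPS metaphor: a one-variable ETA model versus a
# model that accounts for additional context factors (the thesis's point
# about models with few variables versus many). Factor values are invented.

def naive_eta_minutes(distance_miles: float, avg_speed_mph: float) -> float:
    """ETA from distance and average speed only (the 'Death Valley' model)."""
    return distance_miles / avg_speed_mph * 60

def adjusted_eta_minutes(distance_miles: float, avg_speed_mph: float,
                         factors: dict[str, float]) -> float:
    """ETA multiplied by context slowdown factors >= 1.0 (the 'Rome' model)."""
    eta = naive_eta_minutes(distance_miles, avg_speed_mph)
    for slowdown in factors.values():
        eta *= slowdown
    return eta

# 30 miles of empty highway at 60 mph: the naive model is fine.
print(round(naive_eta_minutes(30, 60)))                # 30 minutes

# 10 miles of ring road at rush hour: context factors dominate.
rush_hour = {"traffic": 2.5, "traffic_lights": 1.3, "accidents": 1.2}
print(round(naive_eta_minutes(10, 60)))                # 10 minutes, wildly optimistic
print(round(adjusted_eta_minutes(10, 60, rush_hour)))  # 39 minutes
```

The same structure recurs throughout this work: a prediction model is only as good as the set of variables it accounts for, and the relevant set depends on the context.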

Now that the GPS story has been told, let us jump to our beloved field of Software Engineering. During our experience, we realized that many large scale projects are much closer to the "Rome" example than to the "Death Valley" example. Uncountable factors must be considered by the decision makers, innumerable things can go wrong, and accounting for all possible attributes influencing the development is a utopia. However, if we are willing to accept living with some degree of uncertainty, then we could be happy with having something like a GPS, providing some form of support to our development and some clue on what is happening with the project. Maybe, this sort of GPS for software development could even be capable of providing us with some advice on "road adjustments".
Project management questions should be addressed by such a hypothetical device, e.g.: "what testing technique is the most appropriate now, given my project status and goals?" or "can we do this task within the planned budget?" or "how can we assure that the Mars Rover will land on the planet and will accomplish its mission with great success for all mankind?".
The answer to any of those questions can be very hard to provide, and we do not want just a trivial "yes" or "very soon", even though it could make everyone happy (for a while, at least).
How can we answer, then?
Chapter II. The Role of Empirical Software Engineering
Empirical is an adjective denoting something derived or guided by experience, case studies or experiments. An empirical method, then, is an approach based on the collection of data on which to base a theory or derive scientifically valid conclusions. Also the renowned scientific method, by Galileo Galilei, is founded on empirical methods, and it is a stronghold in every field of science.
In the domain of Software Engineering (SE), the Empirical Software Engineering research field has emphasized the use of empirical methods (e.g. experiments, case studies, surveys and statistical analysis) to accumulate knowledge, in order to advance the state-of-the-art and the state-of-practice of SE.
"Progress in any discipline depends on our ability to understand the
basic units necessary to solve a problem. It involves the building of
models [...]. We need to be able to model various product characteristics
[...], as well as model various project characteristics."
Victor Robert Basili

In 1996, in the Editorial of the first volume of the "Empirical Software Engineering" journal, Victor R. Basili proposed a vision that, after 15 years of inspiring great research in SE, also inspired the Impact Vector Model that will be presented in this work. Instead of proposing a synthesis of Basili's vision, some excerpts are reported, which will settle the most appropriate standpoint for approaching the next chapters.
"[...] physics, medicine, manufacturing. What do these fields have in common? They evolved
as disciplines when they began learning by applying the cycle of observation, theory
formulation, and experimentation. In most cases, they began with observation and the
recording of what was observed in theories or specific models. They then evolved to
manipulating the variables and studying the effects of change in the variables.
How does the paradigm differ for these fields? The differences lie in the objects they study,
the properties of those objects, the properties of the systems that contain them and the
relationship of the objects to the systems. So differences exist in how the theories are
formulated, how models are built, and how studies are performed, often affecting the details
of the research methods.
Software engineering has things in common with each of these other disciplines and several
differences. [...] like other disciplines, software engineering requires the cycle of model
building, experimentation, and learning; the belief that software engineering requires
empirical study as one of its components."
About 15 years later, the goal of predictable Software Engineering has clearly not been achieved yet, and this work is a contribution towards that objective.
For skeptics and detractors of the feasibility of this goal, the following additional thought, coming from the same source, might be illuminating.
"It is hard to build models and verify them via experiments - as with medicine. As with the
other disciplines, there are a large number of variables that cause differences and their
effects need to be studied and understood."
Therefore, if we may attempt a summary: SE is hard and complicated, but no more so than many other sciences. And if we want to keep naming it "Engineering", then we need to go ahead and build our models and theories, supported by experimentation, so as to make software development more and more manageable.


Chapter III. Contribution of This Work: the Impact
Vector Model
$n very mature fields# scientists and engineers uild models#
according to some theories# in order to support the resolution
of prolems of the real world! Solving prolems implies making
choices# among some e*isting alternatives# on the way the
prolem should e solved! &he choices that a decision maker
picks are unargualy crucial# as they can determine the failure
of the pro)ectF maye choices cannot always determine the
success of the pro)ect# which can depend also on e*ternal and
uncontrollale factors# ut we can e fairly sure that ad
choices are likely to lead to ad failures!
&he model we want to uild should help make such choices# according to the set usiness goals
and constraints! @e name this model the H$mpact 0ector ModelI!
Several attempts at supporting decision makers, at different levels, exist in the history of Software Engineering, e.g.: COCOMO, for estimating software cost and supporting resource allocation; the GQM approach, for measuring software development and supporting team managers in understanding where to improve it; architectural, design and implementation patterns, to establish a set of reusable solutions that help developers build good software. These achievements share the idea of making the world of software development more understandable, manageable, engineered and predictable, while keeping the flexibility and tailorability that software development requires.
The step further that we propose in this work consists in formalizing the concept of choice in our thinking, be it a technological choice (e.g. a cutting-edge technology rather than a well-proven technology), a process choice (e.g. performing a code inspection rather than using pair programming) or a project choice (e.g. acquiring a COTS product and a consultant rather than developing in-house).
We do not expect to finalize the construction of a general model, applicable in all possible contexts and for all possible choices. We rather expect to propose a way of organizing, representing, accessing and leveraging knowledge gathered experimentally, over decades, via research and on-the-field experience. This way, our model can be instantiated in each single context, where it will leverage context-specific data and propose solutions tailored to the context itself.
"lso# our model is thought to e suitale for e*tension and refinement via long-term#
multifaceted and collaorative work# where e*perienced professionals and far-looking scientists
can )oin their efforts to leverage# augment and improve it# without reaking what has already
een put in place# possily e*ploiting and sharing the same data and repositories!
Chapter IV. Conceptual Architecture of the Model
So far, we have presented many good intentions, but nothing really practical yet. Section II will fix this issue; first, however, readers may like to have a high-level understanding of how this model is expected to work. Therefore, a conceptual view of the model is provided in this chapter, identifying inputs, outputs, actors and the general idea of the model functionality.
Consider a typical industrial context, where the decision maker is the person responsible for a new project that the company has to lead off. Management sets goals and constraints for the project and, based on some contextual factors, the decision maker has to make choices. Such choices, according to constraints and goals, are made based on the decision maker's experience and knowledge and are transformed into tasks for the development teams. The model we propose can be leveraged during the decision process to support the decision step, i.e. to help the decision maker formulate tasks for the development team. Other than the goals and constraints set by management, the model will probably take some additional information, e.g. project-specific or team-specific characteristics or needs, additional constraints on team composition or task assignment, or technical information and data to be used by the model for qualitative or quantitative reference.
The desirable output of the model should be some reliable information that the decision maker can immediately leverage in order to elaborate or refine choices and tasks for the development teams. The term "reliable" means that the output of the model should be based on some evidence, i.e. qualitative and quantitative information with some form of objectivity and confidence on the part of the decision maker and the top management.
Some form of iteration is also introduced: choices may reveal team behaviors or results which differ from expectations and may require further actions; similarly, the model could suggest paths that require an interaction with top management (e.g. the acquisition of new skills or products), which may lead to the adjustment of budget or time constraints, as well as to the introduction of new constraints.
Figure 1. Conceptual view of the model and its context
Of course, the main question is: how can we build such a smart, evidence-based model? First of all, we want the model to be able to leverage mathematical and statistical analysis, as well as data mining techniques. This implies the existence of data, or, more precisely, of a data-driven approach enabling such methods to operate, with the aim of producing some "reliable supportive information" for making evidence-based decisions.
Readers willing to have a concrete example will need to wait until the end of the mathematical formalization of the model. Our next step, in fact, is the definition of a mathematical foundation, which will unveil part of the internal structure of our black box.




Section II:
Impact Vector Model

"There are problems to whose solution I would attach an infinitely greater importance than to those of mathematics, for example touching ethics, or our relation to God, or concerning our destiny and our future; but their solution lies wholly beyond us and completely outside the province of science."
Carl Friedrich Gauss
Chapter V. Shaping the Model
V.i. Assumptions
The model that will be presented is founded on some assumptions that need to be discussed and made explicit, in order to agree upon them and establish the soundness of the rest of the work.
The first point is to clarify the term "model", which we will use rather freely: sometimes as a synonym of theory (i.e. a working hypothesis that is considered probable based on experimental evidence or factual or conceptual analysis and is accepted as a basis for experimentation) and sometimes to indicate the mathematical formulation that will be provided in the next Section. Eventually, model, theory and mathematical formulation will correspond to the same idea (from a Platonic point of view), which is the one represented in Figure 1. We name this the "Impact Vector Model" (IVM). The reasons for such a name will become clear later on.
The second point to focus on is that the IVM requires the awareness of its users, i.e. the decision makers. This means that the model cannot replace human judgment; it can only assist and support it during the decision-making process. This is required to compensate for all the missing aspects of reality, i.e. those aspects which are not accounted for by the IVM, including global economic trends, future opportunities, personal opinions, political influences, hard-to-model dynamics, and, very frequently, gaps in available data.
Since we do not aim to model "the whole", nor do we expect this to be feasible, the underlying assumptions are:
(A1) We accept to deal with an incomplete-knowledge model.
(A2) We assume the presence of a user capable of interacting with the model and interpreting its output, accounting for contextual factors that are not modeled explicitly.
The first assumption is pretty much true for every model of the real world, including physical and mechanical laws. The counterpart of this assumption is that the model is required to be "complete enough" to provide usable and useful information. The achievement of "enough completeness" will be shown in the rest of the work.
The second assumption is hopefully true in many safety- and mission-critical environments, including the Space and Defense fields, where decision makers are expert, prepared and skilled enough to address both technical and managerial problems. In particular, we expect the decision makers to have a broad view on both the system under development and its context; this means that the user should also be able to figure out when the model cannot be applied, e.g. because of the high incompleteness of the data or the singularity of the specific issue to solve.
Based on this second assumption, we derive that the field of applicability of the model is limited to those contexts in which A2 holds.
V.ii. Characteristics of the Model
Given these assumptions, we want the IVM to be used in the following scenario.
Given:
(i) some product and process goals and constraints (e.g. a given quality level or a maximum project duration);
(ii) the current status of both process and product;
(iii) the history of the process executed so far;
(iv) potential solutions coming from standards, literature, practice and creativity;
(v) data collected over time (on the current and other projects);
(vi) information on the specific context;
the model should provide a set of possible process solutions that will lead to achieving the project goals while respecting the given constraints.
This scenario includes, as a particular case, the situation in which the decision maker is at the beginning of the development and no history exists.
An important characteristic we want the IVM to have is that it should be evidence-based: a particular instance of the IVM should be the result of a systematic empirical investigation of phenomena. This means that the model is built on the basis of qualitative and quantitative data. Furthermore, its output should be measurable as well, i.e. supported by measures associated to a well-defined interpretation.
One more characteristic is the real-time capability: we want the model to constantly provide the decision maker with fresh and reliable information, which can be used to monitor progress and to decide how to correct or improve the plan.
Finally, very desirable characteristics of the model also include:
Flexibility: we want the decision maker to be able to define his/her own goals on any characteristic of the product, the process or the context.
Usability: we want the model to provide actionable and quickly understandable information.
V.iii. Taxonomy
We start formalizing the model by providing details on some terms we will use. Most of them are widespread in the field of SE, but conclusive definitions are always hard to provide and agree upon. We do not want to speculate on which definition is most appropriate for any given term, so we will not provide formal definitions at all, but just a few words to transfer an intuition of our interpretation of the terms.
Development proce"": a multi-set of process activities e*ecuted according to a given
schedule!
Development proce"" activit0: a task that reGuires resources to e e*ecuted and that
causes a development process to evolve ecause of its e*ecution!
Product: an artifact or set of artifacts# either tangile or not# which is output of some
development process activities!
Product attri#ute: a measurale characteristic of a product!
Proce"" attri#ute: a measurale characteristic of a process!
ProFect attri#ute: a measurale characteristic of a pro)ect!
Conte*t attri#ute: a measurale characteristic of the environment where a process is
enacted to realiEe a product!
Scale ?"cale of mea"ure@: a model which enales a variale to e measured and assigned
a value in some set of admissile values!
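For readers who prefer code to prose, the taxonomy above can also be rendered as minimal data structures. The sketch below is purely illustrative; all names and the restriction to categorical scales are our own assumptions, not part of the model.

```python
from dataclasses import dataclass, field
from typing import FrozenSet, List

@dataclass(frozen=True)
class Scale:
    """Scale of measure: the set of admissible values for an attribute."""
    name: str
    values: FrozenSet[str]  # categorical case only, for simplicity

@dataclass(frozen=True)
class Attribute:
    """A measurable characteristic of a product, process, project or context."""
    name: str
    kind: str        # "product" | "process" | "project" | "context"
    scale: Scale     # each attribute has exactly one scale

@dataclass
class Activity:
    """A task that consumes resources and makes a process evolve."""
    name: str

@dataclass
class DevelopmentProcess:
    """A multi-set of activities executed according to a given schedule."""
    activities: List[Activity] = field(default_factory=list)

# Example: a recall attribute with a three-valued ordinal-like scale.
recall = Attribute(
    "recall", "process",
    Scale("recall", frozenset({"not applicable", "low", "high"})))
```

Here a scale is reduced to a finite set of labels; real scales (e.g. the non-negative reals for cost) would need a richer representation.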

Chapter VI. Mathematical Foundation

"Do not worry about your difficulties in Mathematics. I can assure you mine are still greater."
Albert Einstein

VI.i. Formulation
We now define our model by providing some formal definitions and properties.
1. Let the set of all attributes be {x | x is a product attribute, process attribute, project attribute or context attribute}. X is the subset containing only the attributes we are interested in considering.
Every element in X^k, for any given k > 0, is named Impact Vector (IV).
2. Let A = {a | a is a development process activity}, with A ⊆ X^n, where n = |X| and X^n denotes the n-fold Cartesian product over X. Furthermore, an activity is defined as a vector of product attributes, process attributes, project attributes and context attributes with no repetitions, i.e.:

∀a ∈ X^n: a = ⟨x_1, ..., x_n⟩ and i ≠ j ⇒ x_i ≠ x_j    (F.1)

We name such a vector Activity Performance Impact Vector (APIV).
3. Let A ⊆ X^n be the set of all software development activities; then, a development process is a sequence of zero or more activities, i.e., if D is the set of all development processes, then:

D = (X^n)*    (F.2)

We name Process Performance Impact Vector (PPIV) an element in (X^n)*.
4. Let V = {v | v is a scale}.
5. Let adm: X → V be the admission function, i.e. the function that maps each attribute to the set of its admissible values (i.e. to its scale).
It holds that each attribute has one and only one scale, i.e.:

∀x ∈ X ∃! v ∈ V: v = adm(x)    (F.3)

6. Let Dist = {dist | dist is a probability distribution function or a mass function on some v ∈ V} be the set of all probability distribution/mass functions on any scale.
7. The Activity Performance Function (APF) perf: A × D → (X × Dist)^n represents the vector of n probability density/mass functions, reminding that n = |X|. Informally, we are saying that, given an activity a in a process d, the performance of the activity is a vector of n probability distribution/mass functions, each representing the performance in terms of one attribute.
8. The Combination Function (CF) comb: (X × Dist)^n × (X × Dist)^n → (X × Dist)^n maps a pair of APIVs of a process, each associated to its performance, to a single Impact Vector, having its attributes associated to their performance (i.e. random variables). Informally, we are reducing the performance of two activities in a process to the performance of one equivalent activity.
9. The Process Performance Function (PPF) ppf: D → (X × Dist)^n for a process d of k activities is defined recursively as follows:

ppf(d_j) = comb(ppf(d_(j-1)), perf(a_j, d))    if 0 < j ≤ k
ppf(d_j) = null                                if j = 0    (F.4)

where the notation d_j indicates the sub-development process composed of the first j activities (0 ≤ j ≤ k) of the process d, and null denotes the Null Impact Vector (NIV), whose properties we will define later.
Something to highlight is that, based on this definition, the performance of each activity (and, in particular, the performance of each activity in terms of each attribute) depends on the entire process; in fact, the second parameter of comb is perf(a_j, d), which computes the performance function for the activity a_j as the j-th activity of the process d. We now want to make an assumption to simplify the model:
(A3) The performance of an activity does not depend on the future of the development process.
Therefore, the definition of the PPF ppf for a process d composed of k activities becomes:

ppf(d_j) = comb(ppf(d_(j-1)), perf(a_j, d_(j-1)))    if 0 < j ≤ k
ppf(d_j) = null                                      if j = 0    (F.5)

In this final formulation of the PPF, the performance of the j-th activity depends on the process up to the (j-1)-th activity, not on the activities which will be executed after it.
It is important to remark that the choice of an activity can obviously depend on the future of the process; nevertheless, the independence from the future that we are assuming is limited to the performance of that activity, and does not exclude correlations or synergies among activities during process planning. However, assumption A3 might still be violated, e.g. because some psychological dynamics can improve or worsen the performance of developers who are aware of the next planned activities. Some studies report quantitative analyses of these dynamics (e.g. "An Empirical Study on Software Engineers Motivational Factors", by França and da Silva), but, in order to keep the complexity of this model at a manageable level, we will keep assumption A3 until the end of this work and leave the problem of managing social and motivational factors to the decision maker's consideration (see assumption A2).
One more important point to notice is that the model does not forbid that the performance of an activity in terms of an attribute depends on the performance of other attributes. In fact, this reflects something realistic: just to give an example, we can imagine an attribute like "complexity of the product" and an expected cost of testing that depends on (i.e. is a function of) such complexity.
Notice also that, at the moment, the APF (Activity Performance Function) perf and the CF (Combination Function) comb are still undefined; we postpone their discussion to the next chapter.
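The recursion of F.5 can be sketched in a few lines of executable code. In the sketch below the representation of impact vectors as plain dictionaries and the toy perf/comb functions are our own illustrative assumptions, since the model deliberately leaves both functions undefined at this point; the hypothetical cost figures match the worked example that follows.

```python
from typing import Callable, List, Optional

NULL_IV: Optional[dict] = None  # stands for the Null Impact Vector (NIV)

def ppf(process: List[str],
        perf: Callable[[str, List[str]], dict],
        comb: Callable[[Optional[dict], dict], dict]) -> Optional[dict]:
    """F.5: ppf(d_j) = comb(ppf(d_(j-1)), perf(a_j, d_(j-1))); ppf(d_0) = null."""
    if not process:          # j = 0: the empty process
        return NULL_IV
    prefix, a_j = process[:-1], process[-1]
    return comb(ppf(prefix, perf, comb), perf(a_j, prefix))

# Toy instantiation tracking only the expected cost, with comb as a plain sum.
toy_costs = {"requirements": 5000, "implementation": 1500, "test": 1000}

def toy_perf(activity: str, history: List[str]) -> dict:
    return {"cost": toy_costs[activity]}

def toy_comb(left: Optional[dict], right: dict) -> dict:
    return {"cost": (left or {"cost": 0})["cost"] + right["cost"]}

result = ppf(["requirements", "implementation", "test"], toy_perf, toy_comb)
# result["cost"] == 5000 + 1500 + 1000 == 7500
```

A full implementation would carry distribution/mass functions per attribute instead of point values, but the recursive structure stays the same.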
#"%ii% (sing the model
@e now provide an e*ample of use of the model! ,iven the trivial development process d
3
M
SreGuirementsF implementationF testT# we want to compute its performance in terms of cost#
recall .i!e! percentage of removed defects/ and amount of produced material .apm/!
"ccording to the previous paragraph# we can write:
23 | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

8 M Ncost# recall# apmP
D M Nd
3
P
" M NreGuirements# implementation# testP
Moreover, we arbitrarily define the following scales:
V = {V_cost, V_recall, V_apm}
o V_cost = {c | c is a non-negative real number}.
The real number is assumed to represent the cost in a given currency.
o V_recall = {not applicable, low, high}.
The value "not applicable" means that an activity having this value in the recall dimension does not affect the recall in any way. The value low indicates that less than 33% of existing defects are detected by the activity, while high means that at least 33% of the existing defects are identified by the activity.
o V_apm = {very small, small, medium, high, very high}.
The values of this ordinal scale indicate whether the amount of pages is very small (0-2 pages), small (3-5), medium (6-10), high (11-25), or very high (>25). We can think of some conversion factors, e.g. for code, transforming lines of code into pages. However, in this example, we are not really interested in this level of detail and we prefer to leave the intuition to the reader.
Additionally, the admission function is completely defined as follows:
adm = {⟨cost, V_cost⟩, ⟨recall, V_recall⟩, ⟨apm, V_apm⟩}
We can also arbitrarily assume that the performance function is defined as follows:
perf(requirements, ⟨⟩) = P_requirements,⟨⟩
perf(implementation, ⟨requirements⟩) = P_implementation,⟨requirements⟩
perf(test, ⟨requirements, implementation⟩) = P_test,⟨requirements, implementation⟩
where the P_a,d are vectors of 3 random variables, one each for cost, recall and apm, associated to the probability distribution/mass functions constituting the co-domain of the function perf (i.e. the APF). Let us assume we somehow know that such random variables have the following distribution and mass functions:
P_requirements,⟨⟩ = ⟨
⟨cost, Normal(5000, 2)⟩,
⟨recall, ⟨⟨100%, not applicable⟩⟩⟩,
⟨apm, ⟨⟨0%, very small⟩, ⟨60%, small⟩, ⟨30%, medium⟩, ⟨10%, high⟩, ⟨0%, very high⟩⟩⟩⟩
P_implementation,⟨requirements⟩ = ⟨
⟨cost, Normal(1500, 1)⟩,
⟨recall, ⟨⟨100%, not applicable⟩⟩⟩,
⟨apm, ⟨⟨0%, very small⟩, ⟨0%, small⟩, ⟨10%, medium⟩, ⟨20%, high⟩, ⟨70%, very high⟩⟩⟩⟩
P_test,⟨requirements, implementation⟩ = ⟨
⟨cost, Normal(1000, 0.1)⟩,
⟨recall, ⟨⟨50%, low⟩, ⟨50%, high⟩⟩⟩,
⟨apm, ⟨⟨0%, very small⟩, ⟨0%, small⟩, ⟨50%, medium⟩, ⟨50%, high⟩, ⟨0%, very high⟩⟩⟩⟩
Now we need to compute ppf(d_1); the steps are provided below.
In order to support the reader, the two arguments of the functions comb and perf are formatted and numbered hierarchically, according to their depth in the function call stack; the same hierarchical number marks the same invocation of the comb and perf functions at different steps of the computation.
ppf(d_1) =
= ppf(⟨requirements; implementation; test⟩) =
= comb(
  1. ppf(⟨requirements; implementation⟩),
  2. perf(
    2.1. test,
    2.2. ⟨requirements, implementation⟩)) =
= comb(
  1. comb(
    1.1. ppf(⟨requirements⟩),
    1.2. perf(
      1.2.1. implementation,
      1.2.2. ⟨requirements⟩)),
  2. perf(
    2.1. test,
    2.2. ⟨requirements, implementation⟩)) =
= comb(
  1. comb(
    1.1. comb(
      1.1.1. ppf(⟨⟩),
      1.1.2. perf(
        1.1.2.1. requirements,
        1.1.2.2. ⟨⟩)),
    1.2. perf(
      1.2.1. implementation,
      1.2.2. ⟨requirements⟩)),
  2. perf(
    2.1. test,
    2.2. ⟨requirements, implementation⟩))
We do not need to expand the expression any further. In fact: ppf(⟨⟩) is known and is the Null Impact Vector (NIV) by definition of ppf; the perf function is known and explicitly defined for all required values of its arguments; and the combination function comb is expected to provide an element in X^3, even though it has not been defined yet. We can intuitively say that, for cost, the comb function is a simple sum: computing the cost of two activities equals adding the cost of the first activity to the cost of the second activity (computed being aware of the application of the first activity). More specifically, assuming independent costs:

Normal(5000, 2) + Normal(1500, 1) + Normal(1000, 0.1) ~ Normal(7500, 3.1)

For recall and apm, instead, we have a different scale type (ordinal) and we cannot have an immediate similar intuition (and the same would hold for nominal scales). Concerning the recall, however, it is still fairly easy; in fact, only one of the activities has a relevant value, i.e. test, because the "not applicable" value of the scale means that an activity does not affect the recall. Therefore we can say that the recall of the process is a random variable with a mass function of the form ⟨⟨50%, low⟩, ⟨50%, high⟩⟩.
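The two combination rules just described can be checked mechanically. The fragment below (representation ours, under the stated independence assumption) reproduces the cost computation, summing means and variances, and the recall computation, where "not applicable" entries leave the mass function untouched:

```python
# Cost: assuming independent activity costs, the normal parameters combine by
# summing the means and summing the variances (figures from the worked example).
normals = [(5000, 2), (1500, 1), (1000, 0.1)]    # (mean, variance) per activity
combined_mean = sum(m for m, _ in normals)       # 5000 + 1500 + 1000
combined_variance = sum(v for _, v in normals)   # 2 + 1 + 0.1

# Recall: "not applicable" means an activity leaves recall untouched, so the
# process-level mass function is that of the only recall-affecting activity.
recall_masses = [
    {"not applicable": 1.0},          # requirements
    {"not applicable": 1.0},          # implementation
    {"low": 0.5, "high": 0.5},        # test
]
combined_recall = {"not applicable": 1.0}
for mass in recall_masses:
    if set(mass) != {"not applicable"}:
        combined_recall = mass
```

With several recall-affecting activities, the last loop would have to be replaced by a genuine comb definition; the sketch only covers the single-activity case of the example.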
Chapter VII. Addressing the Open Problems of the Impact Vector Model
In the model defined so far, a number of issues can be identified. In the following, the most relevant ones are pointed out and a solution is sketched for each.
#""%i% $cconting for Parallel Process $cti*ities
"s stated in the previous chapters# the current mathematical formulation of the $mpact 0ector
Model does not support parallelism of process activities# i!e! multiple activities eing e*ecuted
at the same time!
In order to model the parallelism of activities, we extend the definition of activity to include a new attribute. This attribute is a pair representing an interval of execution of the activity with respect to a nominal time (i.e. a reference time). When two activities a_1 and a_2 are executed at the same time, e.g. they start at the same time t_1 and end at the same time t_2, the activity attributes of both have the pair ⟨t_1, t_2⟩. When the time overlap between the two activities is partial, the nominal time reflects this partial overlap; e.g. if a_2 starts when half of a_1 has been executed, then the attribute value for a_1 is ⟨t_1, t_2⟩ and the one for a_2 is ⟨t_1.5, t_3⟩. Nominal time shall be measured on a ratio scale.
Following this extension, the nominal duration of the process is t_f - t_i, where t_f is the end time of the last activity of the process (i.e. the one ending last), while t_i is the start time of the first activity of the process.
A further extension can support modeling activities that are not continuous, but that can be suspended and resumed in the future; this extension consists in having a sequence of pairs instead of a single pair as attribute. Pairs of this sequence shall be totally ordered with respect to time; therefore there shall be no overlap between intervals of execution (i.e. an activity cannot be ongoing and suspended at the same time).
The introduction of such an extension (either the one supporting activity suspend-and-resume or the former one) has an impact on the complexity of the problem. In fact, without the extension, interactions and synergies between activities only depend on their order of execution, not on their simultaneity. By introducing the concept of time, instead, simultaneity of activities may lead to much higher complexity: multiple activities may interact and provoke changes in performance, and causality can become hard to understand. E.g., a new question may arise: which activity (or activities) is causing a performance variation on which other activity (or activities)? An exponential growth of potential answers shall be expected as the number of simultaneous activities increases. Furthermore, partial overlap of activities shall also be accounted for.
Because of this complexity, an exhaustive analysis of all possible and reasonable situations is not viable. However, extending the Impact Vector Model to support simultaneous activities is necessary, as time is of undeniable importance in every practical context. Nevertheless, despite the uncountable realistic scenarios one can encounter, it is expectable that in many contexts only some activities are executed simultaneously, just as only some sequences of activities are implemented, while most admissible combinations, though reasonable, are not of interest. This can cut down the complexity of the theoretical problem to a more practical and tractable task. This way, decision makers can perform targeted investigations; e.g., if they detect that a particular activity is sometimes executed in parallel with another activity (e.g. they start and end at the same time), while sometimes the two are executed in isolation, they can investigate the existence of an overall performance variation by targeting those two specific process configurations (i.e. parallel or isolated activities). Assuming this standpoint avoids an extensive and exhaustive empirical investigation trying to cover all possible configurations of process activities.
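The interval attribute described above lends itself to a compact representation. In the sketch below (names ours), an activity's schedule is a list of ⟨start, end⟩ pairs on the nominal time axis, which covers both the single-interval case and the suspend-and-resume extension:

```python
from typing import List, Tuple

Interval = Tuple[float, float]  # (start, end) on the nominal time axis

def is_valid_schedule(intervals: List[Interval]) -> bool:
    """Suspend-and-resume form: pairs must be totally ordered, no overlap."""
    return all(s < e for s, e in intervals) and all(
        intervals[i][1] <= intervals[i + 1][0]
        for i in range(len(intervals) - 1))

def overlap(a: List[Interval], b: List[Interval]) -> bool:
    """True if two activities are (even partially) simultaneous."""
    return any(s1 < e2 and s2 < e1 for s1, e1 in a for s2, e2 in b)

a1 = [(1.0, 2.0)]                # runs over [t_1, t_2]
a2 = [(1.5, 3.0)]                # starts when half of a1 has been executed
a3 = [(0.0, 0.5), (2.5, 2.8)]    # suspended and later resumed
```

A decision maker could use such an overlap test exactly as suggested above: to separate the historical data of an activity executed in parallel from the data of its isolated executions.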
#""%ii% "mpact #ector Constraints
2ne more aspect we want to report is that# in some conditions# not all values of a scale are
reasonale or feasile .e!g! a pro)ect manager wants to put a cap to costs# which makes some
values of the corresponding scale not acceptale/! 'ow can the model account for such
conditionsL
$n order to satisfy this need# we formally introduce the concept of "ttriute -onstraint ."-/# i!e!
a function that# given a scale of values# maps it into a new scale of values that is a suset of the
former scale! Even though the formaliEation effort for this e*tension is small# it turns into great
enefits in practice! $n fact# putting "-s may filter out a ig set of alternative solutions that# for
a given conte*t# are not feasile# not viale or considered not convenient!
&he usefulness of "-s ecome evident when trying to uild automation engines to elaorate
impact vectors: y putting constraints# the search space can e drastically reduced# making the
computation accomplishale in acceptale time!
One more type of constraint relates to activity sequencing: some activities may be executed only after some other activities have already been executed, e.g. because they require artifacts produced by those previous activities. For similar situations, we formalize sequencing constraints by extending the Process Performance Function (PPF) as follows:

ppf(d_j) = comb(ppf(d_(j-1)), perf(a_j, d_(j-1)))    if 0 < j ≤ k
ppf(d_j) = null                                      if j = 0
ppf(d_j) = undefined    if a_j must be executed before an activity a ∈ d_(j-1)

The undefined value means that the performance of the process d_j cannot be computed. Such a formulation further helps create engines for automatically elaborating impact vectors, because the search space can be reduced by excluding undefined sequences of activities.
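Both kinds of constraints can be prototyped in a few lines. In the sketch below (all names and the example rules are our own illustrations), an Attribute Constraint restricts a scale to a subset of itself, and a precedence check flags the sequences whose ppf would be undefined:

```python
from typing import Callable, Dict, List, Set

def apply_ac(scale: Set[str], constraint: Callable[[str], bool]) -> Set[str]:
    """An Attribute Constraint maps a scale into a subset of that scale."""
    restricted = {v for v in scale if constraint(v)}
    assert restricted <= scale  # by construction, always a subset
    return restricted

# Example: a manager caps the amount of produced material.
apm_scale = {"very small", "small", "medium", "high", "very high"}
capped = apply_ac(apm_scale, lambda v: v not in {"high", "very high"})

def sequence_defined(process: List[str],
                     must_precede: Dict[str, Set[str]]) -> bool:
    """False when an activity appears after one it must precede,
    i.e. when the extended PPF would evaluate to 'undefined'."""
    for j, a in enumerate(process):
        already_executed = set(process[:j])
        if must_precede.get(a, set()) & already_executed:
            return False
    return True

# Hypothetical rule: implementation must be executed before test.
rules = {"implementation": {"test"}}
```

An automation engine could call sequence_defined while enumerating candidate processes, pruning undefined sequences before any performance computation takes place.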
#""%iii% Empirical Definition of the $cti*it' Performance Fnction
and Combination Fnction
&he ne*t point we need to cope with concerns the perf and com# functions! &hat is proaly
the hardest aspect to define# ut it leads to the most valuale result of using the $mpact 0ector
Model# i!e! the process performance# oth e*pected and historical# at the level of granularity
chosen y the decision maker!
For e*ecuting a performing development process# some good practices can often e identified
and leveraged# as well as some ad practices can e detected and discouraged! 'owever# on
the side of these practices# in order to i/ find local# tailored and specialiEed optima# ii/
concretely and practically instantiate such practices in specific situations# and iii/ interpret the
otained results# every conte*t needs its own models!
&herefore# other than gloally valid information .e!g! est practices/# there is some local
information to account for! Such local information is specific to a given conte*t# ut it is not
necessarily one-shot! $n a totally different environment# this local knowledge might still e
useful! 6et us analyEe when this could happen!
When configuring the Impact Vector Model for a given context, one of the steps is the selection of the performance attributes of interest. Among such attributes, big importance has to be assigned to characterization attributes, i.e. those attributes that represent distinctive aspects of the process and the product under development, e.g. the complexity of the project and the domain of application. Doing so corresponds to preparing statistical models and data mining agents for a correct analysis, because such attributes allow slicing and dicing the data in order to find meaningful trends.
For example, suppose a company C1 is working on two product lines, one about mobile GUIs for existing websites and one about native mobile apps for home automation; an aggregate analysis can reveal some global and context-specific knowledge, but a by-product-line analysis may reveal different, product-line-specific knowledge. Packaging such product-line-specific knowledge can represent an added value to be leveraged by someone else, provided that the product line is appropriately characterized. Continuing with this example, in fact, we can imagine a defense company C2 that has to develop a wrapper for its intranet portal to offer functionality on mobile. Data collected by C2 are hardly usable to predict the performance on such a task; but if C2 could access the data of C1, limited to the product line on mobile interfaces, then C2 could gather something useful: e.g., C2 could learn that the presence of one person highly skilled on the target platform may cut cost by 40% on average, and by up to 60% when the target platform is iOS. The aforementioned appropriate characterization of the context could help C2 find the needed information via data look-up and exploration.
We postpone to the last Section of this document a discussion of the so-called Empirical Software Engineering 2.0 [1], which matches our viewpoint on sharing data across companies and researchers.
Jumping back to the point of this chapter, we can now imagine having a repository with data as previously described, i.e. vectors of attributes representing the performance of an activity and including characterization attributes to use for data look-up and exploration. Such a repository could be queried and automatically analyzed via statistics and data mining to predict what the performance of an activity (or an entire process) will be, using the characterization of the current context to slice and dice the repository and use only the data that are representative and really usable for a prediction in that specific situation, in terms of both the performance of a single activity (Activity Performance Function) and of multiple activities (Combination Function).
Section III: Impact Vector Model: Leveraging the Excellence

"Not to know what happened before you were born is to remain forever a child."
Marcus Tullius Cicero
In Section III we perform a comparative analysis between the Impact Vector Model and some previous work in the field of Software Engineering, specifically for process and quality management during software development.
Chapter VIII. Where Do Impact Vectors Come From?
The Impact Vector Model is the result of leveraging, reinterpreting, merging, slicing and dicing, adapting and augmenting a number of pre-existing models in the software literature (and not only there). The biggest influence has been provided by the Quality Improvement Paradigm, by GQM+S, and by the Value-Based approaches applied to Software Engineering. For each of those, a chapter provides a summary description and a close examination of how they interact and relate to the Impact Vector Model. A minor, but still crucial, contribution to our model has been given by further influential paradigms, namely CMMI, SPICE and Six Sigma. Additional methods and techniques should obviously be leveraged to implement the Impact Vector Model, e.g. the Weighted Objectives Method, Statistical Process Control, Root Cause Analysis, Regression Analysis, checklists and so on. In the next chapters, we will mostly focus on the aspects that the Impact Vector Model has in common with other approaches and methodologies, and sometimes we will mention techniques or methods that can be used to address specific technical issues. However, we do not aim to carry out a detailed comparative analysis between impact vectors and other approaches, but only to highlight what impact vectors leverage and reuse from the past.
Chapter IX. Enabling Value-Based Decisions via the Impact Vector Model
Value-Based Software Engineering (VBSE) is a discipline of Software Engineering that heavily accounts for the business aspects related to software development. Among the pioneers of VBSE is Barry Boehm. One of the books he co-authored, "Value-Based Software Engineering", contains, in its first chapter, a number of key drivers for the definition and use of the Impact Vector Model. The re-elaboration and synthesis of Boehm's words reported below highlights the most relevant aspects for our model.
"Most studies of the critical success factors distinguishing successful from failed software projects find that the primary critical success factors lie in the value domain. In earlier times, when software decisions had relatively minor influences on a system's cost, schedule, and value, a value-neutral approach was reasonably workable: every requirement, use case, object, test case and defect was treated as equally important, and the responsibility of software engineers was confined to turning software requirements into verified code."
However, "software has a major impact on cost, schedule and value" in today's systems. Furthermore, value-neutrality clashes with the definition of engineering itself, which refers to the application of science and mathematics in order to produce results useful to people.
Value-Based Software Engineering aims at "integrating value considerations into the full range of existing and emerging software engineering principles and practices". This enables making value-based decisions, i.e. decisions driven by the awareness of what is most valuable to do in order to achieve the defined business goals. In particular, decision makers "need to have a clear understanding of the sources of value to be exploited and the link between technical decisions and the capture of value".
"Value-based decisions are of the essence in product and process design, the structure and dynamic management of larger programs, the distribution of programs in a portfolio of strategic initiatives, and national software policy. Better decision making is the key enabler of greater value added."
These words stress the substantial need to include value considerations when addressing decision-making issues. In the cited book, the authors also shape a theory of value for SE, elaborating, adapting and leveraging ideas and theories of value defined in fields other than SE. That is only one of the works on VBSE, and we can reasonably assert that a fairly strong and fervid background exists for VBSE. Though interesting, we will not focus on the details of VBSE, as it is not the core of this work. Rather, the goal of this paragraph is to show how the Impact Vector Model enables value-based decision making. To focus this point, we propose a graphical representation of the Impact Vector Model. Because paper sheets limit us to dealing with 2 dimensions only, we propose a trivial example in which we consider only two attributes (A1 and A2) and three processes (P1, P2 and P3).

Figure 2. Geometrical representation of the Impact Vector Model.
In Figure 2 many elements are represented. P1, P2 and P3 are drawn as sequences of, respectively, 4, 1 and 5 activities, where each activity is indicated as a bold dot. All development processes start from the origin, i.e. the Null Impact Vector. Every activity is a move in the 2D space we are considering. The move corresponds to the impact that the activity has on the development process. Each point in the space (i.e. an Impact Vector), in fact, represents a performance in terms of A1 and A2. In particular, each point of type (a1, a2) indicates that the performance is a1 in terms of A1 and a2 in terms of A2. The big green dot represents an optimum, i.e. the point that, ideally, the decision maker would like to achieve. For example, if we think of A1 and A2 as time and cost, the green dot represents the ideal tradeoff for a process in terms of cost effectiveness, i.e. a time o1 and a cost o2. Finally, a portion of the space is marked off by a curved line, which represents a limit to the admitted values for the attribute A2, i.e. an Impact Vector Constraint. Again, if A2 represents cost, then the value c for A2 is the maximum budget that a process we want to consider acceptable can use.
In this example, no development process is expected to reach the ideal performance (green dot). We have three options, but one of them must be discarded: the process P2 ends up in the forbidden space, as it violates the constraint on A2. P1 and P3 are the remaining candidates: how can the decision maker choose? Here comes the idea of value-based decision making. Based on the graphical representation, the decision maker should start reasoning on the relative importance of A1 and A2. In fact, if we still think of A1 and A2 as time and cost, P3 seems to cost less than P1, but it is expected to take longer. Which alternative is the best? The answer definitely depends on the context. If a new project is expected to start shortly, the decision maker may want to complete the current project faster, even though it is more expensive, so as to free resources for the new project. In different conditions, however, the decision can be different; therefore the model cannot really provide the conclusive answer. What the model can do is provide the decision maker with all the available information, so that he/she can make a value-based decision.
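The reasoning above can be sketched numerically (a minimal illustration; all activity impacts and the constraint value are invented): each process accumulates the impacts of its activities, and any candidate whose final value on A2 violates the constraint is discarded.

```python
# Sketch of the 2D example: processes are sequences of activity impacts
# (a1 = time, a2 = cost); a process whose cumulative cost exceeds the
# constraint c is discarded. All numbers are invented.

def cumulative_impact(activities):
    """Sum activity impacts component-wise, starting from the null vector."""
    total = (0.0, 0.0)
    for a1, a2 in activities:
        total = (total[0] + a1, total[1] + a2)
    return total

def admissible(processes, cost_limit):
    """Keep the processes whose final cost (A2) stays within the limit."""
    return {name: cumulative_impact(acts)
            for name, acts in processes.items()
            if cumulative_impact(acts)[1] <= cost_limit}

processes = {
    "P1": [(2, 3), (1, 2), (2, 1), (1, 2)],           # time 6, cost 8
    "P2": [(3, 12)],                                   # cost 12: violates c
    "P3": [(2, 1), (2, 1), (2, 1), (2, 1), (2, 2)],    # time 10, cost 6
}

print(admissible(processes, cost_limit=10))
# P2 is filtered out; P1 and P3 remain as the candidate processes
```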
We also want to recall that another desirable characteristic is the evidence-based foundation of the model: how, then, can we introduce quantitative reasoning to support value-based decisions? A perfect fit could be the Weighted Objectives Method (WOM), a quantitative method described by Cross in his "Engineering design methods: strategies for product design" and commonly used in the field of Systems Engineering to evaluate a set of candidate solutions to the same technical problem. It consists in computing a single score for each alternative and picking the solution corresponding to the highest score. In our context, the alternatives are the development processes (e.g. P1 and P3), while the scores can be computed by following four steps:
1. Assign a weight to each dimension of the space (e.g. cost and time), so that the higher the weight, the more important the dimension; notice that the weights add up to 1.
2. Define a ratio scale, with minimum and maximum values, representing the extent to which a given alternative is appropriate with respect to the given dimension.
3. For each dimension, assign a value of the scale to the current alternative; notice that, regardless of the dimension, the method works in such a way that the higher the value, the better the alternative (e.g., in terms of cost, the cheapest solution is expected to have the highest score).
4. For each alternative, sum the scores assigned to each dimension multiplied by its weight.
Notice that, for dimensions originally associated with a ratio scale, step 2 can be skipped and step 3 is systematic, i.e. the score for those dimensions can be analytically computed based on the value presented by each alternative for those dimensions. For ordinal and nominal scales, instead, steps 2 and 3 are what enables the use of the Weighted Objectives Method, and they must be executed.
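The four steps can be sketched as follows (a minimal illustration; the weights and the per-dimension scores are invented, and both dimensions are assumed to be already rescaled so that higher is better):

```python
# Weighted Objectives Method sketch: one score per alternative,
# computed as the weighted sum of its per-dimension scores.
# Weights and raw values are invented for illustration.

def wom_score(scores, weights):
    """Step 4: weighted sum of the per-dimension scores."""
    return sum(weights[dim] * s for dim, s in scores.items())

# Step 1: weights add up to 1 (time matters more than cost here).
weights = {"time": 0.6, "cost": 0.4}

# Steps 2-3: values already on a 0-10 ratio scale where higher is better
# (e.g. the cheapest alternative gets the highest cost score).
alternatives = {
    "P1": {"time": 8, "cost": 5},
    "P3": {"time": 4, "cost": 9},
}

ranked = sorted(alternatives,
                key=lambda name: wom_score(alternatives[name], weights),
                reverse=True)
print(ranked[0])  # P1: 0.6*8 + 0.4*5 = 6.8 beats P3's 0.6*4 + 0.4*9 = 6.0
```

With a different weight assignment (e.g. cost weighted 0.8), the ranking can flip, which is exactly the flexibility discussed below.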
The WOM is just one of the possible methods; it is easy to apply and does not require a lot of effort. On the other hand, more accurate methods have been proposed in the past [2, 3, 4, 5].
However, the selection of the most appropriate method, which may also be context-specific and vary according to the available resources and support tools, is not the focus of this chapter. The point we want to make here is that, with any such method, we can give rigor to the way the decision maker could leverage the information represented geometrically in Figure 2, so as to support his/her decision making in a value-based fashion.
Figure 3. An example of impact vectors.
We also want to recall the importance of flexibility, listed earlier among the desirable characteristics of the framework. Being able to accommodate different needs (i.e. different goals) is a first step towards flexibility, and using the WOM together with impact vectors contributes further, as different weight assignments to the vector attributes can lead to different outcomes and consequently to different choices. Of course, using the WOM is not mandatory, and a decision maker can pick any preferred method to plug onto the impact vectors to assign importance to the attributes.
Chapter X. QIP and GQM: Continuous Process Improvement via Impact Vectors
X.i. QIP: Quality Improvement Paradigm
The Quality Improvement Paradigm (QIP) is a model designed to support continuous process improvement and the engineering of development processes, and to help in technology infusion. The QIP builds on the premise that all project environments and products are different; in fact, software development is human-based and design-intensive, which makes statistical control (e.g. as used in manufacturing) extremely hard to apply. Therefore, the idea of statistical control of processes is brought into question and replaced, at least partially, with the need for constant experimentation.
The QIP closed-loop approach is represented in Figure 4. The QIP cycle is composed of two closed-loop cycles: organizational and project. The organization-level cycle provides feedback to the organization after the completion of a project, whilst the project-level cycle provides feedback to the project during the execution phase, with the goal of preventing and solving problems, monitoring and supporting the project, and aligning processes with goals.
Figure 4. Quality Improvement Paradigm.

Additional details on each phase of the QIP will be provided later on, when describing the GQM, which can leverage, and is complementary to, the QIP (and vice versa). During that description, we will also remark how the Impact Vector Model complies with and extends the QIP (in addition to the GQM).
X.ii. GQM+S: Goal, Question, Metric + Strategies
The Goal, Question, Metric (GQM) is an approach to software measurement proposed by V. R. Basili and the Software Engineering Laboratory at the NASA Goddard Space Flight Center. The approach is founded on three concepts: goals, questions and metrics.
Goals can be defined according to the GQM template as follows:
Schema        | Examples                                              | Sentence
Object        | Process, product, resource...                         | Analyze [...]
Purpose       | Characterize, understand, evaluate, predict, improve  | For the purpose of [...]
Quality Focus | Cost, correctness, reliability...                     | With respect to [...]
Viewpoint     | Customer, manager, developer, corporation...          | From the point of view of [...]
Context       | Problem factors, resource factors, process factors... | In the context of [...]

Table 1. GQM template.
Starting from the defined goals, questions shall be generated, so as to make the goals quantifiable. Such questions should address the aspects relevant to the achievement of the goals; e.g., they could aim at identifying relevant process areas and at determining the influential characteristics of the product and the context factors that impact the performance of teams.
Given the questions, measures shall be specified to collect data in order to answer the generated questions and track process/product progress with respect to the goals.
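As a small illustration of this goal-question-metric chain, a goal instantiated from the template, with its questions and metrics, can be sketched as a tiny data structure (the concrete goal, question and metric below are invented):

```python
# Sketch of the GQM hierarchy: a goal (filled in from the template)
# is refined into questions, each answered by one or more metrics.
# The concrete example is invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    metrics: list = field(default_factory=list)

@dataclass
class Goal:
    object_: str        # "Analyze [...]"
    purpose: str        # "For the purpose of [...]"
    quality_focus: str  # "With respect to [...]"
    viewpoint: str      # "From the point of view of [...]"
    context: str        # "In the context of [...]"
    questions: list = field(default_factory=list)

goal = Goal(
    object_="the testing process",
    purpose="improve",
    quality_focus="defect detection effectiveness",
    viewpoint="the quality manager",
    context="project X",
    questions=[Question("How many defects escape to the field?",
                        metrics=["post-release defect density"])],
)

print(len(goal.questions), goal.questions[0].metrics[0])
```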
Figure 5. The Goal, Question, Metric.

A concept underlying GQM is that measurement is not a goal per se, but is functional to an end which has a value for the organization. Such value is expressed in the form of goals. The extension of GQM to GQM + Strategies (GQM+S) addresses specifically this issue by introducing the concept of strategy, i.e. a possible approach for the achievement of a goal in the context under study. Consequently, a multi-level hierarchy of goals and questions should be created in order to sufficiently address, decompose and detail the top-level goals, so as to accommodate the organizational structure and the tractability of the problem.
In order to precisely illustrate how the Impact Vector Model can be plugged into GQM+S (or vice versa, depending on the point of view), we will first walk through the process of implementing the GQM+S approach in an organization; then, we will explain which steps of the process can be affected and augmented by the Impact Vector Model.
The process can be summarized in 7 steps, which retrace the QIP (steps 2-7).
1. Initialize. The management should be made aware of and prepared to implement the approach. This includes staffing, assigning responsibilities, training, and planning schedule and cost.
2. Characterize. Processes, products, projects, technologies, customers and the organization should be characterized, as well as pre-existing measurement programs (e.g. repositories, goals or models).
3. Set goals. The organizational units (e.g. business units) involved should be identified and their interrelationships detected. Furthermore, the current status should be precisely described and goals should be set for each organizational unit, along with potential issues, strategies for achieving the goals and data to be used for assessing the goals. This step includes the definition of both metrics and interpretation models to evaluate success at achieving goals.
4. Choose process. Create or reuse measurement programs to collect and analyze data, providing the appropriate level of detail to precisely specify who is responsible for collecting which metrics and the technical procedure to collect them (schedule and technologies).
5. Execute model. Run the strategies, collect and monitor data, assess consistency and completeness, and provide real-time feedback and adjustments.
6. Analyze results. Filter and prepare data, create descriptive statistics, visualize, and explain and evaluate results according to the interpretation models and baselines.
7. Package and improve. Disseminate results, perform a cost-benefit analysis and store reusable and relevant outcomes. Connect to step 2 for the next round of goals.

Figure 6. GQM + Strategy.
Figure 7. GQM process.
X.iii. Modeling Choices and Alternatives with Impact Vectors
The Impact Vector Model builds on and augments the steps of QIP and GQM. First of all, we focus on the process, which is mostly shared by QIP and GQM.
After the first step of initialization, the GQM proceeds with the characterization phase; in terms of the Impact Vector Model, during step 2, the set W of "process, product, project and context attributes" shall be defined. From a geometrical point of view, characterizing the elements listed in step 2 consists, in fact, in defining the Impact Vector space, i.e. the dimensions of the impact vectors.
During step 3, the objective Process Performance Impact Vector (PPIV) is defined. Multiple business goals can still be summarized in a single PPIV: its coordinates, in fact, encompass all the attributes of interest, and different objective values can be set on any number of attributes.
Step 4, i.e. the definition and planning of the data to collect and of the methods for their collection, is conceptually unchanged: it consists in determining the actions needed to collect data and organize them in the form of impact vectors.
During step 5, the geometrical interpretation of the Impact Vector Model provides a quantitative way of measuring and evaluating progress. The distance between the three points (initial PPIV, current PPIV and objective PPIV) represents the extent to which the goal has already been achieved, i.e. its progression. The projection of the current PPIV onto the dimensions on which goals were set during step 3 represents the progress with respect to each single goal. Based on such progress, the decision maker could enact some changes. However, GQM+S does not provide the decision maker with any guidelines or methods on how to pick the best alternative, because the concept of "alternative strategy" is actually not explicit in GQM+S.
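One way to quantify such progression is as the fraction of the initial distance to the objective PPIV that has already been covered; a minimal sketch (the three vectors below are invented):

```python
# Sketch: goal progression as the fraction of the initial distance
# to the objective PPIV already covered by the current PPIV.
# The three vectors are invented for illustration.
import math

def distance(u, v):
    """Euclidean distance between two impact vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def progression(initial, current, objective):
    """1.0 means the objective is reached, 0.0 means no progress yet."""
    total = distance(initial, objective)
    return 1.0 - distance(current, objective) / total

initial = (0.0, 0.0)
objective = (10.0, 10.0)
current = (5.0, 5.0)   # halfway along the ideal path

print(progression(initial, current, objective))  # 0.5
```

Projecting the current PPIV onto a single dimension gives the per-goal progress in the same way, using one coordinate at a time.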
This does not mean that GQM+S ignores this issue: step 7, in fact, includes storing the data of the last iteration into a knowledge base. Such knowledge can be leveraged to make decisions, and, in step 4, it is explicitly stated that existing measurement plans should be accounted for and reused as far as possible to set new goals and make choices.
Nevertheless, no systematic ways of reusing previous plans are part of GQM+S. More in general, the concept of alternative is not explicitly dealt with in the approach, and choices are left to the implementers' craft. Here the Impact Vector Model can provide its contribution.
When previous and potential choices are modeled as impact vectors, estimations and predictions can be built upon them, each of which leads to a point in the impact vector space. In practice, instead of choosing a single strategy, a decision maker picks the entire set of known possible alternatives and builds an augmented GQM graph. This way, something similar to Figure 3 can be created. The next action has already been described earlier: value-based decision making is enabled, and the decision makers can apply their preferred technique to quantitatively come up with the most promising strategies. These strategies will be the nodes in the new GQM graph and the GQM+S process will continue.
As already pointed out, the Impact Vector Model can also be plugged into step 5 of the process, i.e. when real-time feedback is provided at project level instead of corporate level; in this case, the type of choices will be different, e.g. selecting a technology or a testing tool, but the procedure for determining the most promising strategies remains the same.
It is fairly obvious, but still useful to remark, how strongly the QIP influenced GQM+S and how, in turn, the QIP and GQM+S influenced the Impact Vector Model. In particular, impact vectors perform best only in the presence of a continuous learning cycle and when evidence-based approaches are planned to pervade the software development process. Furthermore, since the organization of the impact vector repository, as well as the use of impact vectors, requires some form of analysis, synthesis and re-elaboration, it would be useful to have an ad hoc group mediating between development teams and the repository, e.g. an Experience Factory [6, 7, 8], so as to minimize the formalization effort and support the population of the space of alternatives.
Chapter XI. Impact Vector Model: Points of Contact with CMMI, ISO/IEC 15504 and Six Sigma
Capability Maturity Model Integration (CMMI) [9] is a process improvement training and certification program and service administered and marketed by Carnegie Mellon University. CMMI aims to increase the business value produced by the organization via enhancement of the development process, an aim that our Impact Vector Model shares.
According to its authors, CMMI can be used in process improvement activities as a collection of best practices, i.e. a framework for organizing, prioritizing and coordinating activities. However, CMMI is not a process; rather, it describes the characteristics an effective process should have. CMMI applies to improving technical processes in general, including development processes, service processes, and acquisition processes.
In order to obtain "process improvement", CMMI identifies 22 "process areas", so that the activities composing a process to improve can be mapped to one or more process areas to address. For each process area there are specific "goals", for which specific "practices" are expected. A process area is a cluster of related practices that, when implemented collectively, contribute to satisfying a set of (CMMI) "goals" considered important for making improvement in that area. Depending on which practices are being implemented, an organization can be rated from level 2 to level 5, which corresponds to the capability of the organization. In order to achieve a given level (capability), a pre-determined subset of process areas must be addressed; every level adds some process areas to those necessary to achieve the lower level.
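This staged logic (each level adds process areas to those of the level below) can be sketched as follows; the area abbreviations and their mapping to levels are illustrative placeholders, not the normative CMMI list:

```python
# Sketch of staged rating: an organization reaches a level only if it
# addresses that level's process areas and all those of the lower levels.
# The area sets below are illustrative placeholders, not the CMMI list.

AREAS_BY_LEVEL = {
    2: {"MA", "PMC", "PP", "PPQA"},
    3: {"IPM", "OPD", "DAR"},
    4: {"QPM", "OPP"},
    5: {"OPM"},
}

def rating(implemented):
    """Highest level whose cumulative area set is fully implemented."""
    level, required = 1, set()
    for lvl in sorted(AREAS_BY_LEVEL):
        required |= AREAS_BY_LEVEL[lvl]
        if required <= set(implemented):
            level = lvl
        else:
            break
    return level

print(rating({"MA", "PMC", "PP", "PPQA", "IPM", "OPD", "DAR"}))  # 3
```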
Specific practices can be composed of "sub-practices", i.e., detailed descriptions providing guidance for interpreting and implementing a specific practice; in our taxonomy, the CMMI terms "specific practice" and "sub-practice" correspond to "activity". The CMMI idea of "goal", in the Impact Vector Model, is meant only in terms of measurement, i.e. our requirement is that each CMMI goal should be stated by defining a goal vector, i.e. a PPIV. Therefore, the definition of a goal in our model is more stringent (more similar to "goals" in the GQM paradigm) than the CMMI goal definition.
Along with "specific practices" and "specific goals", as implicitly referred to above, CMMI identifies "generic practices" and "generic goals". The latter describe the characteristics that must be present to institutionalize the practices that are implemented for a process area. Generic practices are activities that ensure that the practices associated with the process area will be effective, repeatable and lasting; they are named generic because they appear in multiple process areas. For our purpose and from a formalization standpoint, the distinction between specific and generic practices and goals is marginally important. Conceptually, each application of an activity, including those referable to generic practices, is an instance of an impact vector.
Impact vector attributes (i.e. the coordinates of each impact vector) do not have an equivalent in CMMI, which postpones measurement to maturity levels 4 and 5 and delegates to process managers the identification of suitable metrics and their management, interpretation and use.
Figure 8. CMMI levels and capabilities.
Summarizing, for companies interested in achieving CMMI levels, impact vectors may play a useful role as soon as an organization aims to achieve level 4 or 5: the use of the collected metrics for process improvement becomes effectively and efficiently viable by using the impact vectors; more specifically, the main targets of the Impact Vector Model are the following process areas:
"Quantitative Project/Work Management" (L4),
"Organizational Process Performance" (L4), and
"Organizational Performance Management" (L5).
At maturity levels lower than 4, impact vectors can still be leveraged for the following process areas:
"Measurement and Analysis" (L2),
"Project/Work Monitoring and Control" (L2),
"Project/Work Planning" (L2),
"Process and Product Quality Assurance" (L2),
"Integrated Project/Work Management" (L3),
"Organizational Process Definition" (L3), and
"Decision Analysis and Resolution" (L3).
At CMMI L2 and L3, the Impact Vector Model can provide qualitative support and a structured approach to addressing decision problems; at L4 and L5, instead, with data being collected systematically, decisions should be based on quantitative information and impact vectors can be fully exploited for control and enhancement purposes. In fact, they can be combined to produce prediction models for different performance attributes, including quality attributes, the cost of activities and sub-processes, and every attribute required to perform quantitative software development management.
ISO/IEC 15504 Information technology - Process assessment [10], also known as Software Process Improvement and Capability Determination (SPICE), is a set of technical standards documents for the software development process. In SPICE, processes are organized into five process "categories": customer/supplier, engineering, supporting, management, and organization. Each process is assigned a "capability level" between 0 and 5; the level is measured using nine "process attributes", each consisting of one or more "generic practices". In turn, each generic practice is composed of "practice indicators", used to assess performance. Performance "assessment" utilizes the degree of fulfillment of the process attributes, measured on the scale {Not achieved, Partially achieved, Largely achieved, Fully achieved}; values of this scale are assigned to attributes based upon evidence collected against the practice indicators. In other words, while the CMMI appraisal is standardized, the SPICE assessment is not entirely so; in fact, the conversion from an indicator to the degree of fulfillment is standardized, but the translation of the collected evidence into an indicator is not unambiguously defined. However, some form of tailoring and flexibility is accepted (and actually planned) for both evaluation processes.
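The four-point fulfillment scale can be sketched as a mapping from a measured achievement percentage to a rating. The band boundaries used below (15%, 50%, 85%) are the commonly cited ISO/IEC 15504 thresholds and are taken here as an assumption:

```python
# Sketch: map the measured achievement of a process attribute (0-100%)
# to the SPICE scale {N, P, L, F}. The band boundaries (15/50/85) are
# the commonly cited ISO/IEC 15504 thresholds, assumed here.

def spice_rating(achievement_pct):
    if achievement_pct <= 15:
        return "N"  # Not achieved
    if achievement_pct <= 50:
        return "P"  # Partially achieved
    if achievement_pct <= 85:
        return "L"  # Largely achieved
    return "F"      # Fully achieved

print([spice_rating(p) for p in (10, 40, 70, 95)])  # ['N', 'P', 'L', 'F']
```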
Figure 9. ISO/IEC 15504 levels and capabilities.
Similarly to the comparison with CMMI, in the case of SPICE too impact vectors can represent a good fit for supporting process management and improvement from the beginning of the path (level 1); however, only at the highest capability levels do they become most useful, when measurement, control and predictability become keywords for further process enhancement.
Six Sigma is a set of tools and techniques for process improvement originally developed by Motorola in 1986 for its manufacturing and business processes. One of the assumptions of Six Sigma is that the development process has "characteristics" that can be measured, analyzed, controlled, and improved. Based on this assumption, Six Sigma encourages continuous efforts to enact a predictable development process, which has to be measured, and includes the definition of quantified value targets, e.g. process cycle time reduction, increase of customer satisfaction, or cost reduction.

Figure 10. Six Sigma process.

Value targets correspond to what we call PPIV, hence GQM goals, and the term "characteristics" to be measured is equivalent to our "attributes". In Six Sigma, as well as in the Impact Vector Model, identifying the characteristics that are "critical to quality" (the so-called CTQs) is one of the most important steps to perform, either when defining or when improving a process, depending on the methodology being applied (DMADV for process definition or DMAIC for process improvement). Additionally, in the case of the definition of a new process (or, by extension, the introduction of a new sub-process into an existing process), Six Sigma requires the identification of alternatives to implement the process, which is something we have already focused on in the previous chapter on GQM+S, when we introduced the concept of choice among different strategies to augment the GQM+S methodology.

Section IV: Impact Vector Framework
"In almost everything, experience is more valuable than precept."
Marcus Tullius Cicero

In Section IV, we present a tool chain that supports the use of the Impact Vector Model in practical contexts. The first tool of the chain, the Impact Vector Schema, is a piece of Java software of our own creation, while the rest of the tool chain leverages existing tools currently used for different purposes.
Chapter XII. Impact Vector Schema: Organize and Store Impact Vectors
For the realization of a tool for organizing, storing, and managing impact vectors, we took inspiration from a paper by Sira Vegas and Victor Basili, "A Characterization Schema for Software Testing Techniques", which focuses on the characterization of software verification techniques. According to that paper, a verification technique can be characterized by a hierarchy of levels, elements, and attributes: each technique is characterized across several levels, each level is composed of several elements, each element is composed of several attributes, and each attribute is assigned a value of some data type.
The term "attribute" in the aforementioned paper has a meaning similar to the one we use in the Impact Vector Model; in fact, if we admit that the value of an attribute can be a probability distribution function on a scale, the term "attribute" makes sense both in the V&V schema by Vegas and Basili and in the Impact Vector Model. However, the V&V schema admits free-text values for its attributes; even though the Impact Vector Model does not explicitly forbid the use of free text for some attributes, it is recommended that admissible values be machine-understandable, so as to enable automatic analysis. Consequently, the Impact Vector Model requires a higher degree of formalism in the characterization of a technique than the V&V schema does.
Also, the V&V schema expects one sheet (i.e., one instance of the schema) per verification technique. In the case of the Impact Vector Model, instead, we expect to have a number of impact vectors referring to the same technique, i.e., one impact vector for each execution of that technique. Subsequently, we can combine the impact vectors referring to the same technique and build a new impact vector that represents the expected performance of that technique, so as to help predict its performance in new applications. In the V&V schema, knowledge about the activity is aggregated in a single sheet (e.g., the cost of a technique falls approximately in a given range, which is supposed to hold for every execution of that technique); this is good for humans, but neither precise enough for the automatic processing purpose we defined nor a good fit for the Impact Vector Model approach.
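As an illustration of this combination step, the sketch below averages the per-execution vectors of a technique into an expected-performance vector. The class name and the mean-based combination are simplifying assumptions of ours; the model admits other combination functions.

```java
import java.util.*;

// Minimal sketch: an impact vector as a map attribute -> numeric value.
// Combining the vectors of several executions of the same technique by
// taking the mean of each attribute yields an "expected performance" vector.
public class ImpactVectorCombiner {

    // Each execution of a technique is recorded as one impact vector.
    public static Map<String, Double> combine(List<Map<String, Double>> executions) {
        Map<String, Double> sums = new HashMap<>();
        Map<String, Integer> counts = new HashMap<>();
        for (Map<String, Double> vector : executions) {
            for (Map.Entry<String, Double> e : vector.entrySet()) {
                sums.merge(e.getKey(), e.getValue(), Double::sum);
                counts.merge(e.getKey(), 1, Integer::sum);
            }
        }
        Map<String, Double> expected = new HashMap<>();
        for (String attr : sums.keySet()) {
            expected.put(attr, sums.get(attr) / counts.get(attr));
        }
        return expected;
    }

    public static void main(String[] args) {
        Map<String, Double> run1 = Map.of("costHours", 10.0, "defectsFound", 4.0);
        Map<String, Double> run2 = Map.of("costHours", 14.0, "defectsFound", 6.0);
        Map<String, Double> expected = combine(List.of(run1, run2));
        System.out.println(expected.get("costHours"));    // 12.0
        System.out.println(expected.get("defectsFound")); // 5.0
    }
}
```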
In conclusion, in order to realize a tool to store impact vectors effectively, we can take inspiration from the V&V schema, but we need to place additional constraints, i.e., the ones defined in the formalization of the Impact Vector Model. Furthermore, we need to widen the focus to include techniques other than verification. Also, structuring attributes into "elements" and "levels" is a good way to cope with management issues: when the number of attributes becomes hardly treatable by a single human, organizing them hierarchically can be an effective way of focusing attention on the dimensions most relevant to the specific context. To allow such a structure, however, we do not need to add further formalism to the Impact Vector Model: we intend elements as sets of attributes and levels as sets of elements, but neither elements nor levels offer specific properties or have a peculiar theoretical meaning; they are just a way of aggregating attributes into an easier-to-manage hierarchy.
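The aggregation just described (elements as sets of attributes, levels as sets of elements) can be sketched as plain named groupings carrying no extra semantics. The names and structure below are illustrative assumptions of ours, not part of the model's formalization:

```java
import java.util.*;

// Sketch of the level/element/attribute aggregation: levels and elements
// are just named groupings that make a large attribute set easier to browse.
public class AttributeHierarchy {
    // level name -> (element name -> attribute names)
    private final Map<String, Map<String, Set<String>>> levels = new LinkedHashMap<>();

    public void addAttribute(String level, String element, String attribute) {
        levels.computeIfAbsent(level, l -> new LinkedHashMap<>())
              .computeIfAbsent(element, e -> new LinkedHashSet<>())
              .add(attribute);
    }

    // Flattening the hierarchy recovers the plain attribute set of the model.
    public Set<String> allAttributes() {
        Set<String> all = new LinkedHashSet<>();
        for (Map<String, Set<String>> elements : levels.values())
            for (Set<String> attrs : elements.values())
                all.addAll(attrs);
        return all;
    }

    public static void main(String[] args) {
        AttributeHierarchy h = new AttributeHierarchy();
        h.addAttribute("Operational", "Cost", "effortHours");
        h.addAttribute("Operational", "Effectiveness", "defectsFound");
        System.out.println(h.allAttributes()); // [effortHours, defectsFound]
    }
}
```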
One last remark is that one could expect a first identification of some relevant attributes, with their consequent organization into levels and elements. However, we recall that the importance of an attribute strictly depends on the context of application. This means that an attribute which is important in a particular context may be of no interest in a different one; for example, the "all-p-uses coverage" metric of an automatic test case generation tool may be relevant for a safety-critical project, where such a coverage metric may be mandatory to collect and maximize up to a given value, but it may be of no interest for a web entertainment application project. Therefore, we do not propose here a set of attributes and their structure; we rather postpone such a task to Section V, where we report on some practical applications of the Impact Vector Model and show some attributes we identified.
In the remainder of this chapter, we list the main requirements we elicited and implemented for IV-Schema, i.e., the tool we realized to store and structure impact vectors.
/""%i% "mpact #ector Schema: Main Re8irements
Requirements are organized in two sets: the first set includes requirements for managing the impact vectors (data), while the second set includes requirements for managing the structure of the data (dimensions and their hierarchical organization). The following synthesis lists the main requirements implemented for the beta version.
A. Managing the structure of the data
a. Manage attributes
i. List attributes
1. By techniques giving it a value: not all impact vectors have a valid value for each attribute, as some attributes make no sense for some impact vectors (e.g., all-p-uses coverage for a risk management activity)
b. Manage activities
c. Manage scales
d. Manage elements
e. Manage levels
B. Managing the impact vectors
a. List
i. By technique
ii. By attribute
iii. By level
b. Create and associate to a technique
c. Edit
i. For a single impact vector, i.e., those attributes not shared by other impact vectors because they refer to a specific instance of an activity (e.g., efficiency attributes)
ii. For all impact vectors referring to the same activity (e.g., name, element, or description)
d. Delete impact vector
e. Export to datasheet
"Manage" requirements include create, read, update, and delete operations, as well as export functions.
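Requirement A.a.i.1 above can be sketched as follows; the class and method names are illustrative assumptions and do not come from the actual IV-Schema code base:

```java
import java.util.*;

// Sketch of "list attributes by techniques giving it a value": only the
// attributes for which a technique's impact vectors actually carry a valid
// value are listed; attributes that make no sense for the technique simply
// never appear among its vectors.
public class AttributeCatalog {
    // technique name -> impact vectors, each vector = attribute -> value
    private final Map<String, List<Map<String, Object>>> vectorsByTechnique = new HashMap<>();

    public void addVector(String technique, Map<String, Object> vector) {
        vectorsByTechnique.computeIfAbsent(technique, t -> new ArrayList<>()).add(vector);
    }

    // Attributes that at least one vector of the technique gives a value to.
    public Set<String> attributesFor(String technique) {
        Set<String> attrs = new TreeSet<>();
        for (Map<String, Object> v : vectorsByTechnique.getOrDefault(technique, List.of()))
            attrs.addAll(v.keySet());
        return attrs;
    }

    public static void main(String[] args) {
        AttributeCatalog c = new AttributeCatalog();
        c.addVector("unit test", Map.of("allPUsesCoverage", 0.8, "effortHours", 6));
        c.addVector("risk management", Map.of("effortHours", 3));
        System.out.println(c.attributesFor("risk management")); // [effortHours]
    }
}
```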
/""%ii% "mpact #ector Schema: $rchitectre and Technolog'
Typical software engineering best practices were employed in the realization of the tool, along with well-known technologies and frameworks. Just to mention some of them: a tailored version of IBM RUP as development process (iterative and incremental); Java as development language; JBoss Seam as development framework, configured to work with JSF and RichFaces, JavaScript, and Ajax for the view layer (besides HTML/CSS), and Hibernate for the data layer.
A view of the data model is reported in the following E-R diagram; the RDBMS used for the current version is MySQL, but a migration to DB2 is planned for reasons that will be made explicit later in this work. For ease of understanding, a "record" in the model represents an impact vector instance, while a "vvnode" represents a technique. "Value" and "datatype", instead, are the way we currently represent a scale, i.e., by means of a member (value) and its semantics (datatype); for this beta version, this representation is enough, but in the future it will require refactoring and improvement in order to accommodate the implementation of algebras for attributes.
The rest of the entities and relationships is intuitive and matches our taxonomy.
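A minimal, plain-Java sketch of these entities may help fix the terminology; it is a simplification of ours (no keys, no Hibernate mappings), not the tool's actual code:

```java
// Plain-object sketch of the entities named above: a "vvnode" is a technique,
// a "record" is one impact vector instance, and a "value" plus its "datatype"
// is the current representation of a scale (member and semantics).
import java.util.ArrayList;
import java.util.List;

public class IvSchemaEntities {

    // A "vvnode" represents a technique being characterized.
    static class VvNode {
        String name;
        List<Record> records = new ArrayList<>(); // its impact vector instances
        VvNode(String name) { this.name = name; }
    }

    // A "record" is one impact vector instance, i.e., one execution.
    static class Record {
        List<Value> values = new ArrayList<>();
    }

    // A raw member plus the semantics needed to interpret it.
    static class Value {
        String attribute;
        String raw;      // e.g., "12.5"
        String datatype; // e.g., "Real", "Boolean", "Ordinal"
        Value(String attribute, String raw, String datatype) {
            this.attribute = attribute; this.raw = raw; this.datatype = datatype;
        }
    }

    public static void main(String[] args) {
        VvNode inspection = new VvNode("code inspection");
        Record run = new Record();
        run.values.add(new Value("effortHours", "12.5", "Real"));
        inspection.records.add(run);
        System.out.println(inspection.records.size()); // 1
    }
}
```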

Figure 11. E-R diagram of IV-Schema
The business logic is implemented via the Home Controller pattern, heavily used by JBoss Seam; the business logic model is reported in the following UML class diagram.

Figure 12. Business Logic Layer: Use of the Home Controller Pattern
The application server used to test the tool is JBoss AS, and the integrated development environment is Eclipse. The complete and exact technology stack is as follows:
MySQL Community Server, v5.1
MySQL Workbench, v5.2
MySQL Java Connector, v5.1
Eclipse IDE for Java EE Developers, v3.6 Helios SR2
JBoss Tools, v3.2
JBoss Application Server, v6.1.0.Final
JBoss Seam, v2.2.2.Final (including Hibernate)
Apache Ant, v1.8.2-4
The current version of the tool can be found at www.mastrofini.it/iv-schema.
Chapter XIII. Using IBM Cognos Insight® to Visualize, Explore, and Analyze Impact Vectors
Once impact vectors and their structure are exported as Excel files, they can be imported into more sophisticated tools for visualization, exploration, and analysis. In particular, we identified IBM Cognos Insight® [12] as an adequate tool for managing multidimensional data, doing reporting, and facilitating the visualization and exploration of impact vectors.
Cognos Insight is a tool for Business Intelligence (BI), where by BI we mean a set of theories, methodologies, processes, and technologies to transform raw data into information useful for business performance improvement. Cognos Insight permits studying business performance, identifying patterns and trends in the data, and planning improvement strategies. Among its main concepts, we highlight the following: (i) workspace: a composition of graphical elements (widgets) for the analysis, exploration, filtering, and visualization of business data; (ii) dimension: a set of pieces of information that describe an aspect of the business, e.g., the geographic dimension or the time dimension; (iii) level: a slice of a dimension, e.g., region, nation, and city are three levels of the geographic dimension; (iv) measure: a quantifiable performance indicator, e.g., sell price or margin; (v) cube: a multidimensional representation of business data in terms of dimensions and measures.
Cognos Insight can import data from csv, xls, xlsx, txt, asc, tab, and cma file formats, but also from proprietary formats and from IBM databases. For this reason, we plan to port the IV-Schema tool from MySQL to DB2, which is natively supported by Cognos Insight. When importing data, Cognos Insight allows mapping the data so as to match the planned structure with its inner representation.
In particular, Cognos Insight tries to automatically map data to identify hierarchies and measures. This is an important feature: as we have seen in the previous chapter, in fact, we want to group and organize impact vectors in hierarchies (i.e., levels, elements, and attributes). With the only issue of taxonomy mismatches, we can easily map IV-Schema (or, equivalently, V&V Schema) members to IBM Cognos Insight members. What Cognos Insight calls a "measure" is actually a dependent variable, i.e., a performance indicator that has to be optimized in order to increase the business value produced by the company. In the Impact Vector Model, every attribute can play the role of a Cognos Insight measure but, of course, some of them make more sense than others, depending on the context and on the needs. Therefore, during the mapping step, a decision maker can indicate which of the attributes being imported should be represented as measures in Cognos Insight.
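The mapping step just described can be sketched as a simple classification of attributes into measures and dimensions; all names below are illustrative assumptions of ours, not Cognos Insight API calls:

```java
import java.util.*;

// Sketch: before handing data to a BI tool, the decision maker flags which
// attributes should become measures; the rest are treated as dimensions.
public class MeasureMapping {

    public static Map<String, String> classify(Set<String> attributes,
                                               Set<String> chosenMeasures) {
        Map<String, String> roles = new TreeMap<>();
        for (String attr : attributes) {
            roles.put(attr, chosenMeasures.contains(attr) ? "measure" : "dimension");
        }
        return roles;
    }

    public static void main(String[] args) {
        Set<String> attrs = Set.of("technique", "teamSize", "effortHours", "defectsFound");
        // Only quantifiable indicators to be optimized are flagged as measures.
        Set<String> measures = Set.of("effortHours", "defectsFound");
        System.out.println(classify(attrs, measures));
        // {defectsFound=measure, effortHours=measure, teamSize=dimension, technique=dimension}
    }
}
```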
" limitation of -ognos $nsight is that it can manage non nominal scales of measure! 'owever#
nominal scales can e dealt with as non-measures and otain an eGuivalent effect in terms of
visualiEation# e*ploration and analysis capaility! 2ne more limitation# instead# is that the data
types Hproaility distriution functionI and Hmass functionI are not managed! 2n the other
side# visualiEing this type of data may e prolematic and we are currently investigating ways to
cope with this issue# as well as whether resolving this is issue can ring some added value to the
tool chain and to the entire framework!
"dditional features of the tool include: drill up and drill downF sortingF e*ploration points .for
filtering and organiEing data/F calculationsF time rollup! -ognos $nsight offers a numer of chart
types# that can e automatically generated starting from the availale data! Slicing and dicing of
data is really efficient when done via -ognos $nsight and it reveals a numer of aspects of
interest of the data under study!
Finally# commercial versions of -ognos $nsight also allow to share workspaces and strategies
among users# so to enale spreading the information across the company and a collaorative
approach to data production and provisioning# according to the estalished access rights to the
platform!
'owever# -ognos $nsight only supports Gualitative analyses and reporting# so additional tools
are needed for Guantitative statistics# hypothesis testing and data mining! $BM SPSS Modeler
can e integrated with -ognos $nsight# as the first can read and write data for the latter! SPSS
supports the creation of predictive analytics and advanced statistical analysis# so to
complement most of the functionality reGuired for data analysis! 2n the side of $BM tools# there
are many other offerings# oth commercial .e!g! y Microsoft and 2racle/ and freeJopen source
.e!g! @eka/! For time constraints# we did not yet perform an e*tensive comparative study to
identify the statistical and data mining tools that est fit the structure of our model!
However, once we jump into the statistical or data mining world, the concept of impact vector becomes less visible and has a minor influence on the way such analyses are performed; therefore, we do not investigate this aspect any further in this work, and we redirect the interested reader to more focused books and manuals.
Chapter XIV. IBM Method Composer® & IBM Team Concert®: Infuse, Estimate, Manage, Measure, and Improve with Impact Vectors
One last point of interest, on which we want to focus for the definition of a tool chain for impact vectors, is how to concretely employ the Impact Vector Model via existing tools used to manage the development process.
The first tool we want to focus on is IBM Rational Method Composer (RMC), a flexible process management platform, built on top of Eclipse, with a method authoring tool and a process asset library to help implement measured improvement.
Through RMC, one can create, tailor, manage, and publish process descriptions. In particular, among the many features, we want to highlight the possibility of creating estimation models. As explained in the IBM information center (http://pic.dhe.ibm.com/infocenter/rmchelp/v7r5m1), an estimating model is a way of calculating estimates that can use any number of estimating factors. An estimating factor is something that has a quantitative influence on the effort required to perform a task or activity. Examples of estimating factors are use case complexity (complex, medium, simple), business areas, platforms, screens, classes, and number of use cases. In order to compute an estimate, estimating formulas are used. An estimating formula is a relatively simple algebraic equation, and the estimating factors are the parameters used in that equation. After the estimating factors are defined, the estimation formula can be used to calculate the estimated effort required to complete a task or work breakdown element. The result of the calculation is typically expressed in hours.
Estimating factors can be associated with a task or activity, and estimates and summary values can be computed via a predefined list of formulas that cannot be edited. As in the Impact Vector Model, RMC provides a bottom-up estimating approach: estimates are created for tasks in a work breakdown structure (WBS) for a project and summed through the WBS to arrive at an estimate for the whole project. The depth at which the bottom-up approach starts is not necessarily a leaf of the WBS: one can select any level and then climb the WBS structure up to the root of the project.
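The bottom-up rollup described above can be sketched as follows. The tree structure and the plain summation are simplified assumptions of ours; RMC's own predefined (non-editable) formulas may differ:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the bottom-up approach: each WBS element either carries its own
// estimate (a leaf, or the level chosen as starting depth) or sums its
// children's estimates, climbing up to the root of the project.
public class WbsEstimate {

    static class Node {
        String name;
        double ownEstimateHours;            // used only when children is empty
        List<Node> children = new ArrayList<>();
        Node(String name, double hours) { this.name = name; this.ownEstimateHours = hours; }
    }

    // A node's estimate is the sum of its children's estimates.
    public static double estimate(Node node) {
        if (node.children.isEmpty()) return node.ownEstimateHours;
        double total = 0.0;
        for (Node child : node.children) total += estimate(child);
        return total;
    }

    public static void main(String[] args) {
        Node project = new Node("project", 0);
        Node design = new Node("design", 0);
        design.children.add(new Node("define architecture", 40));
        design.children.add(new Node("white box design", 60));
        project.children.add(design);
        project.children.add(new Node("unit test", 30));
        System.out.println(estimate(project)); // 130.0
    }
}
```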
Summarizing, RMC fixes the set of combination functions to a predefined one, allows estimating the effort of an activity or task, and supports the definition of estimation factors (i.e., impact vector attributes). Even though the combination function is a crucial part of the Impact Vector Model, and effort is not the only dependent variable we are interested in estimating, RMC resulted the most useful tool to infuse the Impact Vector Model into the practice of an enterprise. Its native integration with Rational Team Concert (RTC), another IBM tool for lifecycle management, allows carrying the impact vectors throughout the process tasks of teams' everyday work, so that those tasks can be characterized, planned, monitored, measured, and improved with fine-grained, quantitative management of the development process.
For further details on RMC and RTC, we refer the reader to the next section, where examples of application and integration between these tools and the Impact Vector Model are proposed.
Again, we want to remark that our survey of the state of the practice was not exhaustive, but rather driven by our industrial partners. For the future, we plan to investigate new tools and alternatives to support the Impact Vector Model.
In conclusion, as of today, the tool chain suggested for using the Impact Vector Model in a practical context is represented in Figure 13.

Figure 13. The recommended end-to-end tool chain for using the Impact Vector Model.


Section V: Impact Vector Model: Experimentation


In this Section, excerpts from our published papers and from our work at companies are reported, as documentation of the practical application of the Impact Vector Model.
Even though we did not have the time to fully employ impact vectors in a single, encompassing context, we actually had the chance to experiment with nearly all parts of the framework, in isolation and in different contexts, with different goals and different issues. Therefore, for presentation reasons, we decided to create an abstract company, Impact-One, and to report our real experiences as if they had been performed in this context.
Furthermore, given the criticality and sensitiveness of the content of our studies, we often obfuscate the results and provide only partial views, according to the restriction and classification policies enacted by our partners. So, the reported results, though representative and realistic, do not necessarily reflect reality.
We take the opportunity to mention the organizations that contributed to the experimental validation of our model and supported our investigations. In order of intervention during the validation process:
University of Rome Tor Vergata, Italy (2010-2013);
MBDA Italia, Italy (2011-2012);
Fraunhofer Center for Experimental Software Engineering - Maryland, USA (2012-2013);
NASA, USA (2012);
IBM Italia, Italy (2012);
Deymind, USA (2013).
Chapter XV. Impact-One: a Virtuous Company Leveraging the Impact Vector Framework
XV.i. First Steps for Applying the Impact Vector Model: Context
Impact-One is a philanthropic, young, innovative, and flourishing organization operating in the field of complex medical device development. The devices developed by the company operate in an integrated way, so as to form a network of devices collecting a lot of data from many patients. Data are stored on geographically distributed servers that interact and exchange data transparently to the devices. Impact-One technology then supports physicians in their diagnoses via a distributed web application, which assures privacy to patients while interactively guiding the physician through the evaluation of plausible alternatives via data exploration, including data collected on other servers.

Therefore, Impact-One IT progress requires a diverse and broad set of skills within the software division.
The company has so far outsourced the development of all its software to a couple of companies; however, software development, though not under the direct control of Impact-One, has always been executed in tight connection and continuous collaboration with the rest of the company's teams. Management has now decided to incorporate software development into the company as a new, internal division, with the goal of reducing software development costs while maintaining the current quality of the produced software. In order to achieve this goal successfully, the Quality Division has been assigned a new software quality assurance head: Jed Squa. Squa's goal is to support the company in defining a sustainable, predictable, innovative, and high-end software development process; he is highly skilled in the Impact Vector Model methodology.
Without entering into the organizational details of all of Impact-One's divisions, we just list them and provide some overview information:
Development division (cyan squares and yellow squares on the left side of Figure 14), including the 3 groups of Electronics, Mechanics, and System (cyan), now extended to 4 groups with the inclusion of Software (yellow);
Quality division (green and brown squares, in the middle of Figure 14), with one Quality Assurance (QA) group per development group (green), including software QA (brown);
Human Resources division (dark blue squares on the right);
Marketing division (very dark blue squares on the right);
Finance division (light blue squares);
Administration division (very light blue squares), with its Legal and Accounting groups.
If we focus on the Software Development group within the Development Division, the following teams exist: Architecture and Analysis (3 people); Design (4 people); Coding (11 people).
The Software QA (SQA) group within the Quality Division is composed of 8 people: an SQA lead (i.e., Jed Squa), a team leader for each software verification technique in use (i.e., testing, formal methods, simulation, and inspections), a responsible for software reliability, one for software safety, and one for the software process. SQA team leaders have the authority to require that Development people be assigned to assurance activities. The Software Development group and the SQA group form the so-called software intergroup.
Since many consultants have been working full time at the company for several months, some of them were hired as employees and already have a good baseline of experience with the domain; however, a few new people were also hired and assigned to the Software Development group.


Figure 14 Organizational Structure
All development groups but Software have a well-established development process, clear standards to follow, and consolidated interaction dynamics among groups and teams. The part of those dynamics most interesting to the new software group is the interaction with the System group, which takes care of: (i) system requirements definition, breakdown, and allocation to subsystems; (ii) system integration, i.e., putting together the developed subsystems and assessing, and possibly forwarding, change requests coming from other development groups. The software team is therefore expected to integrate with those dynamics and become part of the game. This leads Squa to believe that the most critical point is project time: a team that is not well consolidated, and whose interaction dynamics are neither established nor known, may stretch development time unexpectedly. For this reason, particular care shall be taken over project time control. On the other side, it would be unreasonably expensive to implement the same development process both for the embedded software and for the web platform; therefore, a very rigorous process shall be proposed for the embedded software, while a more flexible and lightweight process shall be implemented by the teams working on the web software.
XV.ii. First Steps: Activity Identification, Definition, and Characterization
Having given the general description of the company, we now focus on the software intergroup. More specifically, we begin with the work of establishing a development process for the embedded software; this step starts by identifying and defining the process activities required to be performed. As part of activity definition, their characterization in terms of measurable indicators is also expected. In the taxonomy of the Impact Vector Model, Dr. Jed Squa knows well that this means defining the set of attributes we are interested in considering in the context under study. This is the first, mandatory step in applying the Impact Vector framework.
After a deep analysis of the state of the practice and the state of the art, at Dr. Squa's suggestion, the company proposed to adopt the IBM Rational Harmony Process for Embedded Software® [13], tailored to accommodate the organizational structure, contexts, needs, and goals. Alongside this process, the IBM Rational Rhapsody® tool was selected as the Integrated Development Environment. Afterwards, a meticulous analysis of each single activity and task in Harmony was started, along with a review of the related literature, with the goal of concretely and precisely characterizing the activities' performance in a measurable way.
As an example, if we take the task "Definition of the architecture", the identified metrics of interest are:
Average time spent to define each architectural element;
Average time spent to define all communication links for a given architectural element;
Average time required to generate all architectural elements related to a single use case.
It should seem obvious that those metrics are some sort of dependent variables, because we would like to tune the activity, i.e., to manipulate some other attributes of the activity, in order to optimize those metrics (i.e., to reduce the average times as much as possible without compromising quality). Attributes to manipulate include:
Presence in the architecture definition team of an architect with more than 20 years of experience, a Boolean value;
Presence in the architecture definition team of an architect with at least 10 years of experience, a Boolean value;
Presence in the architecture definition team of an architect with at least 5 years of experience, a Boolean value;
Number of people directly involved in the definition of the architecture, an Integer value;
Average number of years of experience of all involved architects, a Real value;
Degree of innovation of the selected technologies (e.g., frameworks or tools), a value on an ordinal scale: very low (no new technology), low (1 new technology selected, for which a solid historical record of application exists in the literature), medium (2-5 new technologies selected, for which solid historical records of application exist in the literature), high (1 new technology selected for which no historical record of application exists in the literature), critical (more than one new technology selected for which no historical records of application exist in the literature);
Average number of requirements change requests generated per architecture element.
So the first three metrics, along with the seven attributes just listed, are dimensions of the space in which the impact vectors will live and evolve.
Every task of our tailored version of Harmony will contribute to the Impact Vector Space by possibly introducing further dimensions, so the number of dimensions can increase significantly. However, not all dimensions are relevant to every task of Harmony: this can be managed by assigning the zero value to the insignificant dimensions. For example, the average number of requirements change requests generated per architecture element could have an irrelevant influence on the performance of a task like unit testing (actually, it does not even make sense for such a task). Based on the formulation of the Impact Vector Model, Dr. Squa knows that this is not a big deal: the analysis of the data collected over time will elicit this fact when building the Activity Performance Function (we will later see how that happens).
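The zero-value convention for irrelevant dimensions can be sketched as a sparse vector; the names below are illustrative assumptions of ours:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch: the impact vector space has many dimensions, but each task stores
// only the dimensions that matter to it; reading any other dimension yields
// the zero value, as prescribed for insignificant dimensions.
public class SparseImpactVector {
    private final Map<String, Double> dimensions = new HashMap<>();

    public void set(String dimension, double value) {
        dimensions.put(dimension, value);
    }

    // Irrelevant dimensions are simply absent and read as zero.
    public double get(String dimension) {
        return dimensions.getOrDefault(dimension, 0.0);
    }

    public static void main(String[] args) {
        SparseImpactVector unitTest = new SparseImpactVector();
        unitTest.set("avgTimePerTestCaseMin", 12.0);
        // "Change requests per architecture element" is meaningless for unit
        // testing, so it was never set and defaults to zero.
        System.out.println(unitTest.get("changeRequestsPerArchElement")); // 0.0
        System.out.println(unitTest.get("avgTimePerTestCaseMin"));        // 12.0
    }
}
```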
In this paragraph, we have given a partial example of activity characterization. In the next paragraph, we provide a much more detailed description of the proposed process, including not only activities, but also tools, roles, and built-in metrics that can help address our main goal: project time control. We also show how all descriptions were inserted into IBM Rational Method Composer®, the tool we mentioned and recommended for use with impact vectors. Nevertheless, we do not focus here on the independent variables, i.e., those characteristics of the activities that need to be tuned in order to achieve the planned goal; this further aspect is the object of the next Section.
Chapter XVI. Tailoring the IBM Rational Harmony for Embedded Software
$n this part of the work# we elaorate on the $BM 5ational 'armony process for the
development of emedded devices# as the ones eing developed y $mpact-2ne!
/#"%i% Process Description
"s previously mentioned# Systems Engineers provide the other development groups# including
the Software ,roup# with some susystem reGuirements specification! Starting from these
reGuirements# the Software ,roup is e*pected to e*ecute the following 9: activities# which we
group y < phases!
I. System Requirements Analysis
A1. Analyze Stakeholder requests.
A2. Define System validation plan.
A3. Specify Requirements and trace them (e.g., Traceability matrix) to Stakeholder
requests.
A4. Review Requirements, Traceability, and System validation plan; perform risk
analysis and produce the Risk analysis document.
II. System Functional Analysis
A5. Identify and trace Use cases to Requirements (Context Diagram, CD, and Use
Case Diagram, UCD).
A6. Define Black box use case design (Activity Diagrams, AD; Sequence Diagrams, SD;
Interfaces and Ports diagram, I&P; State Machines, SM; and Black-box Definition
Diagram, BDD).
A7. Define and trace System test plan.
A8. Verify Black box use case design through execution and produce related Reports.
A9. Review Black box use case design, System test plan, and Reports, and update the
Risk analysis document.
III. Design Synthesis
A10. Define System architecture (Internal Block Diagram, IBD).
A11. Define White box use case design (AD/SD/I&P/SM/BDD/IBD).
A12. Define System integration test plan.
"3>! 0erify @hite o* use case design through e*ecution and produce related
5eports!
"3A! 5eview System architecture# @hite o* use case design# System integration test
plan# 5eports# and update 5isk analysis document!
$0! Unit Implementation and Test
"3B! Develop or generate units and trace them to @hite o* use case design!
"3;! Define and trace 1nit test plan!
"3<! Perform unit test# identify ug resolution strategies# and correct prolems!
0! System Integration Test
"34! Perform system integration test# identify ug resolution strategies# and correct
prolems!
0$! System Test
"37! Perform system test# identify ug resolution strategies# and correct prolems!
0$$! System Validation
"9:! Perform system validation!
Figure 15 is a graphical representation of the process, reporting phases, activities, generated
artifacts, allowed paths, and decision gates.
The process is incremental, in the sense that the system can be decomposed into subsystems
(or parts) which may be developed independently, either by different teams, at different times,
or in different increments, depending on schedule and priorities. Obviously, for each
decomposition step, a corresponding integration step shall be planned.
The process is completely iterative. Specifically, it is within-phase iterative and between-phase
iterative.
It is within-phase iterative because each one of the phases is iterative. Activities within that
phase may be executed as many times as needed in order to achieve the required level of detail
and fix all encountered problems, without enacting any particular policy. E.g., when specifying
requirements (activity A3, Phase "System Requirements Analysis"), it is possible to come back
to the system validation plan and update it.
The process is between-phase iterative because it is possible to go back to previous phases as
many times as needed. In fact, between each couple of phases, there is a decision gate. Given a
phase n, at its end, the sunny-day path (green outgoing arrow from the decision gate) allows
moving to the next phase n+1, whereas the rainy-day path (red outgoing arrow from the decision
gate) takes back to the previous decision gate, i.e., the one between phase n-1 and phase n.
Such a decision gate chain creates the possibility to roll back to previous phases of the process.

Figure 15 Tailored Harmony Process Description
However, developers cannot freely go back from any activity of a given phase to an activity of a
different (previous) phase. In order to go back to an activity of a previous phase, in fact,
developers need to go through the decision gate at the end of the phase they are currently in.
This is not a limitation, because the process is incremental and each phase is iterative, thus
there is no risk of falling back to a waterfall process model. Furthermore, when an issue is
detected, developers may step back as far as the last passed decision gate.
The presence of decision gates, which are the unique points to switch from one phase to
another, helps enforce the process. Suppose that, for example, during activity A11, the team
detects a defect, e.g., a missing use case. The team goes back to decision gate G3, where the
following question shall be answered: is the detected defect caused by "missing or incorrect
black box use case design"? In case it is, the team goes back again, up to decision gate G2.
Suppose that no requirement is missing: then activity A5 "Identify use cases and trace to
requirements" is performed, where the identification of the new use case is followed by its
tracing back to requirements. If free iteration between activities of different phases were
allowed, then developers could have directly performed A6 and possibly skipped some
fundamental activities, i.e., traceability. Enforcing the developers through decision gates aims to
avoid such kinds of situations and keep the process consistent.
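The gate chain can be sketched as a small simulation. The phase names come from the process above; the function itself is an illustrative assumption, not tooling from the thesis:

```python
# Sketch of the decision-gate chain: a team may only fall back one gate at a
# time toward earlier phases -- never jump directly to an arbitrary earlier
# activity.

PHASES = [
    "System Requirements Analysis",   # gate G1 at its end
    "System Functional Analysis",     # gate G2
    "Design Synthesis",               # gate G3
    "Unit Implementation and Test",
]

def roll_back(current_phase_index, gates_to_cross):
    """Walk back through the gate chain, one decision gate per step."""
    index = current_phase_index
    for _ in range(gates_to_cross):
        if index == 0:
            break  # already in the first phase; no earlier gate exists
        index -= 1  # take the rainy-day path through the gate behind us
    return PHASES[index]

# A defect found in A11 (Design Synthesis) whose root cause is a missing use
# case sends the team back one gate, into System Functional Analysis:
roll_back(2, 1)
```

Crossing each gate in turn, rather than jumping, is exactly what forces the team to re-answer the gate question (e.g., "missing or incorrect black box use case design?") at every level.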
Passing through the following milestones enables the team to proceed to the next phase.
For each iteration, a milestone needs to be planned and met, in order to pass through the
decision gate. One more milestone (the first) is also planned after the stakeholder requests
analysis.
M1. Understood Stakeholder requests.
M2. Reviewed and approved Requirements, Traceability, System validation plan, and Risk
analysis document.
M3. Reviewed and approved Black box use case design, System test plan, Reports, and Risk
analysis document.
M4. Reviewed and approved Architecture, White box use case design, System integration
test plan, Reports, and Risk analysis.
M5. Implemented units, and performed and passed unit test.
M6. Integrated units, and performed and passed system integration test.
M7. Integrated all units, and performed and passed system test.
M8. Performed and passed system validation.
"s part of the =uality $mprovement Paradigm# i!e! the approach leveraged to define the
proposed process# also a set of metrics for goal-driven process measurement were defined:
metrics aim to provide Guantitative data to assess the time reGuired y of each process activity!
This should, in turn, provide information to project leaders to control the process and to pursue
the goal of reducing project time. The proposed metrics are the first step towards the definition
of a measurement strategy, which is planned to be consolidated during the next iteration of the
proposed methodology.
!"#$i$%$ Process Description: an E&ample
Figure 16 shows an example of the application of our tailored Harmony process. Specifically,
the example is composed of two nested executions of the process: the outer execution is at
system level, the inner at software level. Process phases are named according to the Harmony
process, i.e., Requirements Analysis, Functional Analysis, and Design Synthesis. At the end of
the execution of the last phase at system level, unit implementation should be planned. As
explained in paragraph "Develop or Generate Units and Trace to White Box Use Case Design",
such activity may encompass a further application of the process at a lower level of detail.
Specifically, it consists in the development of a software subsystem. Software subsystem
development passes through the same phases as system development. This time, the unit
implementation (i.e., software implementation) consists in generating the source code.
Verification phases and artifacts, as well as non-functional requirements management, are not
reported for presentation reasons. In short, the process execution is:
- System Requirements Analysis: given the Stakeholder requirements, System
requirements are derived.
- System Functional Analysis: based on System requirements, the System use cases
group System functional requirements at Black box design level, and SysML diagrams
specify system functions.
- System Design Synthesis: for each System use case, a set of SysML System blocks is
created and detailed, each representing a part of the system. This way, the System white
box design is created.
- Software Requirements Analysis: a subset of System requirements is allocated to
software, and the Software Requirements Engineering discipline derives Software
requirements from System requirements.
- Software Functional Analysis and Design Synthesis: based on Software requirements,
the Software use cases group Software functional requirements. Each Software use
case shall also derive from a System use case and implement it in toto or in part. This
way, the Software black box design is created.
For each Software use case, a set of UML Classes is created and specified, each
deriving from or being part of at least one SysML System Block. This way, the Software
white box design is created.
- Given UML Classes, source code (i.e., .c, .cpp and .h files) is generated.
Figure 16 also shows the traceability relationships among artifacts produced in the process phases.

Figure 16 Process Artifacts, Documentation and Tools across an example instantiation of the development life cycle
/#"%ii% Roles in )r Tailored <armon' Process
The main roles to assign during process execution are at least 7, and they are listed in the
following.
1. Stakeholder, who represents an interest group whose needs must be satisfied by the
project. This role requires specific domain knowledge and may be assigned to one or
more staff members, even though such staff also has one of the following roles:
Requirements Specifier or Tester.
2. Project Manager, who plans, manages and allocates resources to tasks; shapes
priorities; coordinates interactions with customers and users; keeps the project team
focused; selects the tools and documents to use; decides the documentation to generate;
deliberates the achievement of a goal or the accomplishment of a task; and schedules meetings, peer
reviews and milestone reviews as appropriate. The Project Manager also establishes a
set of practices that ensure the integrity and quality of project work products. A Project
Manager may also be a Software Architect. Process mastering is also part of this role.
3. Requirements Specifier, who specifies and maintains the detailed system requirements,
including non-functional requirements and system constraints. S/he specifies and
maintains use cases, i.e., the Black-Box Use Case Design. S/he understands user needs. The
role may be assigned together with the role of Test Designer.
4. Test Designer, who identifies, defines and conducts the required tests. This includes
identifying and describing the appropriate techniques, tools and guidelines to
implement the required tests. The role also represents stakeholders who do not have
direct or regular representation on the project. A Test Designer may also be a Software
Architect, or a Designer.
5. Architect, who has overall responsibility for driving the major technical decisions,
expressed as the architecture. This typically includes identifying and documenting the
architecturally significant aspects of the system, including requirements, design,
implementation, and deployment "views" of the system. The architect is also
responsible for providing rationale for these decisions, balancing the concerns of the
various stakeholders, driving down technical risks, and ensuring that decisions are
effectively communicated, validated, and adhered to. The architect is also responsible for planning
the integration. If the project is large enough, an architecture team may be created. For
smaller projects, a single person may act as both project manager and software
architect. In both cases, software architecture is a full-time function, with staff
permanently dedicated to it.
6. Designer, who leads the design of a part of the system, within the constraints of the
requirements, architecture, and development process for the project. S/he identifies
and defines the responsibilities, operations, attributes, and relationships of design
elements (e.g., Blocks, Activity Diagrams, Sequence Diagrams, Statechart diagrams,
Interface and Port definition diagrams). S/he ensures that the design is consistent with
the architecture. It is common for a person to act as both implementer and designer,
taking on the responsibilities of both roles.
7. Implementer, who is responsible for developing and testing components, in accordance
with the project's adopted standards, for integration into larger subsystems. When test
components, such as drivers or stubs, must be created to support testing, the
implementer is also responsible for developing and testing the test components and
corresponding subsystems. It is possible for two persons to act as the implementer for a
single part of the system, either by dividing responsibilities between themselves or by
performing tasks together, as in a pair-programming approach.
Further roles may be required for specific activities, including:
- Independent quality management.
- Configuration management.
- Change management.
- Process enforcement and monitoring.
Also, specific roles may be required for specialists and experts, e.g., tool specialist, security
expert, safety engineer and so on.
However, the selected roles are a trade-off between having too many roles and having too many
activities assigned to the same role.
!"#$ii$%$ Mapping 'etween (oles and Activities
Only a part of the existing roles is involved in each activity. When involved in an activity, a role may
be either primary (i.e., the role leads the activity and is responsible for decisions) or secondary
(i.e., the role is involved in the activity and its presence is mandatory). The mapping between roles
and activities is reported in Figure 17, which also details the specific tasks each role is
expected to perform during each activity.
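One way to picture the primary/secondary distinction is as a small data structure. This is a hypothetical encoding, shown for two activities only; the assignments are taken from the activity descriptions later in this chapter, while the full mapping lives in Figure 17:

```python
# Illustrative role-activity mapping (subset). Primary roles lead the
# activity and own its decisions; secondary roles must attend.

ROLE_MAP = {
    "A1": {  # Analyze Stakeholder requests
        "primary": ["Project Manager", "Stakeholder", "Requirements Specifier"],
        "secondary": [],
    },
    "A2": {  # Define System validation plan
        "primary": ["Stakeholder", "Project Manager"],
        "secondary": ["Requirements Specifier", "Test Designer"],
    },
}

def leads(activity):
    """Primary roles lead the activity and are responsible for decisions."""
    return ROLE_MAP[activity]["primary"]

def must_attend(activity):
    """Both primary and secondary roles are mandatory attendees."""
    return ROLE_MAP[activity]["primary"] + ROLE_MAP[activity]["secondary"]
```

Such a table makes it trivial to check, before scheduling a meeting, that every mandatory role is actually represented.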
/#"%iii% Description of Process $cti*ities
!"#$iii$%$ Anal)*e Stakeholder (e+uests
The objective of the requirements analysis phase is to analyze the process inputs, i.e.,
stakeholder requests from systems engineers. For the specific context, as stated in "Software
Development Process Enhancement: Baseline Document", the input to software engineers is
composed of a sort of functional specification in SysML, which is quite incomplete. Non-
functional requirements are also provided alongside the functional specification, and they are
initially incomplete and informal.
Because of this, the work to be performed on the stakeholder requests (i.e., the functional and non-
functional specification) is considerable. In particular, a step of requirements elicitation is
fundamental to come up with a set of software requirements which may be used by software
engineers. In order to perform requirements elicitation, an overall understanding of the system
is mandatory. With the aim of having a shared understanding of the system among software team
members, a global analysis of stakeholder requests and needs is performed in the form of a
meeting.

Figure 17 Mapping between roles and activities
In Harmony, a Stakeholder Requirements Document is generated after analyzing and refining
the stakeholder requests. In the context of Impact-One, it is not necessary to develop an ad
hoc document, as stakeholder requests are already formalized in the so-called Sub-System
Specification (SSS).
The following roles shall be involved in the activity as primary roles:
- Project manager, who plans the meeting, drives the communication and defines the
scope of the system.
- Stakeholder, who presents its needs and constraints.
- Requirements specifier, who elicits needs.
The following metrics shall be considered:
- Average time spent analyzing each existing stakeholder request during the activity
(requires the number of stakeholder requests).
- Average time spent modifying each existing stakeholder request during the activity
(requires the number of modified stakeholder requests).
- Average time spent adding each stakeholder request during the activity (requires
the number of stakeholder requests before and after the activity).
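All of the proposed metrics share the same "total effort divided by item count" shape, which could be computed along these lines. This is a hedged sketch; the function name and the choice of minutes as the unit are illustrative, not part of the thesis tooling:

```python
# Generic helper for the "average time per item" metrics: divide the total
# time logged for an activity by the number of items it touched (stakeholder
# requests analyzed, requirements traced, test cases reviewed, ...).

def average_time(total_minutes, item_count):
    """Average time spent per item; zero items yields zero, not an error."""
    if item_count == 0:
        return 0.0
    return total_minutes / item_count

# E.g., 300 minutes spent analyzing 25 stakeholder requests gives an
# average of 12 minutes per request.
per_request = average_time(300, 25)
```

Collecting these averages over time is what later lets project leaders compare activity performance across iterations.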
!"#$iii$,$ Define S)stem "alidation Plan
" plan for validating stakeholder reGuests should e defined! "greement upon that plan
etween systems engineers and pro)ect manager is mandatory! Eventually# the validation plan
shall include the detailed description of all test scenarios systems engineers are willing to
e*ecute and their e*pected results! &he population of the validation plan is iterative# ut any
modification to that plan shall go through a change management process! $n fact# changes to
the validation plan may lead to considerale rework on the pro)ect# as they may impact
reGuirements and their understanding!
&he following shall e involved in the activity as primary roles:
Stakeholder# who presents its needs for validation!
Pro)ect manager# who accepts and agrees upon the stakeholder validation plan!
&he following shall e involved in the activity as secondary roles:
5eGuirements Specifier# which participates in the activity for a deeper understanding of
stakeholder needs!
&est Designer# who models test scenarios for validation and create the needed
documentation!
&he following metrics shall e considered:
"verage time spent for defining each validation test case .reGuires the numer of
validation test cases/!
"verage time spent for defining all validation test cases relating to a stakeholder reGuest
.reGuires the average numer of validation test cases per stakeholder reGuest/!
!"#$iii$-$ Specif) (e+uirements and raceabilit) to Stakeholder (e+uests
"fter analyEing stakeholder reGuests# the SystemJSoftware 5eGuirements document .S5S/ shall
e generated! Such document formaliEes functional and non-functional reGuirements in the
form of Hshall statementsI! Each statement descries a single functionality of the systems!
&ogether with S5S# a traceaility matri* maps stakeholder reGuests to system reGuirements#
and a 5eGuirements Diagram models reGuirements in SysM6! Such a diagram helps
understanding relationships among reGuirements .e!g! generaliEation or derivation/# and may
support a well-structured traceaility policy: for e*ample# whenever a verification tool .e!g! &est
-onductor/ does not provide functionality for tracing reGuirements to test cases# SysM6
reGuirements .i!e!# model elements/ may e traced to seGuence diagrams modeling test
scenarios .still model elements/!
&raceaility to stakeholder reGuests# instead# is reGuired to:
3! -heck that each stakeholders+ need is addressed in at least one reGuirement!
9! Each specified reGuirement is motivated y at least one actual need!
>! Support the change management process!
&he following shall e involved in the activity as primary roles:
5eGuirements specifier# who formaliEes functional and non-functional reGuirements!
&he following metrics shall e considered:
"verage time spent for defining each reGuirements during the activity .reGuires the
numer of reGuirements/!
"verage time spent for tracing each reGuirement to stakeholder reGuests .reGuires the
numer of traces from reGuirements to stakeholder reGuests/!
"verage time spent for tracing each stakeholder reGuest to reGuirements .reGuires the
numer of traces from stakeholder reGuests to reGuirements/!
"verage time spent to define all reGuirements deriving from a given stakeholder reGuest
.reGuires the numer of reGuirements and the numer of stakeholder reGuests/!
"verage time spent on the reGuirements diagram per added reGuirement .reGuirements
the numer of reGuirements efore and after ending the activity/!


Figure 18: Requirements Diagram
!"#$iii$.$ (eviews and (isk Anal)sis
" software review is a Hprocess or meeting during which a software product is e*amined y a
pro)ect personnel# managers# users# customers# user representatives# or other interested
parties for comment or approvalI X3AY! Different degrees of formalism for such reviews may e
adopted# depending on company process constraints# pro)ect constraints and needs!
"long the process# at least the following reviews shall e performed within the pro)ect
development:
5eview of reGuirements# traceaility# and risk analysis!
5eview Black Bo* 1-s# traceaility# system test plan# reports# and risk analysis!
5eview @hite Bo* 1-s# integration test plan# reports# and risk analysis!
"dditional reviews shall e planned whereas the company forces their e*ecution at specific
stages of the development! Such additional reviews should not match any of the previously
listed three reviews!
5eviews are to e scheduled y the pro)ect leader! Details for each review are provided in the
following suparagraphs! During each meeting# a role has in charge to lead the review!
In every review, traceability and risk analysis shall also be discussed and checked; a specific
subparagraph describes their review.
!"#$iii$/$ (eview of (e+uirements
During a requirements review, the Requirements Specification Checklist (see later in this Section),
the Project Constraints and Performance Checklist, and the Test Checklist shall be used.
The following shall be involved in the activity as primary roles:
- Project manager, who plans the meeting and deliberates the meeting end when no
TODOs or doubts on the documents are left.
- Requirements specifier, who leads the meeting by reading and explaining the specified
requirements and presenting arisen doubts, issues and risks.
- Stakeholder, who confirms and validates requirements.
The following shall be involved in the activity as secondary roles:
- Architect, who participates in the meeting and understands architecture constraints.
- Designer, who participates in the meeting and understands design constraints.
The following metrics shall be considered:
- Average time spent reviewing each requirement (requires the number of
reviewed requirements).
!"#$iii$0$ (eview of 'lack 'o& 1se 2ase Design3 S)stem est Plan3 and "erification
(eports
Reviewing the system test plan means checking whether a sufficient number and types of test cases
have been defined and planned for subsequent system test activities (not model execution, but
code execution, possibly in the target environment or a simulator). "Sufficient" includes at least
two test cases for each function, i.e., an expected input and an unexpected input, even though,
in general, coverage criteria shall be defined and satisfied by the plan. When such criteria are
defined, reviewing the test plan document means checking that the coverage criteria are met.
Reviewing the verification report means checking whether: (1) the model execution output matches
the expected output in the corresponding system test case, and (2) the model execution flow
matches the expected flow as specified in the requirements.
During the Black Box Use Case Design review, the Use Case Model Checklist, Software
Architecture Checklist, and Test Checklist shall be used.
The following shall be involved in the activity as primary roles:
- Project manager, who plans the meeting and deliberates the meeting end when no
TODOs or doubts on the documents are left.
"rchitect# who leads the meeting y reading the Black Bo* 1se -ase Design# and
presenting arisen douts# issues and risks!
&est Designer# who descries the system test plan# i!e! system test cases associated to
Black Bo* 1se -ase Design!
&he following shall e involved in the activity as secondary roles:
5eGuirement specifier# who participates in the review y checking that all reGuirements
are met and constraints satisfied!
Designer# who participates in the meeting and understands design constraints!
&he following metrics shall e considered:
"verage time spent for reviewing each lack o* use case .reGuires the numer of
reviewed lack o* use case/!
"verage time spent for reviewing each diagram of the lack o* use case design
.reGuires the average numer of diagrams in the design/!
"verage time spent for reviewing all diagrams of a given lack o* use case .reGuires the
average numer of diagrams per lack o* use case/!
"verage time spent for reviewing each lock defined for the lack o* use case design
.reGuires the numer of locks in the design/
"verage time spent for reviewing all locks defined for a given lack o* use case
.reGuires the average numer of locks per lack o* use case/
"verage time spent for reviewing each system test case .reGuires the numer of test
cases/!
"verage time spent for reviewing each verification report entry .reGuires the numer
verification report entry/!
!"#$iii$4$ (eview of Architecture3 5hite 'o& 1se 2ase Design3 S)stem #ntegration
est Plan3 and (eports
Reviewing the system integration test plan means checking whether a sufficient number and types of
test cases have been defined and planned for subsequent integration test activities (not model
execution, but code execution, possibly in the target environment or a simulator). "Sufficient"
includes at least two test cases for each function, i.e., an expected input and an unexpected
input, even though, in general, coverage criteria shall be defined and satisfied by the plan.
When such criteria are defined, reviewing the test plan document means checking that the coverage
criteria are met.
Reviewing the verification report means checking whether: (1) the model execution output matches
the expected output in the corresponding system integration test case, and (2) the model execution
flow matches the expected flow as specified in the Black Box Use Case Design.
During the White Box Use Case Design review, the Use Case Model Checklist, Subsystem
Design Checklist, and Test Checklist shall be used.
The following shall be involved in the activity as primary roles:
- Project manager, who plans the meeting and deliberates the meeting end when no
TODOs or doubts on the documents are left.
- Architect, who checks whether the provided architecture and constraints are met.
- Designer, who leads the meeting by reading the White Box Use Case Design, and
presenting arisen doubts, issues and risks.
The following shall be involved in the activity as secondary roles:
- Test Designer, who describes the integration test plan, i.e., the integration test cases
associated to the White Box Use Case Design.
The following metrics shall be considered:
- Average time spent reviewing the architecture per architectural block (requires the
number of architectural blocks).
- Average time spent reviewing each white box use case (requires the number of
reviewed white box use cases).
- Average time spent reviewing each diagram of a white box use case design (requires
the average number of diagrams for each white box use case design).
- Average time spent reviewing each system integration test case (requires the
number of test cases).
- Average time spent reviewing each verification report entry (requires the number of
verification report entries).
!"#$iii$6$ (eview raceabilit) Matrices
Reviewing traceability means checking some properties for all mappings, i.e.:
- Stakeholder requests to requirements.
- Requirements (including non-functional) to Use Cases (black box design).
- Use cases (white box) to units and to system integration test cases.
- Units to unit test cases.
The properties to check are:
1. All higher-level artifacts have been traced to at least one lower-level artifact.
2. All lower-level artifacts are traced back to at least one higher-level artifact.
>! "ll trace links are correct# i!e! the traceaility makes sense for each trace link!
A! "ll trace links are complete# i!e! there is no trace link missing fromJto an artifact!
The following metrics shall be considered:
- Average time spent reviewing each trace from the lower level to the higher level
(requires the number of traces from the lower level to the higher level).
- Average time spent reviewing each trace from the higher level to the lower level
(requires the number of traces from the higher level to the lower level).
!"#$iii$7$ Anal)sis of (isks
" numer of aspects and threats should e analyEed during each review! $n case something
goes wrong# it may turn to have a ma)or impact on pro)ect progress# e!g!: much rework to e
performed# the pro)ect to fail# to overrun udget or schedule# or not to meet stakeholder
needs! "spects and threats to e accounted include at least: achievaility and reasonaility of
performance goalsF volatility of reGuirementsF roustness of the architecture with respect to
changeF mortality of resourcesF adeGuate documentation of algorithms and design rationaleF
process assurance!
$n case a structured risk management su-process is part of the company culture# such plan
should e updated and augmented according to review findings!
&he following shall e involved in the activity as primary roles:
Pro)ect manager# who drives the analysis and accept mitigation plans and strategies!
Since all other roles# ut the stakeholder# are involved in at least one of the reviews# they shall
collaorate and contriute# from their specific points of view# to the risk analysis performed
during the meetings they participate in!
&he following metrics shall e considered:
"verage time spent for analyEing each risk .reGuires the numer of analyEed risks/!
!"#$iii$%8$ #dentif) and race 1se-cases to (e+uirements
The first step is to create a use case model, which starts with the identification of actors and use
cases.
An actor is a model element which represents an entity external to the system under
development, i.e., an entity whose development is not part of the current project, but which
offers and requires interfaces for interacting with the system under development.
A use case is a model element which captures a required system behavior from the perspective
of the end-user. It is not a function of the system, but rather a set of related scenarios. A
scenario is a non-ranching seGuence of interactions etween actors and system under
development! &hus# a use case collects all situations a user can find in when trying to achieve a
high-level goal the system is reGuired to serve!
Both actors and use cases are part of a system use case diagram, which includes: all use cases; both the one primary actor (i.e. the actor activating the use case) and all secondary actors (i.e. non-primary actors involved in the use case) for each use case; and all relationships among actors (i.e. generalization), among use cases (i.e. extension and inclusion), and between actors and use cases (i.e. association).
Figure 19 System use case diagram in Rhapsody
Subsequently, a traceability matrix is incrementally built up, which maps requirements to use cases. This traceability flow-down supports the change management process, and allows checking whether all requirements have been allocated to use cases.
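The completeness check just described lends itself to automation once the matrix is available. The sketch below assumes a plain dictionary representation of the matrix; the requirement and use case identifiers are invented for illustration:

```python
# Traceability matrix: requirement ID -> use cases the requirement is allocated to.
traceability_matrix = {
    "REQ-1": {"UC-1"},
    "REQ-2": {"UC-1", "UC-2"},
    "REQ-3": set(),  # not yet allocated to any use case
}

def unallocated_requirements(matrix):
    """Requirements no use case covers yet; should be empty by the end of the activity."""
    return sorted(req for req, use_cases in matrix.items() if not use_cases)

print(unallocated_requirements(traceability_matrix))  # -> ['REQ-3']
```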
Figure 20 Traceability matrix from Requirements to Use Cases (Black-Box) in Rhapsody
"lso# allocation of reGuirements can e formaliEed through the use of the SysM6 HallocationI
relationships! Some development tools# including 5hapsody# automatically create an allocation
relationship when the traceaility matri* is populated etween use cases and reGuirements!
&he following shall e involved in the activity as primary roles:
5eGuirements specifier# who performs all tasks!
&he following metrics shall e considered:
"verage time spent for identifying each use case .reGuires the numer of use identified
cases/!
"verage time spent for reviewing each trace from uses cases to reGuirements .reGuires
the numer of traces from uses cases to reGuirements/!
"verage time spent for reviewing each trace from the reGuirements to uses cases
.reGuires the numer of traces from the reGuirements to use cases/!
!"#$iii$%%$ 'lack 'o& 1se 2ase Design
&he design of Black Bo* 1se -ase is the same as in 'armony and may e performed in several
ways# i!e! any possile path in the following activity diagram# taken from the 'armony
Deskook v>!3!9!
Figure 21 Harmony Process to design Use Cases (Black Box Design)
The first step is the definition of the system context in an Internal Block Diagram. The elements of this diagram are instances of SysML blocks, representing each use case and its associated actor(s).
Figure 22 System use case model context in Rhapsody
"t this stage# locks are empty and not linked# and their definition is functional to the model-
ased approach we are willing to apply! "ny path in the activity includes the creation of: one
activity diagram per use caseF a numer of seGuence diagrams .one per scenario/F one
statechart diagram! "lso# any path includes the definition of ports and interfaces in an internal
lock diagram associated to the use case! &he previous representation of use cases as SysM6
lock allows to automatically and immediately link each use case to its internal structure and
ehavior!

Figure 23 Activity Diagram in Rhapsody
Figure 24 Sequence Diagram in Rhapsody
Figure 25 Internal Block Diagram in Rhapsody
Figure 26 Statechart diagram in Rhapsody
The output of the Black Box Use Case Design is the construction of an executable model.
The following shall be involved in the activity as primary roles:
Requirements specifier, who performs all tasks.
The following metrics shall be considered:
Average time spent for defining each black box use case (requires the number of defined black box use cases).
Average time spent for defining each diagram of the black box use case design (requires the average number of diagrams in the design).
Average time spent for defining all diagrams of a given black box use case (requires the average number of diagrams per black box use case).
Average time spent for defining each block defined for the black box use case design (requires the number of blocks in the design).
Average time spent for defining all blocks defined for a given black box use case (requires the average number of blocks per use case).
Average time spent for defining all use cases deriving from a given requirement (requires the average number of use cases deriving from each requirement).
!"#$iii$%,$ Define and race S)stem3 S)stem #ntegration3 and 1nit est Plans
Based on the gathered and formaliEed reGuirements a plan for testing the system shall e
defined!
Based on the specified susystem interfaces# a plan for testing the system integration shall e
defined!
Based on the developed units# a plan testing each unit shall e defined!
"ll plans include all details reGuired to run a test case and check whether the actual output
matches the e*pected output# and whether the actual ehavior matches the e*pected
ehavior! "n appropriate coverage of system cases over system reGuirements .and of systems
integration test over interface reGuirements# and of unit test over unit reGuirements/ is
e*pected to e achieved through system and system integration test! -riteria for formaliEing
the term Happropriate coverageI and making it o)ective are to e defined in advance# also
according to system non-functional reGuirements .e!g! reliaility# safety or performance/! "lso
traceaility matrices are reGuired: one should trace each reGuirement to a set of system test
cases constituting the system test planF another one should trace each interface reGuirements
.i!e! each operation/ to a set of system integration test casesF one last should trace each unit
reGuirements .i!e! each class method/ to a set of unit test cases!!
The following shall be involved in the activity as primary roles:
Test Designer, who defines, details and models system test cases associated to the Black Box Use Case Design and system integration test cases associated to the White Box Use Case Design. S/he is also responsible for generating the required documents.
Requirements specifier (only for system test plan), who assures that coverage criteria are met by the system test plan for verifying the correct system functionality.
Architect and Designer (only for system integration test plan), who assure that coverage criteria are met by the system integration test plan for verifying the correct interaction among subsystems. Both Architect and Designer roles are required, because they are both involved in the White Box Use Case Design, even though they operate at different levels of abstraction (higher for the first role, lower for the last role).
Implementer (only for unit test plan), who assures that coverage criteria are met by the unit test plan for verifying the correct unit functionality.
The following metrics shall be considered:
Average time spent for defining each system test case (requires the number of system test cases).
Average time spent for defining all system test cases for a given requirement (requires the average number of system test cases per requirement).
Average time spent for defining each system integration test case (requires the number of system integration test cases).
Average time spent for defining all system integration test cases for a given communication link between blocks (requires the average number of system integration test cases per communication link between blocks).
Average time spent for defining each unit test case (requires the number of unit test cases).
Average time spent for defining all unit test cases for a given unit (requires the average number of unit test cases per unit).
!"#$iii$%-$ "erif) 'lack-'o& and 5hite 'o& 1se 2ase Design through E&ecution
&he use case model shall e analyEed through model e*ecution! &he primary focus of this form
of verification is on checking the generated seGuences of messages in response to the given
stimuli# rather than on testing the underlying functionality! &his last aspect# in fact# will e
checked later in the development# as the focus is on verifying the correct interaction etween
the system under development and the environment!
In order to execute the Use Case Model, execution scenarios shall be defined and run, e.g. through Rhapsody animations. This way, for a given use case and a given scenario, the output of the execution of the use case model can be matched to the expected black box system behavior definition (i.e. the specified behavior).
Figure 27 Scenario (Sequence Diagram) comparison in Rhapsody
Detected problems, e.g. errors, incompleteness, or inconsistencies concerning requirements, shall be addressed and fixed.
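Conceptually, this verification compares the generated message sequence of a scenario against the specified one and reports where they diverge. A minimal sketch (the message names are invented; tools such as Rhapsody present the comparison graphically):

```python
def first_divergence(specified, generated):
    """Index of the first differing message between two scenarios, or None."""
    for i, (want, got) in enumerate(zip(specified, generated)):
        if want != got:
            return i
    if len(specified) != len(generated):
        # One sequence is a strict prefix of the other.
        return min(len(specified), len(generated))
    return None

specified = ["scanBiometricData", "authenticateBiometricData", "displayStatus"]
generated = ["scanBiometricData", "authenticateBiometricData", "logFailure"]
print(first_divergence(specified, generated))  # -> 2
```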
The following shall be involved in the activity as primary roles:
Requirements specifier, who executes the model.
The following metrics shall be considered:
Average time spent for defining each executable model (requires the number of defined executable models).
Average time spent for defining all executable models for a given use case (requires the average number of executable models per use case).
!"#$iii$%.$ Define S)stem Architecture
Defining the system architecture# in 'armony words# Hmeans to determine the est means of
achieving the capaility of a particular function in a rational mannerI! " way for doing that is
applying the weighted o)ectives method .-ross 3747/# which allows to potential solutions to
functional prolems! Such approach includes a seGuence of steps# i!e!:
Identify key system functions, i.e. sets of related activities defined in the Black Box Use Case model which may be realized by a single architectural component; e.g. "Capture Biometric Data" may group the activities of "scanning biometric data", "authenticating biometric data", and "displaying authentication status", which may be a sub-set of the activities of the use case "Control entry to the system".
Define a set of candidate solutions for each key system function, e.g. "Fingerprint scanner" and "Optical scanner".
Identify assessment (solution) criteria for each key system function.
Figure 28 Example of assessment criteria
"ssign weights to assessment criteria# according to their relative importance!

Figure 29 Example of weight assignment to assessment criteria
Define utility curves for each assessment criterion, i.e. curves where the x axis is the extent to which an assessment criterion may be met, and the y axis is a measure of effectiveness for that criterion. Some criteria, in fact, have a constantly increasing utility function, i.e. the more, the better (e.g. reliability), while other criteria may have a step function.
Figure 30 Example of Utility Curve Shapes
Figure 31 Example of cost utility function
"ssign measures of effectiveness to candidate solutions for each criterion! &he measure
of effectiveness is relative value with respect to a nominal value of the corresponding
utility curve!

Figure 32 Example of assignment of measures of effectiveness
Determine the solution by weighted objectives calculation: for each assessment criterion, multiply the measure of effectiveness by its weight, so obtaining its weighted value. Add up the weighted values for all assessment criteria to produce a score for that candidate solution. The selected solution is the one with the highest score.
Figure 33 Example of solution determination
Merge possible solutions to form the system architecture. In fact, the previous study on utility curves, assessment criteria, measures of effectiveness and weighted objectives calculation is to be performed for each key system function. Thus, each key system function will generate an architectural solution, which needs to be merged with all architectures generated by analyzing other key system functions. The system architecture is the result of merging such key system function architectures.
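The calculation at the heart of the method is a small weighted sum per candidate. The sketch below works it through for the hypothetical "Capture Biometric Data" function; criteria, weights and measures of effectiveness are invented for illustration:

```python
# Assessment criteria with weights expressing relative importance.
weights = {"cost": 0.3, "reliability": 0.5, "scan speed": 0.2}

# Measures of effectiveness (0-10) per candidate solution, per criterion.
candidates = {
    "Fingerprint scanner": {"cost": 8, "reliability": 6, "scan speed": 9},
    "Optical scanner": {"cost": 4, "reliability": 9, "scan speed": 6},
}

def weighted_score(effectiveness, weights):
    """Sum over criteria of (measure of effectiveness x criterion weight)."""
    return sum(effectiveness[criterion] * w for criterion, w in weights.items())

scores = {name: weighted_score(moe, weights) for name, moe in candidates.items()}
# Fingerprint: 8*0.3 + 6*0.5 + 9*0.2 = 7.2; Optical: 4*0.3 + 9*0.5 + 6*0.2 = 6.9
print(max(scores, key=scores.get))  # -> Fingerprint scanner
```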
The following shall be involved in the activity as primary roles:
Architect, who makes decisions regarding the architecture and performs architecture analysis (e.g. application of the weighted objectives method), by taking into account both functional and non-functional requirements.
The following shall be involved in the activity as secondary roles:
Designer, who collaborates in the definition of the architecture by providing feedback on the feasibility of the architecture itself.
The following metrics shall be considered:
Average time spent for defining each architectural element (requires the number of defined architectural elements).
Average time spent for defining all communication links for a given architectural element (requires the average number of communication links per architectural element).
Average time to generate all architectural blocks derived from a given use case (requires the number of black box use cases).
!"#$iii$%/$ 5hite 'o& 1se 2ase Design
Based on the system architecture concept# the system under development shall e
decomposed into parts! Such decomposition shall e documented in Block Definition Diagrams
and $nternal Block Diagrams and may e performed in a numer of suseGuent steps# i!e! at
gradually lower levels of astraction! Each lack o* activity diagram shall e transformed into
a white o* one y creating swim lanes# each representing a part of the system# and allocating
activities to those lanes!

Figure >A Passing from Black Bo* "ctivity Diagram .left/ to @hite Bo* "ctivity Diagram .right/ X3>Y
$f an activity cannot e allocated to a single lock# then it needs to e roken down into at least
two activities!
Given white box activity diagrams, white box sequence diagrams shall be derived. In such diagrams, interactions among subsystems are modeled. Lifelines in white box sequence diagrams correspond to blocks representing subsystems (or, at a lower level, to parts belonging to subsystem blocks). When decomposing a block into parts, block operations shall be partitioned among the parts and possibly decomposed into more operations (in case a single operation cannot be allocated to one part). In this last case, the creation of a dependency between the lower level operations and the higher level operation is required to keep trace of that decomposition. When an operation does not need to be broken down into more operations, it can be just copied to the part to which it has to be allocated.
Ports and interfaces shall also be defined, which is made easier when modeling subsystem interaction by means of white box sequence diagrams, where each message or service represents an interface requirement. Once ports and interfaces are defined, the state-based subsystem behaviors are captured into statechart diagrams.
Notice that no explicit traceability is necessary during this activity. In fact, a one-to-one mapping is planned between use cases identified during Black Box Use Case Design and the use cases to be detailed during White Box Use Case Design. However, during review activities, at least completeness shall be verified, i.e. that every use case identified during Black Box Use Case Design has been considered during White Box Use Case Design.
During decomposition, architectural and design patterns may be leveraged. For example, the Boundary-Control-Entity pattern (Eclipse 2010) may contribute to defining flexible, reusable, easy-to-understand and well-engineered subsystems.
An entity element is responsible for some meaningful chunk of information. This does not imply that entities are (or store) data. Entities perform meaningful behavior around some cohesive amount of data. If they do not, then they are just "beans", not "entities". An entity interacts with other entities or with control elements.
A control element manages the flow of interaction of the scenario. A control element could manage the end-to-end behavior of a scenario or it could manage the interactions between a subset of the elements. A control element interacts with any kind of element (i.e. boundary, entity or other control). Behavior and business rules relating to the information relevant to the scenario should be assigned to the entities; the control elements are responsible only for the flow of the scenario.
" oundary element lies on the periphery of a system or susystem# ut within it .as opposite
to actors# which are outside/! For any scenario eing considered# either across the whole
system or within a susystem# some oundary elements will e the Horder elementsI that
accept input from outside# and provide output to e*ternal elements# according to the interface
specification of the system or susystem they elong to! Sometimes oundary elements match
,raphical 1ser $nterface elements# ut this is not a rule! " oundary element interacts with a
control element or those oundary elements strictly correlated to it# i!e! not with those
oundary elements managing different scenarios .e!g! interface with other susystems or a
different user interface/!
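As a minimal sketch of these three responsibilities (the class names and the entry-control scenario are invented; the pattern itself is language-agnostic):

```python
class BiometricRecord:
    """Entity: owns a cohesive chunk of information and the behavior around it."""
    def __init__(self, template):
        self.template = template

    def matches(self, sample):
        # Business rule lives on the entity, not on the controller.
        return sample == self.template

class ScannerPanel:
    """Boundary: accepts input from outside the (sub)system."""
    def read_sample(self, raw_input):
        return raw_input.strip().lower()

class EntryController:
    """Control: orchestrates the scenario flow, owns no business rules."""
    def __init__(self, boundary, entity):
        self.boundary = boundary
        self.entity = entity

    def control_entry(self, raw_input):
        sample = self.boundary.read_sample(raw_input)
        return "granted" if self.entity.matches(sample) else "denied"

controller = EntryController(ScannerPanel(), BiometricRecord("alice-print"))
print(controller.control_entry("  Alice-Print "))  # -> granted
```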
The following shall be involved in the activity as primary roles:
Designer, who defines subsystem internal structure and behavior, allocates operations, and decomposes system blocks.
The following shall be involved in the activity as secondary roles:
Architect, who ensures that the architecture is understood and respected.
The following metrics shall be considered:
Average time spent for defining each white box use case (requires the number of defined white box use cases).
Average time spent for defining each diagram of the white box use case design (requires the average number of diagrams in the design).
Average time spent for defining all diagrams of a given white box use case (requires the average number of diagrams per white box use case).
Average time spent for defining each block defined for the white box use case design (requires the number of blocks in the design).
Average time spent for defining all blocks defined for a given white box use case (requires the average number of blocks per use case).
!"#$iii$%0$ Develop or 9enerate 1nits and race to 5hite 'o& 1se 2ase Design
$n a model-ased approach# it is possile to generate code from design models! Specifically# it
should e possile to generate source code from 1M6 classes! @hen source code cannot e
completely generated# a hyrid approach is also allowed# where code is partially generated and
then manual coding is reGuired to completely implement class ehaviors! $n general# to develop
a unit may also e seen as a task of high level! For e*ample a unit may e a system in a system-
of-systems! $n this case# after applying all process activities at system-of-systems-level#
development of a unit may e demanded to a system development team! @hite Bo* 1se -ase
Design# in this case# represents the set of stakeholder reGuests to provide to the system
development team! &hus# that team may start its development process# possily applying the
same process# ut a lower level of astraction and comple*ity!
However, in both situations, i.e. code to be written or generated, and a system to be developed, each unit will need to trace to some design elements. Specifically, each unit shall have at least one use case to refer to. Conversely, each block or part of the White Box Use Case Design shall be implemented by at least one unit.
The following shall be involved in the activity as primary roles:
Implementer, who writes or generates unit code and traces units to model elements of the White Box Use Case Design.
The following metrics shall be considered:
Average time spent for generating each leaf unit (i.e. code) (requires the number of generated units).
"verage time reGuired to generate all units deriving from a given use case .reGuires the
average numer of units per use case/!
!"#$iii$%4$ Perform 1nit est3 S)stem #ntegration est3 and S)stem est
Plans for unit test# system integration test# and system test already e*ist at this stage of the
process# as they were previously defined! &he test case e*ecution can e automatic# if a test
environment .partially/ support that# or manual! $n the last case# drivers and stu may e
reGuired!
@hen a ug is detected# a correction action shall e identified and their impact on e*isting
system estimated! $f reGuired# a prolem report may e generated and provided as input to the
change management process! $f the ug correction does not impact any other part of the
system# it can e turned into practice and the test is run one more time# possily with
leveraging regression testing!
The following shall be involved in the activity as primary roles:
Test Designer, who executes test cases, matches actual output against expected output and documents results.
The following shall be involved in the activity as secondary roles:
Implementer (only for unit test), who provides support to unit test execution and analysis.
Architect and Designer (only for system integration test), who provide support to system integration test execution and analysis.
Requirements specifier (only for system test), who provides support to system test execution and analysis.
Project Manager (only for system test), who deliberates when the system meets all requirements and is ready to go to validation.
The following metrics shall be considered:
Average time spent for running all test cases for a given unit/communication link/requirement (requires the number of test cases).
Average time spent for fixing a bug detected during unit/system integration/system test (requires the number of detected bugs).
!"#$iii$%6$ Perform S)stem "alidation
System validation is a formal meeting in which the stakeholder receives a work product that is
e*pected to satisfy its needs! During validation# stakeholders ase on the system validation plan
:4 | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

to run scenarios! &he correct e*ecution of each of those scenarios is mandatory to finaliEe
pro)ect development!
The following shall be involved in the activity as primary roles:
Project Manager, who leads the activity and shows how the product fulfills stakeholder requirements.
Stakeholder, who checks whether the product meets its needs and what is planned in the system validation plan.
The following metrics shall be considered:
Average time spent for running all validation test cases for a given stakeholder request (requires the number of validation test cases).
Average time spent for fixing a bug detected during validation (requires the number of detected bugs).
!"#$iii$%7$ Further Activities
Most pro)ect management activities are hard to fit into specific steps of the proposed
development process# as most of them are asynchronous or constantly ongoing during pro)ect
development! Such activities include at least:
3! assign roles to resourcesF
9! plan# manage and allocate rolesJresources to tasksF
>! shape prioritiesF
A! coordinate interactions with customers and usersF
B! select tools to useF
;! estalish formal documents to generateF
<! delierate achievement of a goal
4! delierate accomplishment of a taskF
7! schedule meetings# peer reviews and milestone reviews as appropriate!
/#"%i*% $rtifacts and Tools
During process e*ecution# a numer of artifacts shall e generated# including te*t documents#
SysM6 and 1M6 documents# and test cases! Furthermore# different development environments
may e used to support the creation of such artifacts! Specifically# > kinds of tools were
identified: 5eGuirements Management &ool# Model-Based Development &ool# and 0erification
Environment!
Figure >B traces: .a/ artifacts to process phase in which it is generatedF ./ generated artifacts to
su-folders .or eGuivalent structures/ in which they should e stored within the development
environmentsF .c/ each artifact to the kind of tool e*pected to manage it!
Figure 35 Mapping between artifacts and tools
!"#$iv$%$ 2hecklists: a Support ool for (eviews
During reviews .e!g! 5eGuirements# &est Plan or Design reviews/ the use of checklists may
support the verification task addressing those system aspects which freGuently represent
sources of prolems or malfunctioning! $n the ne*t paragraphs some checklist mainly coming
from 51P are presented and suggested for use during process reviews!
Requirements Specification Checklist
Check whether the SRS exhibits fundamental characteristics, including:
1. Correct: Is every requirement stated in the SRS one that the software should meet?
2. Unambiguous: Does each requirement have one, and only one, interpretation? Has the customer's language been used? Have diagrams been used to augment the natural language descriptions?
3. Complete: Does the SRS include all significant requirements, whether related to functionality, performance, design constraints, attributes, or external interfaces? Have the expected ranges of input values in all possible scenarios been identified and addressed? Have responses been included to both valid and invalid input values? Do all figures, tables and diagrams include full labels and references and definitions of all terms and units of measure? Have all TBDs been resolved or addressed? For each error or exception, a policy defines how the system is restored to a "normal" state. For each possible type of input error from the user or wrong data from external systems, a policy defines how the system is restored to a "normal" state.
4. Consistent: Does this SRS agree with the Vision document, the use-case model and the Supplementary Specifications? Does it agree with any other higher-level specifications? Is it internally consistent, with no subset of individual requirements described in it in conflict?
5. Ability to Rank Requirements: Has each requirement been tagged with an identifier to indicate either the importance or stability of that particular requirement? Have other significant attributes for properly determining priority been identified?
6. Verifiable: Is every requirement stated in the SRS verifiable? Does there exist some finite cost-effective process with which a person or machine can check that the software product meets the requirement?
7. Modifiable: Are the structure and style of the SRS such that any changes to the requirements can be made easily, completely, and consistently while retaining the structure and style? Has redundancy been identified, minimized and cross-referenced?
8. Traceable: Does each requirement have a clear identifier? Is the origin of each requirement clear? Is backward traceability maintained by explicitly referencing earlier artifacts? Is a reasonable amount of forward traceability maintained to artifacts spawned by the SRS?
Project Constraints and Performance Checklist
1. Have all requirements derived from existing standards and regulations been specified? How will this be traced?
2. Are there any required standards in effect, implementation language, policies for database integrity, resource limits, operating environment(s), etc.?
3. What is the software supposed to do? This should include the following:
a. Validity checks on the inputs
b. General responses to abnormal situations, including: overflow, communication facilities, error handling and recovery
c. Effects of parameters
d. Relationship of outputs to inputs, including input/output sequences and formulas for input to output conversion
4. What are the speed, availability, response time, recovery time of various software functions, etc.? Are both static and dynamic requirements included?
5. What are the reliability, availability, portability, correctness, maintainability, security, etc. considerations?
6. All nominal and maximal performance requirements are specified.
7. Performance requirements are reasonable and reflect real constraints in the problem domain.
8. The workload analysis model provides estimates of system performance that indicate which performance requirements, if any, are risks.
9. Bottleneck objects have been identified and strategies defined to avoid performance bottlenecks.
10. Collaboration message counts are appropriate given the problem domain.
11. Executable start-up (initialization) time is within acceptable limits as defined by the requirements.
Use Case Model Checklist
1. Have you found all the actors?
2. Is each actor involved with at least one use case?
3. All functional requirements are mapped to at least one use case.
4. All non-functional requirements that must be satisfied by specific use cases have been mapped to those use cases.
Software Architecture Checklist
1. The system has a single consistent, coherent architecture.
2. The number and types of components are reasonable.
3. The system will meet its availability targets.
4. The architecture will permit the system to be recovered in the event of a failure within the required amount of time.
5. The architecture defines clear interfaces to enable partitioning for parallel team development.
6. System performance estimates have been validated using architectural prototypes, especially for performance-critical requirements.
7. The architecture provides for recovery in the event of disaster or system failure.
Subsystem Design Checklist
1. The name of each subsystem is unique and descriptive of the collective responsibilities of the subsystem.
2. The subsystem description accurately reflects the collective responsibilities of the subsystem.
3. The subsystem, through its interfaces, presents a single, logically consistent set of services.
4. The subsystem is the responsibility of a single individual or team.
5. The subsystem realizes at least one interface.
6. The interfaces realized by the subsystem are clearly identified and the dependencies are correctly documented.
7. The subsystem's dependencies on other model elements are restricted to interfaces and packages to which the subsystem has a compilation dependency.
8. The information needed to effectively use the subsystem is documented in the subsystem facade.
9. Other than the interfaces realized by the subsystem, the subsystem's contents are completely encapsulated.
10. Each operation on an interface realized by the subsystem is utilized in some collaboration.
11. Each operation on an interface realized by the subsystem is realized by a model element (or a collaboration of model elements) within the subsystem.
12. Subsystem partitioning is done in a logically consistent way across the entire model.
13. The contents of the subsystem are fully encapsulated behind its interfaces.
Test Checklist
1. Each test case states the expected result and the method of evaluating the result.
2. Test cases have been identified to execute all product requirement behaviors in the target-of-test.
3. For each requirement for test, at least two test cases have been identified.
4. Each project requirement (as stated in use cases or the supplemental specifications) has at least one associated requirement for test, or a statement justifying why it is not a requirement for test.
5. A clear and concise test strategy is documented for each type of test to be implemented and executed. For each test strategy, the following information has been clearly stated: (1) the name of the test and its objective; (2) a description of how the test will be implemented and executed; (3) a description of the metrics, measurement methods, and criteria to be used to evaluate the quality of the target-of-test and the completion of the test.
6. Have sufficient tests been implemented to achieve acceptable test coverage?
XVI.iv.2. Contribution of Agile Models to the Proposed Development Process
An aspect to consider when developing systems is that, rather than depending on a contract or a statement of work, the customer could work closely with the development team, providing frequent feedback. In our specific context, customers are represented by systems engineers, who ask for a subsystem to be developed. Interaction between systems and software engineers may represent the key to project success. Moreover, direct communication among team members should be frequent enough to share knowledge on the full software system and to report encountered issues.
One more remarkable point is that effectively responding to changes requires making sure that planning is flexible and ready to adapt to those changes. Furthermore, when a change arrives, the development team should come up with a reliable impact analysis, which can be provided only in the presence of clear traceability from stakeholder requirements to code artifacts, passing through all design and intermediate artifacts. Not only change management but, more in general, well-structured configuration management leads to a traceable, manageable and predictable evolution of the project. The definition of a well-structured configuration management process is scheduled for the next iterations of the enhancement process.
XVI.v. Requirements Engineering Discipline Proposal
So far we have described the process activities for our tailored Harmony Process and provided a detailed description of the metrics to target our goal: project time control. However, that goal is not the only aspect to account for; as an example, in this paragraph we provide additional metrics for requirements-related activities (i.e. the Requirements Engineering Discipline).
Quality Attribute | Definition | Metric | Notes
Unambiguous | An SRS is U iff every r in R has exactly one interpretation. | U = n_u / n_r = percentage of requirements with a unique interpretation given by reviewers during a review. | n_u = number of requirements with a unique interpretation given by reviewers during a review. Ambiguity is a function of each reader's background.
Complete | An SRS is C iff every known requirement is documented. | C = n_A / (n_A + n_B + n_C + n_D) = percentage of requirements well documented in the SRS. | n_A = number of known and captured requirements. n_B = number of requirements poorly specified, abstractly stated, or not yet validated. n_C = number of known and not yet specified requirements. n_D = number of requirements we do not understand well enough to specify.
Correct | An SRS is Ct iff every requirement represents something which fulfils stakeholder needs. | Ct = n_C / n_r = n_C / (n_C + n_I) = percentage of correct requirements; alternatively, Ct = n_C / n_r = n_C / (n_C + n_NV) = percentage of validated requirements. | n_C = number of correct requirements. n_I = number of incorrect requirements; n_I needs to be estimated. n_NV = number of not validated requirements. A requirement is validated by the customer.
Understandable | An SRS is Un iff all classes of SRS readers can easily comprehend the meaning of all requirements with a minimum of explanation. | Un = n_UR / n_r = percentage of understood requirements. | n_UR = number of understood requirements. A requirement is understood if a reader reads and correctly comments on it to the SRS authors. Readers include customers, users, project managers, software developers, and testers.
Consistent | An SRS is Co iff no requirement stated therein conflicts with any project documentation. | Co = n_C / n_r = n_C / (n_C + n_I) = percentage of requirements that are consistent with all other documents. | n_I = number of requirements which are inconsistent with any document, e.g. interface requirements of other systems, the system requirements specification, and the software development plan. n_C = number of consistent requirements.
Executable / Interpretable / Prototypable | An SRS is EIP iff there exists a software tool capable of inputting the SRS and providing a dynamic behavioral model. | EIP = n_EIP / n_r = percentage of executable, interpretable or prototypable requirements. | n_EIP = number of executable, interpretable or prototypable requirements.
Precise | An SRS is P iff (a) numeric quantities are used whenever possible, and (b) the appropriate levels of precision are used for all numeric quantities. | P = n_p / n_r = percentage of precise requirements. | n_p = number of precise requirements.
(In all metrics, n_r denotes the total number of requirements in the SRS.)

Additionally, the following metrics shall be collected for project progress monitoring:
- Missing boundaries: based on the BCE architectural pattern, every actor communicates with the system through a boundary class. The number of missing boundary classes provides a rough estimate of the user interface requirements that have not yet been defined in the model.
- Concrete use cases not defined: number of concrete use cases with no diagrams as description.
- Hidden model elements: number of model elements not appearing in any diagram.
- Number of requirements not traced to use cases.
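The ratio metrics above are simple enough to automate once review counts are collected. Below is a minimal illustrative sketch in Python; the function and variable names are our own, not part of the methodology, and it covers just two of the attributes (unambiguity and completeness) as defined in the table.

```python
# Hypothetical sketch: computing two of the SRS quality metrics defined above.
# Counts would come from a requirements review session.

def unambiguity(n_unique: int, n_total: int) -> float:
    """U = n_u / n_r: share of requirements given a single interpretation
    by all reviewers during a review."""
    return n_unique / n_total

def completeness(n_a: int, n_b: int, n_c: int, n_d: int) -> float:
    """C = n_A / (n_A + n_B + n_C + n_D): share of well-documented
    requirements among all known ones."""
    return n_a / (n_a + n_b + n_c + n_d)

if __name__ == "__main__":
    print(round(unambiguity(45, 50), 2))        # 0.9
    print(round(completeness(40, 5, 3, 2), 2))  # 0.8
```

The remaining attributes follow the same pattern: a count of requirements satisfying the property divided by the relevant total.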
XVI.vi. Analysis and Packaging of Results
The methodology is made available in the form of text descriptions and pictures, while the requirements metrics are in the form of a table. The remaining work products are made available in the form of both text descriptions and IBM Rational Team Concert files, produced by using IBM Rational Method Composer, i.e. multimedia browsable HTML files. Specifically, such work products are:
1. Software Development Process Model.
a. Flow diagram representing the sequence of activities.
b. Description of activities.
c. Description of roles.
d. Mapping between roles and activities.
e. Identification of artifacts and support tools.
f. Mapping between activities, artifacts and tools.
2. Requirements Engineering Discipline.
3. Traceability Discipline.
4. Verification and Validation Milestones.
Snapshots from the HTML documentation are provided in the following.
Figure 36 Work Breakdown Structure and additional information about the process

Figure 37 Workflow of activities of a given phase

Figure 38 Activity details

Figure 39 Roles belonging to the defined role set

Figure 40 Details and description of a role

Figure 41 Details of an artifact
Chapter XVII. Testing Activities: A Complete Example of Activity Characterization
So, we now have an overview of the process and some documentation. However, there is something beyond that description which cannot actually be modeled easily and completely in the current version of the IBM Rational product we mentioned: the attributes of the impact vectors, which represent the performance of the development activities. The definition of such attributes is not immediate and effortless; it requires some work that we want to make explicit, at least for some of those activities. In the rest of this chapter, we report on the careful literature review we performed for testing, including how we carried out the survey and some of its findings. By the end, we will have a remarkable set of testing attributes and empirical data that allow making decisions on testing.
Over the past decades, hundreds of empirical studies on testing have been performed. Most studies compare a newly proposed technique against some others (e.g. against random testing or against similar techniques), while some investigate and assess existing techniques. The way experimental comparisons are carried out, not only for testing, has recently been an object of study itself, and recommendations have been provided [15, 16, 17, 18, 19, 20]. Therefore, we decided to carry out an extensive survey of the literature over the last decade in order to identify factors of testing that have a detectable impact on testing performance (in the most general interpretation of this term). The goal of our survey is to define impact vectors, i.e. a homogeneous representation of testing techniques that can enable their comparison and, most importantly, their selection and combination in a practical context.
During our survey, we analyzed more than 2000 papers published between 2004 and 2013 and identified over 200 studies reporting results from the empirical investigation of dozens of testing techniques. In order to produce knowledge useful for the Impact Vector Space construction, we created a tree composed of 192 nodes that characterize a number of points of interest. Examples of nodes high in the tree hierarchy include: resources (e.g. cost and people), performance (e.g. effectiveness and machine time), technology (e.g. test case generation approach and level of automation of test case generation), and domain (e.g. real-time systems and database). The leaves of this tree represent the dimensions of the impact vectors, while the non-leaf nodes support organizing the dimensions in orthogonal categories (see Figure 42 and Figure 43); each leaf has its own scale (ratio, ordinal or nominal, depending on the nature of the attribute). Some of those characteristics can be assigned a priori, as they serve as a description of the technique (e.g. is the technique random? Is it classifiable as model-based? Is it coverage-based?), but other characteristics must be derived from existing empirical studies.
Each testing technique is represented as a single impact vector, whose dimension values are assigned based on the data coming from the empirical studies performed on those techniques. Obviously, no study explores all the dozens of factors we identified (nor could it reasonably do so). However, most studies lock a remarkable number of those factors; e.g. the domain is pretty much fixed in every study and can be easily inferred, once an adequate (nominal) scale is defined for that attribute. Some other factors will remain unfilled for a technique, because nobody has studied them yet; this will be a gap in the repository of impact vectors concerning that test technique, which future studies will be able to fill.
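As a concrete illustration of this representation, the sketch below models an impact vector as a mapping from tree leaves to values on their own scales. The dimension names are invented for the example and do not come from the actual 192-node tree; the `gaps` helper shows how unfilled dimensions can be detected.

```python
# Illustrative sketch (our own naming) of an impact vector for one technique.
from dataclasses import dataclass, field

@dataclass
class ImpactVector:
    technique: str
    dimensions: dict = field(default_factory=dict)  # leaf name -> value on its own scale

    def set_dim(self, name, value):
        self.dimensions[name] = value

    def gaps(self, leaf_names):
        """Leaves that no study has filled yet for this technique."""
        return [d for d in leaf_names if d not in self.dimensions]

iv = ImpactVector("state chart coverage")
iv.set_dim("technology/generation-approach", "model-based")  # assignable a priori
iv.set_dim("domain", "real-time systems")                    # inferred from a study
print(iv.gaps(["domain", "performance/machine-time"]))       # ['performance/machine-time']
```

A repository of such vectors, one per technique, is then what the rest of this chapter populates from the literature.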
Figure 42 Excerpt of tree: "Technology" sub-tree

Figure 43 Excerpt of tree: "Performance" sub-tree
One of our findings, however, is that it is not useful to compare absolute values across different studies; e.g. if a technique T1 detects 50% of defects, while another technique T2, in a different study, detects 90%, it does not necessarily mean that T2 is better than T1, e.g. because they were experimented on artifacts of remarkably different quality. Consequently, our approach was to consider absolute numbers only within the same study; to clarify, if T1 and T2 are techniques investigated within the same study and the percentages of removed defects are 50% and 90%, then T2 results better than T1 and we store this information in a dimension of the impact vector (the scale for such a dimension is ordinal and composed of three values: T1 better than T2, T2 better than T1, and T1 equal to T2). The relative goodness of a technique with respect to another is the information stored in the impact vector. As a consequence, the number of dimensions is very high: one dimension exists for each possible non-ordered pair of testing techniques that were experimentally investigated.
Obviously, more than one study can compare T1 and T2, but we want to record a single score for each pair. In order to do that, we identified a number of characteristics [17, 18], an analytical formula to come up with a single-study score, and one more analytical formula that aggregates single-study scores to obtain a global score representing the value of the dimension corresponding to the pair of testing techniques under investigation.
In the following tables, we report the list of characteristics of the empirical studies that we account for in the definition of the single-study score, along with characteristic definitions and additional information. Notice that such characteristics do not match the attributes of the impact vectors (i.e., the dimensions of the impact vector space): they are only the means through which we assign values to the impact vector dimensions (i.e. one per pair of testing techniques).
Table 2 Characteristics of the empirical studies used to assign a single-study score

Characteristic | Definition
Reference ID | Identification of the experimental study material and results
Technique T1 | A technique value from the mapping table below
Technique T2 | A technique value from the mapping table below
Slight variation of T1 | If (T1=T2) then (slight variation 1 or slight variation 2) = (yes or tool)
Slight variation of T2 | (as above)
Family of T1 | A family value from the mapping table below
Family of T2 | A family value from the mapping table below
Effort/Cost of resources | Result of the comparison (i.e. T1>T2, T1<T2 or T1=T2)
Machine Execution Time | Result of the comparison (i.e. T1>T2, T1<T2 or T1=T2)
Defect Detection Capability | Result of the comparison (i.e. T1>T2, T1<T2 or T1=T2)
Complementary defect types | If >50% of the found defects are different, then high. If 20-50%, then partial. Else no/low. If in doubt between two ranges, be conservative (lower range). If there is no characterization of defects, then unknown.
Study maturity | Laboratory study, formal analysis, laboratory replication or field study (for further details, see Juristo et al., "Reviewing 25 years of testing experiments")
Representativity of objects | Low if toy, high if industrial. If a mix of objects, then high if >60% industrial, else low. If there is no characterization of objects, then unknown. If the objects under study belong to a unique category, then the Notes field has to be filled.
Representativity of subjects | Low if students are not characterized as highly skilled, high if professionals or selected skilled students. If a mix of subjects, then high if >60% professionals, else low. If there is no characterization of subjects, then unknown.
Representativity of faults | Low if faults are seeded, high if real faults. If a mix of faults, then high if >60% real, else low. If there is no characterization of faults, then unknown. If the faults addressed are of a particular type, then the Notes field has to be filled.
Type of metric for effort/cost | Direct if: person-hours or similar, $ or similar. Else indirect (e.g. calendar time)
Type of metric for machine execution time | Direct if: seconds on a given machine, number of instructions, number of operations. Else indirect (e.g. number of test cases)
Type of metric for Defect Detection Capability | Direct if: found/missed defects. Else indirect (e.g. coverage, killed mutants)
Notes | Optional (except when the representativity of objects or faults is different from high, low or unknown)
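The ordinal rule in the "complementary defect types" row lends itself to a small helper. The following sketch (our own naming, not part of the thesis tooling) encodes it; `pct_different` is the share of found defects detected by only one of the two techniques, and `None` stands for a study with no defect characterization.

```python
# Sketch of the "complementary defect types" rule from Table 2.

def complementarity(pct_different):
    """Map the share of non-overlapping defects (%) to the ordinal scale;
    None means the study did not characterize defects."""
    if pct_different is None:
        return "unknown"
    if pct_different > 50:
        return "high"
    if pct_different >= 20:  # boundary cases fall conservatively in the lower range
        return "partial"
    return "no/low"

print(complementarity(70))    # high
print(complementarity(50))    # partial (conservative: lower range)
print(complementarity(None))  # unknown
```

Encoding the rule this way makes the conservative treatment of boundary values explicit and repeatable across reviewers.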
Table 3 Mapping table technique-family

Family | Technique
randomly generated test cases | pure random; manual random; feedback directed
functional (black-box) testing | boundary value analysis
control-flow (white-box) testing | code (statement) coverage; state chart coverage; decision (branch) coverage; condition coverage
data-flow testing | all-c-uses; all-p-uses; all-uses; all-edges; all-dus
mutation testing | strong (standard) mutation; 10% mutation (constrained); abs/ror mutation (constrained); st-weak/1; ex-weak/1; bb-weak/1; bb-weak/n
regression testing | random; retest-all (traditional); DejaVu (safe); TestTube (safe); modification-based / textual differencing; dataflow/coverage-based
improvement testing | minimization; selection; augmentation; prioritization
model-based testing | model-based testing
concolic testing | concolic testing
exploratory testing | exploratory testing
search-based/genetic algorithm testing | search-based/genetic algorithm testing
Given two techniques T1 and T2 investigated in a study, the Single Study Score (SSS) is computed with the following formula of our definition: SSS(T1, T2) = M × R × SO, where M is the study maturity, R is the study relevance, and SO is the study output.
M is 0.5 for laboratory studies, 0.6 when the laboratory study includes a formal (statistical) analysis, 0.75 for laboratory replications, and 1 for field studies. See the aforementioned "Reviewing 25 years of testing experiments" by Juristo et al. for further details.
The value of R is 1.5 when the study is of high relevance, 1 for medium relevance, and 0.5 for low relevance. The classification into low, medium and high relevance is made according to the following rules, based on the experiment characteristics reported in Table 4: relevance is high if all applicable characteristics are high/real/direct; relevance is medium if the performance measures are direct and at least 60% of the applicable characteristics are high/real; relevance is low in all other cases.
Table 4 Experiment characteristics

Accounted characteristics | Sample size | Subjects' average experience | Artifact type | Faults | Performance measures
Scale | low (<10 subjects) or high (>=10 subjects) | low (<3 years) or high (>=3 years) | artificial or real | artificial or real | proxy or direct
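Under one reading of the rules above (we assume the 60% threshold is computed over the characteristics other than the performance measures, which is not stated explicitly), the relevance weight R can be derived mechanically. Names below are our own:

```python
# Sketch of the study-relevance rules; "favorable" means high/real/direct.
FAVORABLE = {"high", "real", "direct"}

def relevance_weight(ratings):
    """ratings: characteristic name -> rating; inapplicable ones omitted.
    Returns R = 1.5 (high), 1.0 (medium) or 0.5 (low relevance)."""
    if ratings and all(v in FAVORABLE for v in ratings.values()):
        return 1.5
    others = [v for k, v in ratings.items() if k != "performance measures"]
    share = sum(v in FAVORABLE for v in others) / len(others) if others else 0.0
    if ratings.get("performance measures") == "direct" and share >= 0.6:
        return 1.0
    return 0.5

study = {"sample size": "high", "subjects' experience": "low",
         "artifact type": "real", "faults": "real",
         "performance measures": "direct"}
print(relevance_weight(study))  # 1.0: direct measures, 3 of 4 others favorable
```

The helper makes the classification reproducible when many studies must be scored consistently.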
Finally, SO is 1 when T1 performs better than T2, -1 when T2 performs better than T1, and 0 when they perform the same. On this value, we need to add some thoughts: what if T1 performs better, e.g. in terms of effectiveness, but T2 performs better in terms of efficiency? The answer is immediate: for each performance attribute, we produce a different score. Therefore, a score refers only to efficiency or only to effectiveness. Consequently, we have as many dimensions in the impact vector space as the number of performance characteristics we want to consider, multiplied by the number of possible pairs of testing techniques.
Based on this formulation, for each study we can obtain one SSS per testing technique performance characteristic. However, as previously stated, for each pair of testing techniques and for each performance characteristic we want a single global score. We obtain such a Global Score (GS) through the following formula: GS(T1, T2) = (Σ SSS(T1, T2)) / #studies, i.e. the average value of the SSSs.
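The two formulas combine into a few lines of code. The sketch below uses the maturity constants listed above; function names are ours, and `outcome` is the SO value (+1, -1 or 0):

```python
# Sketch of the Single Study Score and Global Score computation.
MATURITY = {
    "laboratory study": 0.5,
    "formal analysis": 0.6,
    "laboratory replication": 0.75,
    "field study": 1.0,
}

def single_study_score(maturity, relevance, outcome):
    # SSS(T1, T2) = M * R * SO
    return MATURITY[maturity] * relevance * outcome

def global_score(scores):
    # GS(T1, T2) = average of the single-study scores for the pair
    return sum(scores) / len(scores)

s1 = single_study_score("field study", 1.5, 1)        # 1.5
s2 = single_study_score("laboratory study", 0.5, -1)  # -0.25
print(global_score([s1, s2]))                          # 0.625
```

Note that one such score is kept per performance attribute, so in practice these functions would run once per attribute and per technique pair.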
As an example, we report the impact vector generated by accounting only for the results of the study in [21] on state chart coverage vs code coverage. In that study, the output is that the two techniques perform the same in terms of percentage of removed defects, so the value for that dimension (relative goodness in terms of effectiveness) is "T1 equals T2". Machine time is not accounted for, so the value of the corresponding dimension is "Unknown". However, a finer level of detail is considered concerning the effectiveness: the authors of [21] investigate the type of defects detected by each technique. As reported in Table 2, we have a specific set of dimensions taking that into account. Also for this dimension there is an issue: different studies may use different defect taxonomies. However, something more objective is the complementarity of the techniques: if a study detects that two techniques find different defects, then this is the information we want to store in our impact vector. In particular, in [21], the authors find that combining state chart coverage and code coverage improves the effectiveness of the verification process, as they detect different defects; consequently, combining them leads to a higher percentage of detected defects. This information is stored in the "complementary" dimension of the impact vector (its scale is "high", "low/no" and "unknown"; objective criteria to select the value are also defined).
One can easily visualize the results of this review in the form of a square matrix, where rows and columns are the techniques, while the cells contain the global scores for each pair (rows vs columns). The matrix is symmetric and the cells on the diagonal are meaningless. An additional visualization help is the color: since the complementarity of defects is very relevant, we color the cells yellow when the detected defects are of a different nature, blue when they are of the same nature, and white when no information on the defect nature is available. An excerpt of the matrix we generated is reported in Figure 44.
One more aspect to consider is that if technique T1 performs better (for some performance attribute) than T2, and T2 performs better than T3, then we can also say that T1 performs better than T3. In this case, we do not have any direct empirical studies for the pair <T1, T3>, but we can still insert something in the matrix, even though no scores can be computed; therefore, we insert a plus or minus sign when we can say, by transitivity, that the technique on the row is better or worse, respectively, than the one on the column.
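A single-step version of this transitive inference can be sketched as follows (a full transitive closure would iterate until no new pairs appear; names are ours):

```python
# Sketch: derive "+" marks in the matrix by one step of transitivity.

def transitive_marks(better_than):
    """better_than: set of (a, b) pairs with a empirically better than b.
    Returns derived pairs not already backed by a direct study."""
    derived = set()
    for a, b in better_than:
        for c, d in better_than:
            if b == c and (a, d) not in better_than and a != d:
                derived.add((a, d))
    return derived

direct = {("T1", "T2"), ("T2", "T3")}
print(transitive_marks(direct))  # {('T1', 'T3')}
```

Each derived pair would be rendered as a plus sign on the row of the better technique and a minus sign on the symmetric cell.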
So far, we have identified around 40 impact vectors, with a total of around 400 pair-wise comparisons across all the testing techniques we analyzed and listed in Table 3. For each impact vector (technique), we gave a value to each dimension that was accounted for in the study or that could be objectively inferred. Slight variations of a technique are still accounted for as a single technique, and such singularity is also stored in an ad hoc dimension of the impact vector.
Summarizing, in order to build an empirically-founded baseline of the performance of testing techniques, we had to go through a long examination of the literature. Without the use of the Impact Vector Model, the knowledge acquired during such an examination would have been written in a report or, in the worst case, kept implicit by the people performing the review. Via Impact Vectors, not only is the knowledge made explicit, but it is also well-structured, machine-executable, maintainable, and reproducible.
Figure 44 Excerpt of the testing technique comparisons
One could argue that this procedure is very expensive, in particular when replicated for all sub-processes of the development process. E.g., what about design techniques? Do we need to perform a similar investigation? Actually, that is true: such analyses can turn out to be very expensive. On the other side, we believe that the testing example we reported on is one of the most expensive, because the software literature is full of researchers and practitioners proposing their techniques and reporting their experiments in deeply different ways. For other parts of the development, e.g. for software inspections or safety cases, the literature can be much smaller and, consequently, cheaper to explore. Additionally, in general, it is not mandatory to perform such studies for all possible activities: the construction of impact vectors can be incremental, and new dimensions can progressively be added to the space whenever they become relevant. Of course, the sooner the dimensions are identified, the earlier the corresponding data can be collected and used for process improvement. Finally, the analysis of the literature to build baseline Impact Vectors is also optional: having such a baseline enables leveraging a measurement-based approach to process enhancement from the beginning, because internal teams can leverage external teams' experience. However, if such a baseline does not exist, people can still proceed as they usually do in most contexts, where data are not being used for process improvement or are not even collected. Over time, data collected within the company will give birth to an internal repository of Impact Vectors. Such a repository would probably have a higher degree of internal consistency and would produce recommendations and results that are very reliable for internal teams; on the other side, it would probably be much smaller and have many more gaps with respect to a repository born from the aggregation of data coming from the literature or, even better, from the collaboration of many companies sharing their data in a centralized repository (see Section VI, Software Engineering 2.0).
Chapter XVIII. Sequencing Verification Techniques: an Experiment
One of Squa's beliefs is that changing the order of application of verification techniques leads to drastically different results. However, accounting for all reasonable sequences of verification activities may turn out to be very expensive. Therefore, before starting to collect data, Squa decides to check whether the order of verification techniques has some impact on the performance of the verification. Since the software literature does not provide much information, he requires some form of experimentation.
In this chapter, we report on an experiment that investigates the performance of verification processes composed of the same verification techniques executed in different orders. Specifically, we report on two controlled experiments that Impact-One commissioned to us; the two experiments focus on sequences of two verification techniques: perspective-based inspection and unit testing. Results suggest that the performance of a verification process, in terms of recall and defect detection rate, strongly depends on the order of application of the techniques; therefore, interactions and synergies among verification techniques produce quality- and cost-relevant effects on the overall verification process that a decision maker should take into account.
XVIII.i.1. Introduction
Organizations aiming to develop highly reliable software systems, e.g. Defense or Space companies and agencies, clearly rely on a highly effective verification process and put a lot of effort into its definition. Also for organizations in other domains, where reliability is desirable, an efficient and well-working verification process is recommended in order to maintain an agile and flexible process while assuring adequate quality. Defining such a verification process includes decisions on the verification techniques to apply, on their order of execution and on the effort to allocate.
Several verification techniques have been extensively studied in the literature and adopted in industry. Moreover, some combinations of more than one verification technique have been defined, investigated and applied, as described in more detail in the next Section. In this chapter, we want to study sequences of verification techniques, as opposed to combinations [22], the latter being a term that implies independence from the order of application [23]. Our hypothesis is that the order of execution of the verification techniques has a significant impact on the overall verification process performance. Consequently, decisions concerning the verification process should be made while taking into account synergies and mutual interactions among verification techniques, so as to maximize the benefits that each technique can get from the techniques applied prior to it.
In the following paragraphs, we will report on our experimental investigation of two research questions. The first one is RQ.1: does the order of application of verification techniques impact the overall verification process performance?
Furthermore, we know from our experience and from Impact-One's qualitative analyses that the definition of the verification process is commonly a continuous and iterative task that requires decision makers to define the next techniques to apply during development; therefore, the verification process is not defined entirely and solely upfront. Such quasi-real-time decisions are made taking into account at least the following aspects: (i) the already achieved quality, (ii) the expected quality, and (iii) the available resources. Thus, it is useful to investigate and address the following further research question: (RQ.2) does the performance of a verification technique depend on the previously applied techniques? RQ.2 is fundamentally different from RQ.1. In fact, while RQ.1 addresses a planning problem (i.e. the definition of the most performing verification process for the current project), RQ.2 addresses a control problem: given the current state of the project, which verification technique(s) should be applied next in order to achieve the expected quality goals?
In order to answer the two previous research questions in a concrete scenario, we report on two controlled experiments taking into account two verification techniques: Unit Testing [22] and Perspective-Based Inspection [24, 25]. Because of the limited number of available experiment subjects, we narrowed our study to one type of artifact, i.e. Java code, and to the aforementioned verification techniques.
"s we will e e*plaining in the ne*t paragraphs# results show the importance of accounting for
the order of application of verification techniGues# as different orders lead to statistically
different results! &herefore# one of our conclusions is that a strategic approach in defining the
verification process is highly recommended# e!g! the one we proposed in X9;# 9<Y# ecause it
would focus the prolem of ordering the techniGues rationally# and would allow leveraging the
mutual influences etween techniGues! $n particular# we will show how a strategic vision in
defining the verification process can support decision makers addressing diverse practical
prolems efficiently# e!g!: given a predefined udget# which process ma*imiEes the
performance within that udgetL ,iven a performance goal# which verification techniGue
should e applied to minimiEe the effort reGuired to achieve the goalL
XVIII.i.4. Related Work
A wide literature remarks the importance of a well-defined verification process [28, 29, 30, 31, 32, 33, 34], and many empirical studies comparing verification techniques to evaluate their performance date back many years [33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46]. The results reported by these papers are sometimes contradictory and report strongly different performances for the same techniques; such differences may derive from a myriad of causes, and experimenters struggle to control the factors that can potentially disturb their specific investigations. As we will report, one such factor can be the verification techniques executed on the experiment objects prior to the investigation.
Surveying the literature also revealed some papers addressing the problem of combining verification techniques. In particular, some of them assert that the overall verification performance may benefit from the combination of different verification techniques [22, 45], and some others propose and experimentally investigate concrete combinations [47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64], including the combination of inspection and testing [47, 56, 58, 62, 63, 64]. However, such studies mainly focus on the capability of different techniques to detect different defects; consequently, their combination results in a higher number of detected defects for reasons intrinsic to the techniques being used, i.e., their capability of targeting particular defect types, so that their union gives rise to a bigger set of found defects.
Some further studies propose more elaborate combinations, consisting of a number of verification techniques to be applied in pre-planned sequences [44, 47, 59, 60]; the presence of such pre-planned sequences implies that the order of application can be of crucial importance, generally because the output of a technique is required as input to the next one; consequently, violating the order would make the process unfeasible.
Moreover, slightly different approaches attempt to adjust and direct testing by leveraging information gathered via inspections or other analysis techniques [47], which implies some form of ordering of the verification techniques; nevertheless, in these latter studies, the techniques applied first are not used for detecting defects, but mainly for test case selection and prioritization. Additionally, an iterative approach for the systematic selection of verification strategies was proposed in the past in [49], where the authors also sketch the possibility that the order of application of verification techniques might influence the cost-effectiveness of a verification strategy, but they do not report any specific empirical study.
Based on this literature survey, the idea of sequencing treatments seems to be quite widely felt in the field of Software Engineering, and its importance has already been recognized in scientific research in other fields as well [65]; examples of the so-called "sequencing treatments" exist not only in Medicine [66], but also in Psychology [67], Psychiatry [68], and the Human Sciences [69].
However, not much emphasis has been put on the experimentation and comparison of different and feasible sequences of verification techniques in Software Engineering, which is the goal of our investigation. In fact, in our literature survey, we could identify only one study doing so [64]. In that study, the authors compare two different sequences of code inspection and unit testing. In our study, we propose again the comparison between sequences of the same two techniques, but we also introduce some changes in the experiment settings. The first difference is the reading technique: usage scenarios in [64], perspective-based reading in the present study. The second difference concerns the time allocated to each task: unbalanced in [64] (2h30m for testing, 1h30m for inspection), balanced in our experiments (1h for each technique). A third and most important difference is that, in [64], the same team executes both techniques in a row, while, in our experiment, two different teams execute the two distinct techniques (one each, in sequence) and the first team hands off the produced list of defects to the second team. Further details motivating our design choices are provided in the next Sections, along with the results of our investigation.
XVIII.i.5. Experiment Description and Design
Experiment Overview
The purpose of this study is to evaluate the effect that the ordering of verification techniques has on the verification process performance, from the point of view of a decision maker planning for or controlling the verification process.
The experiment factor to investigate is a two-step verification process <T1, T2>, where the syntax <T1, T2> denotes a sequence of two verification techniques and means that technique T1 is to be applied before technique T2. Since we focus on Perspective-Based Inspection (PBI) and Unit Testing (UT) on code artifacts, we define the following treatments, each corresponding to a Process Performance Impact Vector (PPIV): <PBI, UT>, <UT, UT>, <UT, PBI>, <PBI, PBI>. The main focus of the experiment is on comparing <PBI, UT> against <UT, PBI>, which addresses RQ.1. However, in order to address RQ.2, we decided to include <UT, UT> and <PBI, PBI> as a way to assess the performance of the single techniques within a treatment.
In order to have comparable results between UT and PBI (i.e., to have a fair comparison), UT is intended to include defect identification, i.e., defect detection and subsequent defect localization in the code. This way, PBI and UT produce the same type of information [37], i.e., they detect the presence of a defect and its location in the code.
The maximum time allocated to each technique is one hour; the duration of the stages is balanced and, consequently, the treatment time is fixed to a maximum of 2 hours. Finally, we planned to execute the experiment twice, the second time as a replica of the first with different subjects, in order to reinforce the study validity.
Metrics
The measures of performance we consider are: Recall (R), defect Detection rate (D), and percentage of False positives (F).
R is defined as the ratio of found defects to the number of known defects in the artifact. It represents a way of measuring the effectiveness of the technique.
D is intended as the ratio of found defects to the time spent finding them. It adds cost considerations to the effectiveness, and we use it as a way of measuring the efficiency of the technique.
F is the ratio of the number of false positives to the number of report records (i.e., the one-complement of precision); a false positive is a detection which does not elicit a defect. This last metric is relevant because each false positive, as well as each defect, provokes some rework to be performed in the future; however, the rework caused by a false positive should be minimized, as it does not produce any beneficial effect. Whether a report record is a defect or a false positive is decided by a software engineer not otherwise involved in the experiment.
These three metrics are, of course, three dimensions of the impact vectors.
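To make these definitions concrete, the three metrics can be computed from a team's report as in the following minimal sketch; the function names and the sample numbers are ours, purely for illustration, and are not taken from the experiment data.

```python
# Minimal sketch: computing Recall (R), defect Detection rate (D), and
# percentage of False positives (F) from a verification report.
# All names and sample values are illustrative, not experiment data.

def recall(found_defects: int, known_defects: int) -> float:
    # R = found defects / known defects in the artifact (effectiveness)
    return found_defects / known_defects

def detection_rate(found_defects: int, hours_spent: float) -> float:
    # D = found defects / time spent finding them (efficiency)
    return found_defects / hours_spent

def false_positive_rate(false_positives: int, report_records: int) -> float:
    # F = false positives / report records (one-complement of precision)
    return false_positives / report_records

# Example: a team reports 8 records in 1 hour; 5 are real defects,
# 3 are false positives; the artifact contains 7 seeded defects.
R = recall(5, 7)               # ~0.714
D = detection_rate(5, 1.0)     # 5 defects per hour
F = false_positive_rate(3, 8)  # 0.375
```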
Context Selection and Defect Seeding
The subjects are 18 graduate students with experience with small- and medium-sized projects. They took classes in Software Engineering, Requirements Engineering, Object-Oriented Analysis, Design and Programming, and Software Architecture. The subjects also received specific training on the testing and inspection fundamentals and procedures to be used in the experiment. Nine subjects were involved in each experiment. The rationale for selecting such subjects is their small cost and their availability; in fact, the experiments ran in an academic context, during the class hours of the Experimental Software Engineering course, taught in the second year of the Master's degree in Computer Science and Software Engineering at the University of Rome Tor Vergata.
The experiment artifacts were portions of an academic project developed during a previous course in the first year of the same Master's degree. The application is web-based, is composed of around 5500 lines of Java code, and manages university courses, student enrolments, exams, exam sessions and several user profiles. None of the experiment subjects had worked on that project before. Different units of this project were used during each session of
the experiment. The objects were selected in order to achieve the maximum homogeneity in terms of complexity and size.
Considering the structure, size and complexity of the code, our decision was to seed seven defects per artifact. The average size of the code artifacts is 607 ELOC, thus the average defect density is 1.15% [70, 71, 72, 73], which should characterize the software quality of the experiment artifacts as poor [74]. Since we wanted to reproduce the first verification task in the development process (i.e., the first verification technique applied right after writing the code), this density seems to us both realistic and adequate for not exposing the subjects to too high a number of defects, which could cause the subjects to get lost in non-working software.
The defect seeding procedure consisted in introducing (i) initialization defects and (ii) mathematical and logical mutants [75].
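As a quick sanity check of the density arithmetic above (seven seeded defects over the average artifact size), the figure can be recomputed directly:

```python
# Sanity check of the defect density reported above:
# 7 seeded defects per artifact, average artifact size of 607 ELOC.
seeded_defects = 7
avg_size_eloc = 607

density_pct = seeded_defects / avg_size_eloc * 100
print(round(density_pct, 2))  # 1.15 (percent), matching the text
```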
Experiment Design
The four treatments (i.e., <UT, PBI>, <PBI, UT>, <PBI, PBI>, and <UT, UT>) were administered by providing subjects with two techniques in two sequential stages, each lasting up to one hour. Teams were allowed to complete earlier than one hour, in case all team members agreed upon considering the task completed. This way, we wanted to measure the efficiency of each technique based on the actual time spent to accomplish the task.
The identification of the detected defects was requested, but not their correction; consequently, the output of each treatment provided the same amount of information, that is: (i) the list of defects detected with respect to the given requirements specifications, (ii) the duration of the executed technique, and (iii), for each defect, the line(s) of code where the defect was located.
For each subject, a use case description and a sequence diagram were provided as the specification of the functionality to examine, in addition to the related Java code. When performing UT, subjects were instructed to create a set of test cases and to write some lines of code (i.e., test drivers) for testing the functionality. Also, they were expected, and enforced by the observers, not to read the code before writing all test cases.
When performing PBI, instead, subjects were expected to read the code, focusing on the assigned perspective; a non-compilable version of the code was provided, so as to prevent them from executing the unit. Based on the artifacts to be inspected, the following two perspectives were defined and assigned: (A) Integration perspective, which takes into consideration: parameter passing, validity of input parameters, compatibility of interfaces, and compatibility and use of output parameters; (B) Unit perspective, which takes into consideration: logical correctness of functions (i.e., in object-oriented terms, classes and bodies of their methods), correct elaboration of input parameters, and correct generation of outputs.

Figure 45. Experiment design sketch

Figure 46. Experiment design sketch
Because of the limited number of subjects, we had to limit the number of perspectives to two. The assignment of the two perspectives is random within the team, with the constraint that each of the two perspectives is implemented the same number of times by each team member during the experiment. This way, we plan to mitigate the influence of each subject's skills on the experiment outcome.
Subjects performing PBI worked in a team composed of two people, one perspective per member, while teams performing UT were composed of one subject. Teams were not allowed to directly interact and communicate at any time.
When the treatment is <PBI, PBI> or <UT, UT>, the same team works on the same artifact for both stages. When the treatment is <PBI, UT> or <UT, PBI>, the team assigned to the first stage is different from the team assigned to the second stage. The latter receives the list of defects produced by the first team; the list is anonymous, so there is no way for the second team to find out (and possibly interact with) the first team. Figure 45 and Figure 46 provide a graphical explanation of the experiment design. In the upper part of Figure 46, the same team executes UT twice, whilst, in the lower part, two different teams execute the two different techniques.
The motivation for such a design is that, in a real context, in particular for systems where verification is required to be very accurate (e.g., Space and Defense systems), it is frequent and reasonable to expect different teams to perform code inspections and unit tests. In our study, we wanted to reproduce such a context.
A final remark on the design is that no specific guidelines were provided on how each team was expected to use the list of defects received from the previous team; therefore, the team executing the second technique of the process was free to evaluate each record of the received list and establish whether it was a defect or not. If it was, the team reported that record in its own report; otherwise, the record was discarded and was not part of the verification process output.
Subjects used their own laptops, on which they had previously installed Eclipse Helios [76] and Java EE 6 [77], with which all of them were familiar.
Mitigation of the Impact of Subjects' Skills on the Experiment
Before each experiment, we ran a pilot study to assess the capability of the subjects at identifying defects with PBI and UT. Based on the results of these pilot studies, we created three groups of three subjects for each experiment; the rationale for assembling the groups was to minimize the variance among group capabilities, so as to minimize the impact of the subjects' individual capability on the experiment results.
The two subjects constituting each PBI team are picked from the same group, while the third subject of the group performs UT on the same artifact; in the next step, the subject who performed UT performs PBI, together with one of the two subjects previously involved in PBI, on a different artifact. Then, the subject who performed PBI twice performs UT, while the remaining two subjects perform PBI on fresh artifacts. This way, after three rounds, there will be three artifacts, as exemplified by Figure 46, on which the group (as a whole) has performed both PBI and UT (with different subjects) and, thus, has produced the list of detected defects. Such artifacts and the corresponding identified defects are then provided as input for the teams that perform the second techniques of the treatments. In particular, as mentioned above and represented in Figure 46, if the randomly selected artifact for the treatment <PBI, PBI> is A1, then subjects 1 and 2 will perform one more hour of PBI on A1; if A2 is the artifact selected for <UT, PBI>, then it will be delivered to a subject not belonging to the same group as subject 2, together with the list of defects produced by the subjects.
Design Validity
To be as close as possible to a real environment, during the experiment design we paid attention not to have the same team work on the same artifacts with both techniques (PBI and UT). In general, in fact, UT may not be performed (and, for our reference customers, usually is not) by the same team performing PBI, and vice versa. Letting the same team perform both UT and PBI on the same object could have biased the experiment results: in such a case, in fact, the performance of the second technique of the treatment could benefit from the knowledge gained during the execution of the first technique.
Participation in the experiment was optional and the subjects were not evaluated based on the experiment results. To make the effect of maturation homogeneous over all treatments, the order in which treatments were executed was balanced across time (i.e., each treatment had to be applied at least once before any treatment was applied for the second time). Therefore, randomization was used, but subject to this constraint.
Generalization of the results to an industrial context is limited due to the fact that (a) graduate students rather than professionals participated in the study, and (b) the experiment objects were different portions of the same software project.
Since the goal of this study is not to identify the best verification process, but rather to provide evidence on the importance of a correct and well-reasoned sequencing of verification techniques in each context, we consider such validity issues acceptable.
XVIII.i.6. Result Analysis
The first question to answer is RQ.1: does the order of application of verification techniques impact the verification strategy performance? In order to answer it, we need to compare the performance of the two treatments <PBI, UT> and <UT, PBI>. First, we run normality tests: the p-values of the Shapiro-Wilk test are reported in Table I for both experiments; the tests show that most samples are normally distributed, except in 3 cases. The chosen alpha is 0.025.
TABLE I. TESTS OF NORMALITY (SHAPIRO-WILK TEST)
(p-values of the Shapiro-Wilk tests)

  Experiment  Treatment   Recall (R)  Defect Detection Rate (D)  Percentage of False Positives (F)
  1           <PBI, UT>   0.756       0.623                      0.065
  1           <UT, PBI>   0.458       0.721                      0.009
  2           <PBI, UT>   0.018       0.734                      0.088
  2           <UT, PBI>   0.726       0.968                      <0.001

The samples are independent by experiment design (different artifacts were used and teams were not allowed to directly interact); therefore, we can run the independent t-test (parametric) or the Mann-Whitney test (non-parametric), according to the normality of the samples (see Table I). The conservative Bonferroni correction is used for assessing significance, and the results of the comparisons are reported in Table II; they show that a statistically significant difference exists in terms of R and D, but not in terms of F.
TABLE II. COMPARISONS OF <PBI, UT> VS <UT, PBI>
(p-values of the comparisons: independent t-test or Mann-Whitney test)

  Experiment  Recall (R)  Defect Detection Rate (D)  Percentage of False Positives (F)
  1           0.007       0.006                      1.000
  2           <0.001      0.003                      0.297
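The test-selection logic just described (Shapiro-Wilk on each sample, then the independent t-test if both samples look normal and the Mann-Whitney test otherwise) can be sketched with SciPy; the two arrays below are hypothetical placeholders, not the experiment measurements.

```python
# Sketch of the test-selection logic: run Shapiro-Wilk on both samples;
# if both look normal, use the independent t-test, otherwise Mann-Whitney.
# The data below are hypothetical placeholders, not the experiment data.
from scipy import stats

ALPHA = 0.025  # significance level used in the study

def compare(sample_a, sample_b, alpha=ALPHA):
    normal_a = stats.shapiro(sample_a).pvalue > alpha
    normal_b = stats.shapiro(sample_b).pvalue > alpha
    if normal_a and normal_b:
        name, res = "t-test", stats.ttest_ind(sample_a, sample_b)
    else:
        name, res = "Mann-Whitney", stats.mannwhitneyu(
            sample_a, sample_b, alternative="two-sided")
    return name, res.pvalue

recall_pbi_ut = [0.71, 0.57, 0.71, 0.86, 0.57, 0.71, 0.57, 0.71, 0.71]
recall_ut_pbi = [0.43, 0.57, 0.43, 0.57, 0.43, 0.57, 0.57, 0.43, 0.57]
test_name, p = compare(recall_pbi_ut, recall_ut_pbi)
print(test_name, p < ALPHA)
```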

For completeness, we computed the power of the tests: in the first experiment, the means are 0.679 and 0.520 for R (<PBI, UT> and <UT, PBI>, respectively), and 2.107 and 3.236 for D; the common standard deviations are 0.111 and 0.618, alpha is 0.025 and each sample size is 9. Therefore, the test powers are 0.79 for R and 0.95 for D. In the second experiment, the means are 3.320 and 1.636 (<PBI, UT> and <UT, PBI>, respectively); the common standard deviation is 0.678, alpha is 0.025 and each sample size is 9. Therefore, the resulting test power is about 1 in both cases.
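The reported powers can be approximated from the means, common standard deviation and sample size given above, using the noncentral t distribution; the sketch below assumes a one-sided test at alpha = 0.025, since the text does not state the convention explicitly.

```python
# Power of a two-sample t-test via the noncentral t distribution.
# Means, common SD and n are the values reported above for R in the
# first experiment; the one-sided convention is our assumption.
from math import sqrt
from scipy.stats import t, nct

def ttest_power(mean1, mean2, common_sd, n, alpha=0.025):
    d = abs(mean1 - mean2) / common_sd   # standardized effect size
    ncp = d * sqrt(n / 2)                # noncentrality parameter (n per group)
    df = 2 * n - 2
    t_crit = t.ppf(1 - alpha, df)        # one-sided critical value
    return 1 - nct.cdf(t_crit, df, ncp)  # P(reject H0 | effect is real)

power_R = ttest_power(0.679, 0.520, 0.111, 9)
print(round(power_R, 2))  # close to the 0.79 reported above
```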
The second question to answer is RQ.2: does the performance of a verification technique depend on the previously applied techniques?
In order to answer, we need to verify whether any differences exist in the performance of PBI (or UT) when its history changes (i.e., different techniques are applied prior to it). If the performance does change, then we have evidence that the performance of PBI (or UT) can depend at least on the previously applied technique (i.e., PBI, UT or none).
We want to investigate the following comparisons, where the asterisks (*) indicate the techniques whose performance is being compared and the dot (.) indicates that the second technique is irrelevant for the comparison: <PBI*, .> vs <PBI, PBI*> vs <UT, PBI*>, and <UT*, .> vs <UT, UT*> vs <PBI, UT*>.
In order to compare the results, we need to define R and D for a single technique. R is computed as follows: if the technique is the first of the strategy, then R is defined as the ratio of found defects to the number of known defects. If the technique is the second of the strategy, then R is defined as the ratio of new defects found during the second stage to the difference between the known defects and the defects detected during the first stage. Accordingly, D is computed as follows: if the technique is the first of the strategy, then D is defined as the ratio of found defects to the hours of effort required by the technique. If the technique is the second of the strategy, then D is defined as the ratio of new defects detected during the second stage to the effort required by the technique.
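These stage-dependent definitions can be written down compactly; in the following sketch, the function names are ours and the sample numbers are illustrative.

```python
# Stage-specific Recall (R) and Detection rate (D) for a technique,
# following the definitions above. Function names are ours.

def stage_recall(new_found: int, known: int, found_in_stage1: int = 0,
                 is_second_stage: bool = False) -> float:
    if is_second_stage:
        # new defects found in stage 2 / defects left undetected by stage 1
        return new_found / (known - found_in_stage1)
    return new_found / known

def stage_detection_rate(new_found: int, effort_hours: float) -> float:
    # defects found in this stage / effort spent in this stage
    return new_found / effort_hours

# Example: 7 known defects; stage 1 finds 4 in 1 hour, stage 2 finds 2
# of the remaining 3 in 0.8 hours.
r1 = stage_recall(4, 7)                                            # ~0.571
r2 = stage_recall(2, 7, found_in_stage1=4, is_second_stage=True)   # ~0.667
d2 = stage_detection_rate(2, 0.8)                                  # 2.5/hour
```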
The first test to run is a check of the homoscedasticity of the samples, which would enable the use of One-Way ANOVA. In the first experiment, the significances of Levene's test for R, D and F are .064, .360 and .000, while in the second experiment they are .243, .309 and .004, i.e., only the variances of F are homogeneous. Therefore, ANOVA can be used for the comparison of the F samples, but not for R and D, where Kruskal-Wallis can be used. When a difference is detected by the ANOVA or Kruskal-Wallis test, additional investigation is useful: in order to understand which pairs of treatments differ in terms of R and D, Mann-Whitney tests can be performed. Similarly, a post-hoc analysis (Bonferroni test) can be used on the F samples when the ANOVA test detects a statistically significant difference. The p-values of such pairwise comparisons are reported in Table III, Table IV, Table V and Table VI; cells with a dash (-) indicate that no statistically significant difference has been detected by ANOVA or Kruskal-Wallis (therefore no pairwise comparison is needed); the symbols ">" and "<" indicate the direction of the pairwise inequality. Alpha is always 0.025.
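The pipeline just described (Levene's test to choose between One-Way ANOVA and Kruskal-Wallis, followed by pairwise Mann-Whitney comparisons when an overall difference is detected) can be sketched with SciPy; the three groups below are hypothetical placeholder data, not the experiment samples.

```python
# Sketch of the analysis pipeline: Levene's test decides between
# One-Way ANOVA (homogeneous variances) and Kruskal-Wallis; pairwise
# follow-up runs only when an overall difference is detected.
# The three groups below are hypothetical placeholders.
from itertools import combinations
from scipy import stats

ALPHA = 0.025

def overall_test(*groups, alpha=ALPHA):
    homoscedastic = stats.levene(*groups).pvalue > alpha
    if homoscedastic:
        name, res = "ANOVA", stats.f_oneway(*groups)
    else:
        name, res = "Kruskal-Wallis", stats.kruskal(*groups)
    return name, res.pvalue

g1 = [2.1, 2.4, 1.9, 2.2, 2.0]
g2 = [3.1, 3.4, 2.9, 3.3, 3.0]
g3 = [3.2, 3.5, 3.0, 3.1, 3.3]
name, p = overall_test(g1, g2, g3)
if p < ALPHA:  # pairwise follow-up (Mann-Whitney, as in the text)
    for (i, a), (j, b) in combinations(enumerate([g1, g2, g3], 1), 2):
        pw = stats.mannwhitneyu(a, b, alternative="two-sided").pvalue
        print(f"group {i} vs group {j}: p = {pw:.3f}")
```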
TABLE III. PERFORMANCE OF PBI: 1ST EXPERIMENT
(p-values of the comparisons and direction)

  Comparison                 Recall (R)  Defect Detection Rate (D)  Percentage of False Positives (F)
  <PBI*, .> vs <PBI, PBI*>   -           <0.001 (<)                 0.008
  <PBI*, .> vs <UT, PBI*>    -           <0.001 (<)                 0.008
  <PBI, PBI*> vs <UT, PBI*>  -           1.000                      0.008

TABLE IV. PERFORMANCE OF PBI: 2ND EXPERIMENT
(p-values of the comparisons and direction)

  Comparison                 Recall (R)  Defect Detection Rate (D)  Percentage of False Positives (F)
  <PBI*, .> vs <PBI, PBI*>   -           <0.001 (<)                 -
  <PBI*, .> vs <UT, PBI*>    -           0.007 (<)                  -
  <PBI, PBI*> vs <UT, PBI*>  -           0.109                      -

TABLE V. PERFORMANCE OF UT: 1ST EXPERIMENT
(p-values of the comparisons and direction)

  Comparison                 Recall (R)  Defect Detection Rate (D)  Percentage of False Positives (F)
  <UT*, .> vs <UT, UT*>      <0.001 (>)  <0.001 (<)                 -
  <UT*, .> vs <PBI, UT*>     0.001 (<)   <0.001 (<)                 -
  <UT, UT*> vs <PBI, UT*>    0.019       0.050                      -

TABLE VI. PERFORMANCE OF UT: 2ND EXPERIMENT
(p-values of the comparisons and direction)

  Comparison                 Recall (R)  Defect Detection Rate (D)  Percentage of False Positives (F)
  <UT*, .> vs <UT, UT*>      -           0.049                      -
  <UT*, .> vs <PBI, UT*>     -           <0.001 (<)                 -
  <UT, UT*> vs <PBI, UT*>    -           0.001 (<)                  -

Figure 47. Box plots for recall and defect detection rate, first and second experiment
XVIII.i.7. Results Interpretation
Based on the reported results, we can detect significant differences both in terms of R and D when switching the order of execution of PBI and UT. This means that, in the context under study, a verification process is strongly affected by a change in the order of application of the techniques.
Given such considerations, we can start reasoning on a number of practical problems, e.g., given a predefined budget, which verification process maximizes the expected performance? As shown by our statistical analysis, different conclusions can be drawn, depending on the context the verification process is being planned for. If the goal of the verification process is to maximize the percentage of detected defects with respect to the existing defects (i.e., to minimize the number of remaining defects), then the verification planner should prefer <PBI, UT> to <UT, PBI>, as it has the highest recall (see Figure 47, plots a and b). This can be the case for safety-critical systems, where the goal is generally to remove as many defects as possible. Nevertheless, in different contexts, cost-effectiveness aspects can play a major role and the defect detection rate could be the primary criterion for selecting the verification process; in such contexts, based on our results, the decision would be uncertain, because the results of the two experiments for D are contradictory (plot c vs plot d). However, as already mentioned, the goal of this study is not to conclusively identify the best sequence of UT and PBI on code artifacts, because at least larger samples, professionals with diverse expertise, and a real-world project would be required for such a conclusion. Conversely, this part of the study aims to show the importance of the order of the verification techniques constituting a verification process, as it can make one choice prevail over another.
In conclusion, the answer to the first research question is that, based on our results, the order of application of verification techniques can have a remarkable impact on the verification process performance, and therefore it should be carefully considered when planning a verification process.
Concerning RQ.2, the results reveal that very often the performance of a verification technique changes when the history of the verification changes. In both experiments, the defect detection rate of a technique (either PBI or UT) varies when it is applied as the first or the second technique of the verification process: this is true in 8 cases out of 12 comparisons. However, variations are less frequent in terms of R (2 times out of 12) and F (3 times out of 12). In brief, we can say that, for the context under study, the efficiency (i.e., D) of unit testing and perspective-based inspection depends on which techniques were applied before them; conversely, the overall recall of the verification process, as well as the percentage of false positives, tends to be unchanged most of the time.
XVIII.i.8. Conclusion
In this paragraph, we addressed two research questions related to the definition of a verification strategy: (1) does the order of application of verification techniques impact the verification strategy performance? (2) Does the performance of a verification technique depend on the previously applied techniques?
Two controlled experiments were performed to investigate these questions. Synthesizing the obtained results, we can affirm that the verification process performance depends on the sequence in which the verification techniques are applied and, in more detail, that the performance of each technique can change based on the history of the verification process. The results supporting these conclusions bring one more aspect of complexity to the attention of a verification planner. In fact, they show that, in order to make fully informed decisions, a verification planner has to include in his/her reasoning considerations on the order in which verification techniques will be applied; furthermore, when controlling an on-going process, the verification planner should make decisions also based on which techniques were already applied and should use historical measures carefully: any given verification technique may perform differently from the past if it is applied at a different stage of the verification process, or if the current verification process is a different sequence of verification techniques with respect to the past.
These results should encourage SQuA and Impact-One to further investigate how to correctly sequence verification techniques, in order to empirically elicit functions which model the mutual influences among techniques (i.e., their combination functions) and, consequently, their expected performance (i.e., their activity performance functions).
Chapter XIX. Test Performance E*alation
" couple years are gone since the company led off the new software group# along with the
innovative approach of impact vectors! Management has invested a lot of resources in the new
software intergroup and it is time to start evaluating how procedures and heuristic put in place
are ehaving! $n this chapter# we focus on the investigation of how test is performing with
respect to detected defect types! $n particular# $mpact-2ne has created repository of >3<7 ugs
detected y a slice of the software group across 3: pro)ectsF this slice is composed of: the
architect# an analyst# a designer# and three programmers and they form a team working with
we technologies# i!e!# within the company# the serverJclient components enaling physicians
to access the repository that contains all patients data# and which supports them at making
their diagnoses!
$f we focus on the ug repository# we can find B> impact vector dimensions that characteriEe a
ug# among which we want to mention the following: Status# Priority# 2riginal Effort Estimate#
"ctual &ime Spent# Security &hreat# Defect -ategory# Defect Severity# Phase of Detection#
$teration %umer within the phase# and &oolJMethod 1sed for detection!
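A minimal sketch of how such bug records can be stored and then aggregated into impact-vector dimensions follows. The field names mirror the dimensions just listed; the three records and all their values are invented for illustration.

```python
# Minimal sketch of bug records and their aggregation into impact-vector
# dimensions. Field names mirror the chapter; all values are invented.

from collections import Counter

bugs = [
    {"defect_category": "Logic", "defect_severity": "High",
     "phase_of_detection": "IST", "detection_method": "Manual Testing"},
    {"defect_category": "Formatting", "defect_severity": "Low",
     "phase_of_detection": "UAT", "detection_method": "UI Testing Tool"},
    {"defect_category": "Logic", "defect_severity": "Medium",
     "phase_of_detection": "Production", "detection_method": "Product Usage"},
]

# One family of impact-vector dimensions: number of detected defects per type.
defects_per_type = Counter(b["defect_category"] for b in bugs)
print(defects_per_type["Logic"], defects_per_type["Formatting"])  # 2 1
```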
In the rest of this chapter, we report on the most relevant and interesting results of the analysis of the impact vectors.
/"/%i% $nal'sis of Reslts
The first dimension that John Soft (Software Development Lead) and Squa (Software QA Lead) want to investigate is the defect type. Upfront, the software division identified a taxonomy of 14 defect types:
Requirements defects:
o Misunderstood requirement
o Missed requirements
Code defects:
o Logic
o Formatting
o Content
o HTML
o Data
Access
Migration (between servers)
Security defects
Configuration defects
Supplier defects
Usability defects
Accessibility defects
Non-standard compliance
Figure 48. Distribution of bugs across defect types
The classification is orthogonal. As is evident from Figure 48, most of the defects are code-related, in particular related to the code logic. Since the web application consists of a large amount of GUI code, it is no surprise that many bugs fall into user-interface-related categories, e.g., formatting or HTML.
However, the distribution of defect types does not convince the QA team, so they decide to investigate some groups of related defects across development phases. Concerning web development, we did not describe the implemented process in detail. Since we are focusing only on test performance, the following process information is relevant: unit testing is performed during development (first phase). After development, during system integration (second phase), Internal System Testing (IST) is executed. Eventually, the validation of the system happens via User Acceptance Testing (UAT), where the client participates in the test activity with its own perspective. No other test activities are performed on this category of software (web software) within the company, so defects can be detected during development, during IST, during UAT or, in the worst case, when the software is in production.
The data we present in the following are about projects that have not yet been dismissed, but have been delivered to the customers in the form of stable versions; therefore, even though the number of bugs is not conclusive, it is expected to be very close to the real number of bugs that will be detected during the entire life cycles of those projects.
In terms of Impact Vectors, therefore, we have 4 Process Performance Impact Vectors, i.e., one per development phase, each representing a sub-process of the software development process enacted by the teams. These sub-processes are composed of several activities, including verification activities that, in the specific taxonomy of the involved teams, are called "Detection Method", which is, as mentioned above, one of the dimensions in the repository available to us; the nominal scale for this dimension is composed of the following members: Manual Testing, Peer Review, Product Usage, UI Testing Tool, and Usability Testing. For each activity execution (i.e., the application of any of these verification activities), some bugs are detected and reported. This way, the repository is a collection of bugs characterized by the detection method that found them (e.g., usability testing), the defect type (e.g., code logic), the detection phase (e.g., production) and so on. From this repository, we can build the impact vectors corresponding to each verification activity and to each development phase. Such impact vectors for verification activities (i.e., Activity Performance Vectors) and process phases (i.e., Process Performance Impact Vectors) are composed of dimensions of interest to management, including: the number of detected defects for each defect type; the number of actual defects known as of today; the effort spent to run the activity/phase; and the distribution of the severity of the defects detected by the activity. Having such impact vectors enables and facilitates a number of interesting analyses, which can lead to actionable feedback and interesting
insights. Some of these analyses, performed by leveraging the part of the tool chain presented in the previous Section, are reported below, starting with an exploration of the defects by detection phase.
If we look at the distribution of defects across their corresponding detection phases, we see something not really pleasant for quality assurance people: defects seem to be detected late in the development. In fact, 22% of defects are shipped to the customer (in production), 30% are found during user acceptance testing, 31% during internal system testing and only 7.5% during development (see Figure 50).

Figure 50. Defect detection phase
One first guess was that the defects slipping through were not really relevant, but just minor bugs reported by picky developers. Actually, this hypothesis can be discarded when looking at bug severity, as reported in Figure 51: fatal and high-severity bugs are frequent, and around 50% of the bugs are at least of medium severity, i.e., not something the team can ignore.

Figure 51. Defect detection phase and severity
If we proceed with the analysis by defect type, we detect trends that differ from the global one.
Concerning requirements defects, there is a slippage of 54% from the IST phase (59/110 bugs, i.e., 38 to UAT and 21 to production), and a slippage of 89.09% (98/110 bugs) through development: 9 out of 10 defects are not caught during development.
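These slippage figures can be recomputed directly from per-phase counts. In the sketch below, the UAT (38) and production (21) counts are the ones quoted above, while the development and IST counts are derived from the quoted 98/110 and 59/110 figures.

```python
# Recomputing the requirements-defect slippage quoted above. The UAT (38) and
# production (21) counts are from the text; the development and IST counts are
# derived from the quoted 98/110 and 59/110 figures.

found_by_phase = {"Development": 12, "IST": 39, "UAT": 38, "Production": 21}
total = sum(found_by_phase.values())                     # 110 requirements defects

slipped_past_dev = total - found_by_phase["Development"]                 # 98 bugs
slipped_past_ist = found_by_phase["UAT"] + found_by_phase["Production"]  # 59 bugs

print(round(100 * slipped_past_dev / total, 2))  # 89.09 (% slipping through development)
print(round(100 * slipped_past_ist / total))     # 54    (% slipping past IST)
```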

Figure 52. Requirements defects across development phases
When we move on to accessibility and usability defects, the trend seems similar: most of the defects are detected late in development.

Figure 53. Usability and accessibility defects across development phases
However, while the trend is similar, its interpretation is deeply different. Requirements defects can and must be detected early in the development and should not slip through. On the other side, many usability defects cannot effectively and efficiently be removed upfront, whilst accessibility defects, on careful analysis, turned out to be cheap to address even when discovered in UAT or even in production. Therefore, we can live with a late detection of usability and accessibility defects, but we should definitely enact some countermeasures to target requirements defects.
The next step is to investigate data-related defects.

Figure 54. Data migration and access defects across development phases
"lso in this case# as for the reGuirements defects# late discovery means much higher costs of
removal and a slippage of <>V .9:3J9<B ugs/ is something to target with particular care! More
specifically# S=1" should identify testing techniGues# oth at unit and integration level# to
detect data-related ugs earlier!
As far as GUI-related defects are concerned (i.e., HTML, content and formatting), one should not expect to detect many of them during the early phases, as they are refined and tuned mostly during integration, when complex web flows are composed to implement complete use cases. The trend in Figure 55 is quite expectable, even though the number of defects detected in UAT and, in particular, in production is still high. On closer analysis, about 90% of the defects detected in the last two phases require less than 1 hour to be fixed; this means that the vast majority of the issues reported at that time consist of tweaks, for which reasonable and convenient techniques for early detection are not known to us.
Figure 55. GUI (HTML, content and formatting) defects across development phases
The last slice of analysis concerns code defects, in particular logic defects, which represent a significant part of all defects. The data are pretty clear: 91% of the bugs (1235/1358) slip through the development phase and 60% (811/1358) even slip through IST. After this warning was raised, management started to interview developers and to explore a significant sample of the defects to better understand their real nature. Unfortunately, even though some misclassifications exist, most of the defects are correctly classified, which means that quality control (and in particular unit testing) is not really working efficiently for web projects. On the other side, however, the mentioned investigation also revealed that many logic defects detected during development are not being recorded. This is not necessarily something bad: recording defects has a cost and, if such knowledge does not produce useful information, then collecting data becomes a waste. Also, the investigation revealed that many logic defects are corrected in just a few minutes, and only about one third require more than one hour.
Based on this, on one side, instructions were given on how and when to record bugs detected during development. In short, fast-to-solve bugs (≤0.5h) can take a fast track, where just a couple of fields need to be filled in on the bug report form; the other bugs must be recorded in full. The goal of this new procedure is to produce more realistic trends. On the other side, the inefficiency of unit testing is still a fact, because rebalancing the proportions across detection phases does not solve the issue of having a large (absolute) number of defects being shipped to the customer (or at least exposed during UAT). Therefore, the results reported in Chapter XVII were leveraged in order to prepare a series of training sessions about unit testing techniques and approaches.
Concerning an effort analysis, only 662 records (20% of the total) report effort information; in particular, data were collected only during some projects. This is something negative; in fact, in our context, one of the most meaningful ways of measuring the efficiency of a V&V technique is the defect detection rate: bugs found / effort spent. However, no information about detection effort, localization effort and correction effort is being collected and distinguished. As an example, consider manual testing (the most used technique); via manual testing, 400 bugs were detected and 517 person-hours (ph) were logged for those bugs. So, one could say that the efficiency of manual testing is 400 bugs divided by 517 ph, i.e., 0.77 bugs/ph. Nevertheless, the correction time for those bugs is contained in those 517 ph, and such effort should not be counted when computing the technique's efficiency. Conversely, how much effort did the team spend on manual testing without finding any bugs? That effort should be counted when computing the efficiency of manual testing, as it is part of the effort allocated to that technique, even though it was unproductive.
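The concern can be restated numerically. The 400 bugs and 517 person-hours are the figures quoted above; the split between correction and unproductive effort below is invented, since precisely that information is not being collected.

```python
# The quoted figures: 400 bugs with 517 person-hours logged against them.
bugs_found = 400
logged_hours = 517.0
naive_rate = bugs_found / logged_hours
print(round(naive_rate, 2))  # 0.77 bugs/person-hour, the naive efficiency

# Invented decomposition, to show how the naive rate can mislead:
correction_hours = 300.0    # part of the log, but should NOT count
unproductive_hours = 200.0  # testing that found nothing: missing, but SHOULD count
detection_hours = logged_hours - correction_hours + unproductive_hours
print(round(bugs_found / detection_hours, 2))  # 0.96 -- a rather different picture
```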
Based on these results, management decided to organize some training courses concerning testing techniques capable of targeting data-related defects. Also, they decided to encourage a finer and more careful data collection in terms of effort, because effort represents one of the most important decision drivers. Finally, in order to address the requirements issues, the management decided to hire a new requirements specialist and to acquire both a requirements management tool and a support tool specific for requirements reviews.
In conclusion, using the data collected over time enabled the construction of impact vectors, whose main dimensions consist of those attributes that enable modeling the capability of each detection technique and process phase to find defects of certain defect types. Such modeling of the process supports an easy and quick derivation of recommendations on how to target
inefficiencies of the process and identify resolution strategies. Such resolution strategies were not direct outputs of the impact vector model; in fact, even though we would like our framework to provide at least some alternatives to the decision makers, we have not yet achieved that level of advancement for our framework. Nevertheless, the impact vector framework is already able to provide a clear, significant and useful body of knowledge that elicits punctual issues in the development process, so that decision makers can be supported in actually and concretely identifying appropriate solutions to these precisely identified process issues.
Chapter XX. Selecting Technologies for Increasing Productivity: an Example with Mobile Technologies
//%i% "ntrodction
The company decided to explore a new field, i.e., mobile application development. In fact, the company is opening a new branch in its market to target self-monitoring devices, that is, devices that transmit data to a mobile phone, where an installed app elaborates those data and presents them to the user in an understandable way (i.e., alerts); apps can also be distributed to other members of the family, possibly assigning limited access rights, and to doctors, in order to augment their buzzers and provide quasi-real-time details on the health condition of patients requiring particular care.
The general question that this chapter tries to answer is: what is the right technology to use for the development of a mobile application, for given requirements and context? To this end, the company performed a broad study to explore the most popular and advanced mobile technologies, in order to characterize them and be able to select the most appropriate one according to specific needs.
In order to develop a new Mobile App, in fact, several decisions have to be made. The first one concerns the platform on which the application will run. Such a decision is usually made by the marketing management. Very often, more than one platform is chosen, although the versions for the different platforms can be released at different times. Instances of platforms include, but are not limited to:
iOS platforms,
Android platforms.
As soon as the platform decision is made, one more very important choice is required, which matches the aforementioned question: what technologies should be used to develop the given mobile app? Now, the Technical Leader has the ownership of such a decision. As we will explain in the next Sections, the best-known examples of such technologies include:
HTML5;
CSS;
JavaScript;
Platform-specific Software Development Kits (SDKs), including:
o the Android SDK, and
o Apple® XCode;
Titanium Mobile, and related supports for Mobile App development;
Adobe®, and related supports for Mobile App development;
PhoneGap;
jQuery Mobile;
Sencha Touch;
Dojo Mobile.
Indeed, even when a single platform is initially targeted to develop an app, a few ways exist to enact the development, which can be classified into three Development Approaches (DAs):
Native DA: i.e., pure native development, using the platform-specific SDK;
Web DA: i.e., a pure web app (HTML5 allows apps that are almost as powerful as native apps);
Hybrid DA: i.e., a mixed approach, as supported by several different technologies discussed in the remainder of this chapter.
A decision concerning the DA to use has to take into account both functional and non-functional app requirements, such as the need for different devices on board (GPS, camera, etc.), as well as constraints placed by the selected target platforms, schedule, and budget. Therefore, a multi-criteria decision process is required, and both tools and frameworks would be desirable to support and guide it.
XX.ii. Purposes of the Study and Research Methods
Evidence-based research in the field of software development for mobile apps is in its beginnings and still needs enhancement. Data and knowledge bases are not yet publicly available. Concerning private companies, it may be that data collections are not yet capitalized organization-wide. As a consequence, it is still frequent for researchers to start studies by producing their own data from scratch.
The research methods that we used for this study are survey and case study research. For data collection, we used literature and technical documentation surveys, interviews with practitioners, and the development of case studies using and comparing different technologies.
The purpose of this work is exploratory rather than confirmatory [78]; in fact, we do not aim to give a final answer to the basic research question, but we want to check whether an approach is feasible and capable of supporting the answering of the basic question. Consequently, in the
remainder, no formal goals [79], hypotheses [80] or details concerning the utilized research objects and processes (case studies, handbooks, papers, etc.) will be presented.
In general, we do not aim to address the false problem of defining the "absolute best" technology for mobile software development. Vice versa, we remark that our expectation is that no single development approach will fit all the needs of all customers. As Fling observed [86], whatever medium we use to address our goals, we have a number of choices, each with its own pros and cons. Similarly to all other branches of Software Engineering, we expect some mobile media to be quick for creating apps, but accessible to only a few of the customers and/or delivering hard-to-maintain apps; others could address a larger market, but be far more expensive and complex to use.
However, it is clear that companies are facing an obvious trade-off between user experience and application functionality on the one hand, and development costs and time to market on the other hand. Therefore, the challenge is to choose a development approach and a technology that are capable of balancing the requirements with the available budget and time to market [81]. At this point, the fundamental question is: "What is the right choice?".
In order to help answer such a question, this chapter aims to provide:
(i) a guide to the technology decision, and
(ii) a framework to support such a decision, which can evolve as new technologies come out.
XX.iii. Characterizing the Technologies to Study
The choice of the platform dictates the range of technologies that can be used for developing an app. If we need an application expected to be multiplatform, then we have a range of technologies and possible approaches to use besides developing the same app for each platform using its native SDK, i.e., hybrid or web approaches. Conversely, if we need an app for a single Operating System (OS), then we have different choices available, and the right approach could be native development.
In the remainder, we will try to answer the initial question about mobile technologies by focusing on three Platform Categories (PCs):
PC1. iOS platforms;
PC2. Android platforms;
PC3. Other platforms, which we merge under a single category, namely "Others".
So, it is fairly intuitive that the platform will be one of the impact vector dimensions.
"s already mentioned# after the selection of the target platform.s/# the ne*t step consists in
choosing oth the Development "pproach# and the Development &echnology .D&/!
&here are three D"s availale for apps development:
D"3! @e apps
D"9! %ative apps
D">! 'yrid apps
Since they are mutually e*clusive# one dimension is enough to model this aspect via impact
vectors!
-oncerning the D&# it is not independent from the selected D" and P-! $n this chapter# we take
into consideration a limited set of development technologies# among the ones availale on the
market! Specifically# we consider those technologies that the largest software companies
commonly adopt worldwide! @e have already listed such technologies in the previous
paragraph! $n the following# we consider these technologies and group them y the selected
D"!
@e apps: choosing the technology is Guite straightforward:
D&3! '&M6B# -SS# and ZavaScript! Since these technologies fit together# we consider
this set as a single choice!
D&9! %ative apps: the platform specific development kit can e selected# e!g!:
i! "ndroid SDDF
ii! "pple 8-ode!
"dditionally# the following technologies can support the development of %ative Moile
"pps:
D&>! "ppceleratorc+s &itanium Moile#
D&A! "doec Fle*# in con)unction with "doec "ir!
6ast two development technologies can create cross-platform apps from the same
source code!
'yrid apps: Phone,ap can e used# in addition to the following ZavaScript frameworks:
D&B! Phone,ap ? )=uery MoileF
D&;! Phone,ap ? Sencha &ouchF
D&<! Phone,ap ? Do)o Moile!
$n this case# we need to create one Boolean dimension for each technologyF this way# each
technological stack can e represented via an impact vector y means of a it mask# i!e! y
setting to true those technologies that are part of the technological stack and to false the ones
not eing part of the stack! $n conclusions# the Boolean dimensions are: '&M6B# -SS# plain
ZavaScript# %ative# "ppceleratorc+s &itanium Moile# "doec Fle*# "doec "ir# Phone,ap#
)=uery Moile# Sencha &ouch# and Do)o Moile
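As a sketch, the bit mask just described can be built as follows. The dimension names follow the list above; DT5 is shown as an example stack, and the inclusion of HTML5/CSS/JavaScript in a PhoneGap-based stack is our assumption.

```python
# Sketch of the bit mask: one Boolean dimension per technology.

DIMENSIONS = ["HTML5", "CSS", "JavaScript", "Native", "Titanium Mobile",
              "Adobe Flex", "Adobe Air", "PhoneGap", "jQuery Mobile",
              "Sencha Touch", "Dojo Mobile"]

def stack_vector(stack):
    """True for each dimension that belongs to the stack, False otherwise."""
    return [dim in stack for dim in DIMENSIONS]

# Example: DT5, assuming the PhoneGap stack carries the web technologies too.
dt5 = stack_vector({"HTML5", "CSS", "JavaScript", "PhoneGap", "jQuery Mobile"})
print(dt5)  # [True, True, True, False, False, False, False, True, True, False, False]
```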
" rief review of the listed technologies# including their pros and cons# is provided in the
remaining of this Section!
XX.iii.1. HTML5 + CSS3 + JavaScript (DT1)
Since these are web technologies, the learning curve is expected to be very steep (i.e., we expect developers to quickly get familiar with them) compared to the native languages. Additionally, when writing an application, multiple platforms can be targeted without encountering significant problems. On the other hand, since HTML5 and JavaScript are not allowed to access all device features, the developer could encounter some limitations. Moreover, as an application built with such a stack has to be interpreted by the browser, its performance can be worse than that of an application developed with native languages.
XX.iii.2. Platform-Specific Software Development Kit (DT2)
This option ensures the full exploitation of all the features of the platform on which the practitioner decides to develop the app, in addition to the native look & feel. The most obvious downside is the ability to address only one platform; for this reason, if the organization is willing to produce a multiplatform application, the code has to be rewritten for each chosen platform. This contributes to high development costs and long development times. It is important to underline that, in the following, DT2 and DT3 will be considered together.
XX.iii.3. Appcelerator® Titanium Mobile (DT3)
Titanium Mobile is a cross-platform mobile development framework that enables developers to write JavaScript code, which is then compiled down to native code for the iOS and Android platforms. If the organization wants to build a multiplatform application, Titanium Mobile can save time with respect to developing via the platform-specific SDKs. Additionally, a real native application is obtained, with the platform-specific look & feel.
XX.iii.4. Adobe® Flex Mobile + Adobe® Air (DT4)
Flex is another cross-platform mobile SDK. It enables developers to write their applications using the ActionScript language for the Android, iOS, and BlackBerry platforms. Concerning iOS, it generates a pure native application written in Objective-C. Concerning Android and BlackBerry, the developer has to install a further layer on which the application will run: the Adobe Air Environment. Applications developed with Flex are deployable on the most important app stores.
XX.iii.5. PhoneGap (DT5, 6, 7)
This technology enables developers to create a web application wrapped into a native container. This means that developers can write an application once and, when they need to migrate to another platform, they can reuse the code by wrapping the initial web app into
another native ad-hoc wrapper. Since both wrappers offer the same interface to developers, no changes are required to the previously written code. Developers still have to face the cons associated with using HTML5, CSS, and JavaScript (see Sub-Section XX.iii.1 above), but at the same time they receive some of their benefits, including a steep learning curve, the possibility of deploying the app on app stores, and the chance of extending the native wrapper to use device features not available in default HTML5.
Since PhoneGap allows developing pure web apps, selecting a JavaScript framework appears to be an essential choice. In this chapter, we take into account three main JavaScript frameworks, among the ones available on the market, i.e.: jQuery Mobile, Sencha Touch, and Dojo Mobile. jQuery Mobile seems to us to be a good candidate for very basic and logically simple applications, while Sencha Touch, due to its different development approach, is to be preferred when the organization is willing to leverage more complex development patterns, such as Model-View-Controller. It seems to us that this last technology shows a very flat learning curve, but it gives practitioners the possibility of keeping the development of even the most complex applications under control. Dojo Mobile seems to us to be the most complete technology among the three considered above, as it provides both of the development approaches offered by jQuery and Sencha. However, if a simple business logic, a small footprint, and a very short time to market are needed, jQuery Mobile could be the right choice.
XX.iv. Drivers for Technology Selection
In our study, the technology selection is driven by a main set of requirements, grouped into the following two categories:
(i) Needs;
(ii) Device features.
Based on the literature [82, 83, 84, 85, 86, 87, 88, 89, 90, 91], we can break down these categories as follows. Needs represent both functional and non-functional requirements. Needs or, equivalently, Non-Functional Drivers (NFDs) are:
NFD1. Access to native hardware features;
NFD2. High performance;
NFD3. Cross platform;
NFD4. Easily upgradeable;
NFD5. Deployable on app stores;
NFD6. Small footprint;
NFD7. Quick coding and prototyping;
NFD8. Complex business logic;
NFD9. Simple business logic;
NFD10. Custom look & feel;
NFD11. Platform look & feel;
NFD12. Cheap development cost.
Device Features (DFs), as commonly found on mobile devices, include:
DF1. Accelerometer;
DF2. Compass;
DF3. Orientation;
DF4. Light;
DF5. Contacts;
DF6. File;
DF7. Geolocation;
DF8. Media;
DF9. Network;
DF10. Notification Alert;
DF11. Storage;
DF12. Multitouch;
DF13. SMS;
DF14. Bluetooth;
DF15. Video Capture.
Both NFDs and DFs are part of the impact vector dimensions.
XX.v. A Model for Evaluating Mobile Technology
Based on the content of the previous paragraphs, answering our basic question (What is the right technology to use for the development of a mobile application, for given requirements and context?) means solving a problem in a space made of forty dimensions: 3 related to PC, 3 to DA, 7 to DT, 12 to NFD, and 15 to DF. Three of these dimensions, the PC-related ones, are alternative dimensions; the same holds for the three DA-related dimensions. The twelve NFD-related dimensions are independent of one another, and the same holds for the fifteen DF-related dimensions; however, DF dimensions can depend on NFD, DA, and PC dimensions, even though we might still be unable to express such dependencies a priori and formally. Additionally, an application is required to meet many needs, a device feature can serve many needs, and each development technology can offer different features, depending on the platform it is used for. In other words, we are not coping with a system that we can immediately and easily model via mathematical functions, assuming that is even feasible.
To manage this complexity, our decision was to proceed by abstraction and classification, and we eventually consolidated those forty dimensions into five macro-dimensions. This led to constructing a five-macro-dimension space of: Needs, Device Features, Development Technologies, Platforms, and Development Approaches. The domain of each macro-dimension includes the items listed in the previous Sections for PC, DA, DT, NFD, and DF. Since we see no interest in sorting those items within their domains, they are listed in arbitrary order, so obtaining a Nominal scale [80, 92] for each dimension.
This discrete space of five macro-dimensions can be easily and automatically generated and updated whenever new platforms or development technologies/approaches appear on the market or are removed, for any reason, from the set of accepted candidate technologies/approaches.
Our next step is to represent: (i) each technological stack as a point in this space, according to its capability to meet some requirements (e.g., support for a given device); and (ii) the app to develop as a point in the same space, according to the importance that each requirement has with respect to the app context (e.g., the importance of the presence of a given device).
This way, a proximity measure can be created to quantify the distance between the point representing the app and each point representing a technological stack. The technological stack that minimizes the distance from the app is the first candidate to be the best fit for the considered app context. The mapping of technological stacks and app requirements as points in the previously defined space can follow the Impact Vector Model; in particular, the app requirements can be modeled as a "goal impact vector" (i.e., the expected result of the development), while technological stacks can be modeled as process impact vectors (i.e., the possible combinations of languages, approaches, platforms and tools that can be used for the development). For brevity, we omit the details of the mapping of apps and technological stacks as points in the space. However, a proximity measure still has to be defined, so that a Tech Expert can apply our model in a real context; a definition of such a proximity measure and an example of use of our model are provided in Section 8.
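Although the definition of the proximity measure is deferred, one plausible sketch is a weighted L1 distance over ordinal codes, with each requirement dimension weighted by its importance in the goal impact vector. The scale labels, dimensions, weights and values below are all invented for illustration; they are not the definition given later.

```python
# One possible proximity measure (an assumption, not the thesis' definition):
# encode the ordinal scale as integers and take a weighted L1 distance.

SCALE = {"not supported": 0, "partial": 1, "good": 2, "full": 3}  # labels invented

def distance(goal, stack, weights):
    """Smaller distance = the stack is closer to what the app requires."""
    return sum(weights[dim] * abs(goal[dim] - stack[dim]) for dim in goal)

# Invented example: a goal vector stressing cross-platform support.
goal   = {"cross_platform": SCALE["full"], "performance": SCALE["partial"],
          "app_store": SCALE["full"]}
native = {"cross_platform": SCALE["not supported"], "performance": SCALE["full"],
          "app_store": SCALE["full"]}
hybrid = {"cross_platform": SCALE["full"], "performance": SCALE["partial"],
          "app_store": SCALE["full"]}
w      = {"cross_platform": 2.0, "performance": 1.0, "app_store": 1.0}

print(distance(goal, native, w), distance(goal, hybrid, w))  # 8.0 0.0
```

Under these invented weights, the hybrid stack would be ranked first, matching the idea that the minimum-distance stack is the first candidate for the considered app context.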
XX.v.1. Implementation and Graphical View of the Model
There are both numerical and graphical problems in representing a space with more than three dimensions. Concerning the former, discrete dimensions and Ordinal scales rather than Real scales are possible for the functions of interest, with consequent limits on the mathematics that can be applied when looking for optimal solutions under defined constraints. Concerning the latter, our decision was to start by using a simple user interface; in fact, the focus of this chapter is on (i) exploring mobile development technology modeling for decision-making; and (ii) empirically verifying models in the lab and in the field. This is the reason why we did not put much
effort into building an advanced user interface for our tool prototype. In particular, the
prototype is based on a table (see Table 1), which implements the space defined in Section 5. In
order to model the five dimensions on a flat surface, which allows an easy use of the model,
the item type of such a table (i.e. a point in the space) represents more than one piece of
information; specifically, each item of the table is itself multi-dimensional, thus creating a
topological space similar to a manifold [93]. Each elementary item of the table (i.e. each
dimension of a given point) expresses a measure on a four-value Ordinal scale, whose values are
graphically represented by four distinct symbols, the first of which denotes the null value, i.e.,
"Not supported".
In practice, the table represents a complex but homogeneous abstract entity, that is, a 5-cube,
whose elements are symbols of the given ordinal scale.
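One minimal way to encode such a multi-dimensional table item in code is sketched below; the concrete scale labels, platform grouping, and cell values are illustrative placeholders, not the actual matrix content.

```python
# Four-value ordinal scale used by the matrix cells (labels are placeholders).
SCALE = ("not_supported", "insufficient", "sufficient", "excellent")

# A cell of the device-feature quadrant is itself multi-dimensional:
# one ordinal value per platform group (here: iOS, Android, Others).
PLATFORMS = ("ios", "android", "others")

# Hypothetical cell: support offered by some technology for some device feature.
cell = {"ios": "excellent", "android": "sufficient", "others": "not_supported"}

def rank(value):
    """Position of an ordinal value on the scale (0 = the null value)."""
    return SCALE.index(value)

# Only ordinal comparisons are meaningful on this scale:
assert rank(cell["ios"]) > rank(cell["android"])
```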
Table 1 is structured in two quadrants. These realize two guides for selecting the technology to
use according to: a) the needs, and b) the device-specific hardware features that the particular
technology offers, respectively.
The first quadrant implements the subspace (DT, NFD, DA, PC); the abscissa reports the
development technologies, the ordinate represents the needs. The type of development
approach, as leveraged by the corresponding development technology, is represented by
different columns and different background colors: (i) Web in the first column (orange); (ii) Native
in the subsequent triple of columns (blue); and (iii) Hybrid in the last triple of columns (green).
Each item shows a value on the given Ordinal scale. Since the values reported do not depend on
the platform specificities for each given development technology, such platforms are not
represented explicitly. In practice, this quadrant represents a cube with colored slices; each
element of the cube includes the ordinal measure shown in the first quadrant as many times
as the number of platforms.
The last quadrant shares the abscissa (DT) with the first quadrant, while the ordinate
represents the device features. Many items in this quadrant represent arrays of platforms
(rather than any single platform, as in the first quadrant). The reason is that each technology can offer
different features depending on the platform on which it is used, and a measure is expected for
each of those platforms. Again, the actual platforms that are considered separately are iOS and
Android, whereas the remaining platforms are grouped into the category "Others".

Figure 56 The technology matrix
The scale elements assume a slightly different meaning in the two quadrants. In the first
quadrant, the semantics of the scale is {Null, Insufficient, Sufficient, Excellent}; in fact, the
symbols indicate the extent to which, for the considered development approach, that specific
technology satisfies the corresponding need: insufficiently, sufficiently, or excellently.
In the last quadrant, instead, the semantics of the scale is {Null, Rough, Medium, Well}; in fact,
those symbols indicate whether the specific technology provides APIs for using the
corresponding device feature, and their level of support: "Well", i.e., supported for all the
different versions of the same platform; "Medium", i.e., supported for some but not all the
different versions of the same platform; or "Rough", i.e. lightly supported. In the first quadrant
we can find only one symbol per table item, whereas in the last quadrant, as already
mentioned, an item is an array of symbols (i.e. a 4-dimension point), and can also include all
symbols together (one per point dimension).
We filled out the dimensions of the matrix discussed above by collecting information from
various scientific papers, personal experiences (first quadrant), and technical online
documentation (last quadrant). Table 1 also synthesizes the results of this work.
XX.vi. The Process of Technology Decision Making for Mobile
As already mentioned, we want to develop a guide to select the best technology to use for the
development of a specified mobile application in a given context. For this reason we can say
that our output is the Technology dimension, while the other dimensions of Table 1 are our
inputs.
Probably, a multivariate analysis should be used for identifying the best technology to exploit
under the given requirements and constraints. However, at this stage of our work, since there is
no way to apply multivariate analysis with the small and artificial dataset available, our decision was to
start by using simpler techniques.
In general we have two kinds of requirements: General Requirements (GRs), which are present
in any mobile app, and Specific Requirements (SRs), which are explicitly stated for the current
app, and are subtypes of the GRs.
In order to enact a technology selection process, the Tech Experts are required to assign a
weight to each GR, based on its importance or relevance for any app to develop, and then to
give a score to each requirement. Concerning the latter, let N be the number of points to
distribute across needs and device features. In the setting that we used for our approach, 25%
of such points are automatically and evenly assigned to the GRs, and the remaining 75% of the
N points are left for assignment to the Tech Leaders; these will assign them as they prefer to the
SRs, based on their assessment of the requirement relevance for the app being developed. The
initial 25% assignment is made to diversify the obtained results, without losing the
contributions that other features give to the decision making. This point distribution can be
changed arbitrarily, without affecting the model validity.
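The point-distribution step can be sketched as follows. The 25%/75% split and the value of N follow the setting described above, while the requirement names and the SR point assignments are hypothetical.

```python
N = 1000  # total points to distribute

general_reqs = ["gr1", "gr2", "gr3", "gr4", "gr5"]  # hypothetical GR list

# 25% of the points are split automatically and evenly across the GRs.
gr_share = 0.25 * N
gr_points = {gr: gr_share / len(general_reqs) for gr in general_reqs}

# The remaining 75% is assigned freely by the Tech Leader to the
# Specific Requirements (hypothetical assignment below).
sr_points = {"sr_chat": 300, "sr_camera": 250, "sr_contacts": 200}

assert sum(sr_points.values()) == 0.75 * N  # the free share must be used up
```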
Finally, Tech Leaders enter Table 1 and obtain, as a result, a value on the previously defined
Ordinal scale for each technology. In order to get more manageable results, they can make the
further decision of translating those values into Real numbers, e.g., by using a
locally defined translation map; this allows switching from an Ordinal scale to a Real
measurement model, which enables applying the algebra of Real numbers. At this point, each
generated number represents a numerical score which should quantify the capability of each
technology to fulfill any given requirement.
We are aware that the mentioned scale transformation is a theoretical and practical hazard [92,
94]. However, also in this case, the Impact Vector Model supports us in combining and managing
heterogeneous dimensions.
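A sketch of the translation-and-scoring step follows. The translation map, requirement points, and ordinal ratings are made-up illustrations (the map used in the case study below differs).

```python
# Hypothetical locally defined translation map: ordinal symbol -> Real value.
TRANSLATION = {"excellent": 2.0, "sufficient": 1.0,
               "insufficient": 0.5, "not_supported": 0.0}

# Points previously assigned to each requirement (hypothetical).
points = {"camera": 150, "offline": 100, "push": 50}

# Ordinal ratings of one technology against those requirements (hypothetical).
ratings = {"camera": "excellent", "offline": "sufficient", "push": "insufficient"}

# Once on a Real scale, ordinary arithmetic applies: the technology score is
# the point-weighted sum of the translated ratings.
score = sum(points[r] * TRANSLATION[ratings[r]] for r in points)
```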
XX.vii. Case Study
As an example of using our matrix for decision making, let us consider a synthetic version of a
case study we conducted. Let us suppose that a Tech Leader is requested to develop an
application with the requirements shown in Table 5, row 1.
Table 5 Example of requirements
Description: The app is required to allow a user to chat with other people connected to the
network by the same application. Moreover, the app is required to provide the
user with the ability of filing and sending recorded audio clips and video
messages. Furthermore, the app is required to allow a user to extend her/his
contact list by specifying the telephone number of the person s/he wants to
include; her/his mobile phone's list of contacts is expected to be allowed as a
source for such a phone number.
Requirements:
1. Available on Android and iOS
2. Access to media
3. Access to video camera
4. Access to local files
5. Access to contacts list
6. Notification alert
7. High performance
8. Deployable on app stores
9. Cheap development costs
10. Access to network

Ten features are explicitly stated as significant for this application, as shown and numbered
from 1 to 10. Let the Tech Leader assign the same weight to all general requirements (GRs) and,
assuming N=1000, distribute the remaining 750 points across the specific features (SRs) in the
following way:
1. 100 points; 2. 30 points; 3. 10 points; 4. 60 points; 5. 150 points;
6. 30 points; 7. 80 points; 8. 70 points; 9. 70 points; 10. 150 points.
Additionally, in order to have manageable results, let the Tech Leader map the given Ordinal
scale onto a Real scale, as explained in the previous Section, by using the following map:
Excellent: 2.0 points. Sufficient: 1.0 point. Insufficient: 0.5 points. Not Supported: 1.0 point.
The Tech Leader can now enter the matrix and obtain an ordinal score for each technology.
Subsequently, he can apply the map and translate the ordinal symbols found into Real
numbers; eventually, the results shown in Figure 57 will be obtained.

Figure 57 Case study results
As we can see, the two technologies with the highest scores are:
1. Platform Specific Development Kit, with 1675.9 points.
2. Adobe Flex Mobile & Air, with 1491.05 points.
For the given context, the most meaningful definition of proximity measure is the vector
distance from the optimal vector (i.e. the resulting difference vector). The optimal point is the
one with the maximum achievable score, which is 2000 points (i.e. excellent in every dimension),
and which exactly matches the expected characteristics of our app. In practice, for our context,
minimizing such a distance is equivalent to picking the maximum among the computed technology
scores. Based on the scores shown in Table 3, the best technological choice for this
application should be the native approach with the Platform Specific SDK. A good choice would
also be Adobe Flex Mobile & Air; in fact, if there are heavy constraints on the time to market
and there are not enough in-house skills for each platform (a characteristic not
represented in our model), such a technology could be the only feasible way.
In both cases, however, developers are given all the information required to make an informed
choice on what technology best fits the application to develop in the given context.
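The equivalence between minimizing the distance from the optimal point and maximizing the score can be checked in a few lines. The sketch below collapses the vector distance to the scalar score shortfall, as the argument above permits for this context; the two top scores reuse the case-study figures, while the third entry is an invented filler.

```python
# With 1000 points distributed and 2.0 as the top translated value,
# the optimal point collapses to a single maximum score of 2000.
MAX_SCORE = 2000.0

# Final scores: the two case-study values plus a made-up filler entry.
scores = {"platform_sdk": 1675.9, "flex_air": 1491.05, "web_only": 900.0}

# Distance from the optimum reduces to the score shortfall, so minimizing
# the distance and maximizing the score select the same technology.
distance = {t: MAX_SCORE - s for t, s in scores.items()}

best_by_score = max(scores, key=scores.get)
best_by_distance = min(distance, key=distance.get)
assert best_by_score == best_by_distance  # the two criteria agree
```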
XX.viii. Validity Issues, Solution Limitations, and Lessons Learned
The model proposed in the previous paragraphs was populated by interviews with experts of,
and tested in four case studies within, the IBM Italy Rome Smart Solutions Lab. Such case
studies were concerned with different problems to address and they covered a wide range of
possibilities, but in a limited number of domains (e.g., mobile banking, education, and mobile
marketing).
Bearing in mind that we are coping with an exploratory study, based on the feedback that we had from
practitioners, the solution validity of the study, i.e., its internal validity, can be considered high
enough. On one side, this is concerned with the solution usefulness (does it address relevant
practical problems?), which was very positively evaluated by the involved experts; on the other
hand, it is concerned with the goodness (how well the solution is constructed); in the opinion of
the same practitioners, practical and actionable results were produced by the study, even in the
absence of a strong and theoretically rigorous context.
In our view, the proposed solution is portable to, and promises to be effective in, different
contexts and cases. In other words, the external validity of the proposed solution is considerably
high. However, due to the exploratory nature of the study, consequential limitations should be
taken into consideration.
First of all, it is important to remark that the interviews, as the source of collected data and knowledge,
involved quite a limited number of participants in the role of mobile technology experts. Also,
only one subject developed the four case studies that we have been performing. Additionally,
the proposed model was piloted within a single organization. This increased the threat of having
the same person repeating all positive or negative actions from case to case, and of people doing
the technology evaluation in the same context.
One more aspect, which was not considered in this study, and which affects the external validity
and limits of the proposed solution, relates to the communication model utilized by the app to
develop, e.g., apps connecting more than one partner synchronously and/or asynchronously.
This includes considering if, and to which extent, media-content/social networks would influence
the development of mobile apps, and how this should be modeled by an enhanced version of
the proposed solution.
Again concerning threats to validity and limits of the proposed solution, we notice that our
model is oriented to supporting project leaders in decision making, but it is based only on
technical attributes. Constraints placed by the strategic management should be introduced into
the model, to support technicians in undertaking value-based decisions [95], including risk and
return on investment, and in proceeding coherently with the strategic goals of their company.
A further relevant aspect is that the information gathered during the interviews was mainly
qualitative knowledge; additionally, the measures on the Ordinal scale, as produced from the case
studies, were eventually influenced by the subjectivity of the involved experts.
In our understanding, in the case of exploratory studies, which is our case, it is reasonable to accept
all the aforementioned validity threats, given the mitigation strategies we enacted, as usual in
experimentation in the field of Software Engineering [80].
As for the lessons learned, let us recall that interviews with experts helped populate
the matrix. However, as the case studies proceeded, we had to include additional attributes in
the model (i.e. dimensions in the space) and refine the seeded values. This confirmed our
conviction that an iterative-incremental approach should be enacted to define the model and
populate its base of knowledge. Following the initial training of the data and knowledge base, as
enacted by technology experts, the enrichment of the base should continue, based
on decisions made by the organization's Tech Leaders, possibly leveraging a structured
methodology, e.g. the Quality Improvement Paradigm [96]. Additionally, each development
organization might need to manage its own base of knowledge for mobile apps. These solutions
should also help solve the expected dependencies of the model on the context of the
organization, e.g., by realizing organization-wide dynamic bases of knowledge refined by domain
(mobile health, mobile education, etc.).
Chapter XXI. Exploiting Software Inspection Data for Early Defect Removal
On the path of evaluating the progress Impact One is making with process improvement via
impact vectors, in this chapter we want to investigate how inspections are performing with
respect to defect removal capability, under the assumption that several years have gone by since
inspections were adopted.
A long history of experience and experimentation has produced a significant body of knowledge
concerning the proven effectiveness of software inspections. Data and experience from many
years and many types of organizations have shown that a properly conducted inspection can
remove between 60% and 90% of the existing defects [97]. It is well established that
inspections in the earliest phases of software development yield the most savings by avoiding
downstream rework.
However, the value of inspections varies widely both within and across organizations.
Inspection effectiveness and efficiency can be measured in numerous ways (defects found,
defects slipped to testing, time spent, defects found per unit of effort, etc.), and may be
affected by a variety of factors, some related to inspection planning (e.g. number of inspectors)
and others related to the software (e.g. languages used for the artifacts) and the type and
structure of the development organization. The work described here is based on an analysis of
a large body of data collected from inspections at Impact-One.
Software engineers at Impact-One quickly realized that inspection planners would benefit from
guidance on how to manipulate the variables they could control to maximize the performance
of the inspection. The result was the definition of a set of heuristics based on data from a large
number of inspections they conducted. These heuristics have been widely disseminated across
all the Impact-One teams and provide guidelines for the following factors:
- Team size, the number of participants involved in the inspection; small teams are likely to
lack important perspectives, while larger teams are more likely to experience dynamics that
limit full participation.
- Meeting length; if meetings stretch on too long, members' energy is likely to flag and the
results are likely to be less than optimal. When an inspection requires a big amount of work,
it is recommended that additional meetings be scheduled.
- Page rate, the number of document pages that the inspectors examine per hour of the
meeting; giving a team too much material to look through will invariably result in a more
superficial inspection.
Heuristics may vary based on the type of artifact being inspected, which represents a factor of
major importance during inspection planning. Based on the repository available to us, we
identified four artifact types: requirements, design, code, and test. For each artifact type, an
analysis and synthesis of existing heuristics and studies at Impact-One led to the values
reported in Table 6.
Table 6 Current derived heuristics for inspections
Artifact       Team size   Meeting length   Page rate
Requirements   [5; 8]      ~2               [10; 30]
Design         [4; 7]      ~2               [20; 45]
Code           [3; 6]      ~2               [13.3; 20]
Test           [3; 6]      ~2               [20; 45]
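Encoded as data, the heuristics allow an automatic per-attribute compliance check. The ranges below transcribe Table 6, but the reading of the meeting-length heuristic (~2 hours) as an upper bound of 2 is an interpretation, and the sample inspection record is invented.

```python
# Heuristic ranges per artifact type: team size, meeting length (hours),
# and page rate (pages per meeting hour), transcribed from Table 6.
HEURISTICS = {
    "requirements": {"team_size": (5, 8), "meeting_length": (0, 2), "page_rate": (10, 30)},
    "design":       {"team_size": (4, 7), "meeting_length": (0, 2), "page_rate": (20, 45)},
    "code":         {"team_size": (3, 6), "meeting_length": (0, 2), "page_rate": (13.3, 20)},
    "test":         {"team_size": (3, 6), "meeting_length": (0, 2), "page_rate": (20, 45)},
}

def compliant_attributes(artifact, team_size, meeting_length, page_rate):
    """Return the set of inspection attributes that respect the heuristics."""
    ranges = HEURISTICS[artifact]
    values = {"team_size": team_size, "meeting_length": meeting_length,
              "page_rate": page_rate}
    return {a for a, v in values.items() if ranges[a][0] <= v <= ranges[a][1]}

# Hypothetical code inspection: 4 people, 1.5 hours, 25 pages/hour.
tc = compliant_attributes("code", 4, 1.5, 25)  # page rate too high for code
```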

In the formulation and use of these heuristics, the outcome variable of interest has been the
total number of defects found during an inspection. There have been both historical and
practical reasons for this focus. During inspection planning, the total number of defects in the
software to be inspected is unknown. It is also, practically speaking, unknowable in this
environment, due both to the variety of verification and validation activities applied to the
software, and to the difficulty of reconciling data from these activities with inspection data. Thus,
managers and inspection planners have focused on maximizing the number of defects found in
an inspection, and on using that as a benchmark for comparing the effectiveness of different
values of the inspection planning parameters (i.e. team size, meeting length, and page rate).
In this chapter, we investigate whether, and to which extent, complying with the given heuristics
actually turns into benefits in terms of the stated goal of maximizing the number of detected
defects, i.e., the efficacy of the inspection.
As we will show, compliance with the heuristics has a significant positive impact on the
performance, i.e., the effectiveness, of the inspection; therefore, management should reaffirm
and encourage inspection planners to use the provided guidelines. On the other side, as a result
of our study, we found that, in some conditions, compliance with those heuristics is not
necessarily the best decision driver, because more compliance sometimes may cause higher
cost with no practical benefits.
In the rest of this chapter, we describe the state of the practice of inspections at Impact-One
and make explicit the research question we address. Then we present the results of our
analysis, along with the methodology we created and the decision model we built in order to
package our current knowledge on inspections at Impact-One. Afterward, we will discuss the
threats to our empirical investigation and eventually draw some conclusions.
XXI.i. Inspections at Impact-One: State of the Practice
In order to investigate the performance of the existing heuristics for inspection planning, we
accessed a large repository containing more than 2500 records, each representing an
inspection in terms of artifact, team size (TS), meeting length (ML), number of detected defects
(DD) and more attributes. Data were collected across all team centers over several years, but it
is important to remark that more recent inspections behave equivalently to older inspections in
terms of validity of the heuristics, which is the focus of our analysis.
Completeness of records represented an issue we had to cope with, as some fields were
sometimes left blank. We created a list of exclusion criteria to apply to the repository in order
to obtain usable data. In particular, a record should be filtered out when:
- the number of detected defects is unknown (it does not provide useful information)
- the number of participants is:
  o unknown (level of compliance, LC, unknown)
  o 1 (no inspection meeting, therefore no structured inspection)
- the meeting length is:
  o 0 (no inspection meeting, therefore no structured inspection)
  o unknown (LC unknown)
- the page rate is unknown (LC unknown)
- the product type is "other" (heuristics are provided on a by-artifact basis)
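The exclusion criteria above can be expressed as a simple record filter; the field names and sample records below are hypothetical stand-ins for the repository schema.

```python
def usable(rec):
    """Apply the exclusion criteria: return True if the record is kept."""
    if rec.get("detected_defects") is None:
        return False                    # no useful information
    if rec.get("participants") in (None, 1):
        return False                    # LC unknown, or no structured inspection
    if rec.get("meeting_length") in (None, 0):
        return False                    # LC unknown, or no inspection meeting
    if rec.get("page_rate") is None:
        return False                    # LC unknown
    if rec.get("artifact") == "other":
        return False                    # heuristics are per-artifact
    return True

records = [
    {"artifact": "code", "participants": 4, "meeting_length": 1.0,
     "page_rate": 18, "detected_defects": 7},
    {"artifact": "code", "participants": 1, "meeting_length": 1.0,
     "page_rate": 18, "detected_defects": 2},   # solo review: excluded
    {"artifact": "other", "participants": 3, "meeting_length": 0.5,
     "page_rate": 30, "detected_defects": 4},   # artifact "other": excluded
]
kept = [r for r in records if usable(r)]
```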
After filtering, the number of remaining records was 611, structured as follows: 15 on
requirements, 190 on design, 302 on code, and 36 on test artifacts. Additional statistically
relevant information is reported in Table 7.
Table 7 Descriptive statistics
                 Team size   Meeting length   Page rate   Detected defects
Mean             3.724       0.885            59.241      8.403
Median           3.000       0.500            32.867      4.000
Mode             3.000       0.250            24.000      1.000
Std. Deviation   1.485       0.750            127.621     10.046
Minimum          2.000       0.100            0.267       0.000
Maximum          8.000       4.000            2423.600    59.000

The analysis of such a large repository revealed that the higher the compliance with the heuristics
(see Table 6), the higher the number of defects detected during the inspection (see Table 8,
columns 1 and 2). From further analysis, it is also noticeable that only ~10% of teams apply the
inspections at the highest compliance with the given heuristics (Table 8, column 3) and that
almost two thirds of the inspections have been performed complying with one or zero
heuristics, which correlates with a lower number of detected defects.
Table 8 Behavior of teams with respect to heuristics
Number of heuristics the        Average number of    Sample size   Ratio of
inspection complies with        detected defects                   inspections
3                               19.965               64            10.474%
2                               12.481               145           23.732%
1                               6.076                266           43.535%
0                               3.149                136           22.259%

Figure 58 offers a graphical representation of the same information, where the size of the
spheres represents the number of inspections performed at the corresponding level of
compliance (i.e. the sample size).

Figure 58 Behavior of teams with respect to heuristics
These qualitative considerations are supported by our statistical analysis, which we now
synthesize.
Let us first define the metric "Level of Compliance".
Def: Given an inspection on a particular artifact, its Level of Compliance, LC, is the number of
inspection attributes (i.e. team size, meeting length and page rate) compliant with the
heuristics for the corresponding artifact type. The level of compliance for an inspection is n
when exactly n attributes are compliant with the heuristics; therefore, n ∈ {0, 1, 2, 3}. If some
parameters are unknown, then the level of compliance is undefined.
In the dataset we used for the analysis, we utilized the number of detected defects of those
records which are complete, i.e., for which no parameter is unknown.
The first step of the statistical analysis is to run the Jonckheere-Terpstra test (or Jonckheere's
trend test, non-parametric): given an ordered list of levels for the factor (LC in our case), the
Jonckheere-Terpstra test can detect the existence of a monotonically increasing trend in the
response variable as the factor increases. More specifically, the test rejects the null
hypothesis if at least one pair in this list of levels shows a statistically significant difference in
medians with respect to the next level. In our case, the null hypothesis of the Jonckheere-Terpstra test is
rejected with a p-value smaller than 0.001; this means that increasing LC leads to an increase in the
number of detected defects.
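The core of the Jonckheere-Terpstra test is a simple counting statistic, sketched below (the p-value computation against the null distribution is omitted). The per-LC defect counts are invented toy data, not the thesis dataset.

```python
def jonckheere_statistic(groups):
    """J-T statistic for groups ordered by increasing factor level.

    Counts, over every pair of groups (i < k), how often a value in the
    higher-level group exceeds one in the lower-level group (ties count 0.5).
    """
    j = 0.0
    for i in range(len(groups)):
        for k in range(i + 1, len(groups)):
            for x in groups[i]:
                for y in groups[k]:
                    if y > x:
                        j += 1.0
                    elif y == x:
                        j += 0.5
    return j

# Toy defect counts for LC = 0, 1, 2, 3 (invented for illustration).
groups = [[1, 2, 3], [3, 5, 6], [8, 10, 12], [15, 18, 22]]

J = jonckheere_statistic(groups)
n = sum(len(g) for g in groups)
# Under the null hypothesis (no trend), the expected value of J is:
null_mean = (n * n - sum(len(g) ** 2 for g in groups)) / 4.0
# J well above its null mean suggests an increasing trend with LC.
```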
Subsequently, we investigate each pair of consecutive LCs, in order to identify which pairs are
significantly different. Results of this set of 3 comparisons are reported in Table 9: samples are
all normally distributed and independent, thus we ran 3 unpaired t-tests.
Table 9 Statistical analysis results
#   Comparison      p-value   Significance
1   LC=0 vs LC=1    < 0.001   Y
2   LC=1 vs LC=2    < 0.001   Y
3   LC=2 vs LC=3    0.001     Y
All tests detect the existence of significant differences between pairs. Therefore, it is evident
that teams should be encouraged to increase their compliance with the heuristics, if they aim to
detect the highest number of defects. On the other hand, complying with the heuristics often
represents a cost (e.g. increasing the number of inspections or stretching the meeting length).
Also, in some cases, constraints may exist limiting the possibility of complying with all
heuristics.
How can inspection planners identify the tradeoff between cost and compliance? Since
compliance has been shown to correlate with performance, this project management question
assumes a particular relevance and, in this chapter, we try to address it by answering the
following research question: are the heuristics accurate enough to provide an adequate level
of support to inspection planners?
As a result of our preliminary examination of the state of the practice, the heuristics represent a
good guideline, but teams seem not to be enabled to exploit their entire potential; in fact, most
of the time they do not (or cannot) behave according to them. Consequently, in the rest of this
chapter, we will try to build a new and finer-grained decision support for inspection planners that
accounts for both inspection performance and cost. This way, we expect to provide inspection
planners with heuristics of adequate accuracy to maximize the number of detected defects
while respecting the context constraints.
XXI.ii. Decision Model: Statistical Analysis
XXI.ii.1. Metrics
In addition to the previously defined "Level of Compliance" (LC), we define the metric "Type of
Compliance" (TC).
Def: Given an inspection, its Type of Compliance, TC, is the set of inspection attributes
compliant with the heuristics.
Notice that, for the same LC, multiple types of compliance can exist. In particular: for LC=1, an
inspection can comply with the heuristic on TS, ML or PR (mutually exclusive); for LC=2, it can
comply with the heuristics on TS and ML, TS and PR, or ML and PR.
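The relation between LC and TC can be made concrete by enumerating the possible compliance sets over the three attributes; this is a direct transcription of the definitions above.

```python
from itertools import combinations

ATTRIBUTES = ("TS", "ML", "PR")  # team size, meeting length, page rate

# All possible Types of Compliance, grouped by Level of Compliance:
# LC=0 has one TC ({}), LC=1 has three, LC=2 has three, LC=3 has one.
tc_by_lc = {lc: [frozenset(c) for c in combinations(ATTRIBUTES, lc)]
            for lc in range(len(ATTRIBUTES) + 1)}

def level_of_compliance(tc):
    """LC is simply the cardinality of the TC set."""
    return len(tc)
```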
XXI.ii.2. Statistical Tests
In order to define a finer-grained decision model, which is the goal of our analysis, we want to
investigate the differences between the various types of compliance across the levels of
compliance.
As a first step, we compare consecutive levels of compliance for all TCs; p-values of the
statistical tests are reported in Table 10 (significance level is 97.5%). Depending on the
normality or non-normality of the samples, the tests used are, respectively, the unpaired t-test (U) or
the Mann-Whitney test (M).
Table 10 Comparisons between consecutive TCs
#    TC comparison              p-value   Significance   Test
4    {} vs {TS}                 0.001     Y              U
5    {} vs {ML}                 < 0.001   Y              U
6    {} vs {PR}                 0.003     Y              U
7    {TS} vs {TS, ML}           < 0.001   Y              U
8    {TS} vs {TS, PR}           0.732     N              U
9    {ML} vs {TS, ML}           0.243     N              M
10   {ML} vs {ML, PR}           0.883     N              U
11   {PR} vs {TS, PR}           0.393     N              M
12   {PR} vs {ML, PR}           0.203     N              U
13   {TS, ML} vs {TS, ML, PR}   0.013     Y              M
14   {TS, PR} vs {TS, ML, PR}   < 0.001   Y              M
15   {ML, PR} vs {TS, ML, PR}   0.021     Y              U

As a second step, we analyze the comparisons between non-consecutive levels of compliance,
i.e.: LC=0 vs LC=2 (all 3 TCs), LC=0 vs LC=3, and LC=1 (all TCs) vs LC=3. Results are reported in
Table 11. As for the first set of tests, the one-tailed Mann-Whitney or unpaired t-test is
used, depending on the normality of the samples.
Table 11 Comparisons between non-consecutive TCs
#    TC comparison          p-value   Significance                         Test
16   {} vs {TS, ML}         < 0.001   Y (not needed, see #4 and #7)        U
17   {} vs {TS, PR}         0.186     N                                    U
18   {} vs {ML, PR}         < 0.001   Y                                    M
19   {} vs {TS, ML, PR}     < 0.001   Y (not needed, see #4, #7 and #13)   U
20   {TS} vs {TS, ML, PR}   < 0.001   Y (not needed, see #7 and #13)       U
21   {ML} vs {TS, ML, PR}   0.005     Y                                    U
22   {PR} vs {TS, ML, PR}   < 0.001   Y                                    U

It is interesting to notice that increasing the level of compliance from 0 to 1 always leads to a
performance improvement (comparisons #4, #5 and #6), as does increasing it from 2 to 3
(comparisons #13, #14 and #15). However, this is not true when increasing LC from 1 to 2,
because a statistically significant difference exists only for the comparison TC={TS} vs. {TS, ML}.
It is also interesting to notice some heterogeneity in the results: in fact, increasing LC does not
systematically lead to an increase in effectiveness, because the existence of such an increase
depends on the specific types of compliance. In other words, even though the results show that, on
average, more compliance leads to higher effectiveness, a more detailed analysis
demonstrates that the level of compliance per se is not the best decision driver one can adopt
in the context under study; rather, a case-by-case decision could be more appropriate.
As a third step, we analyze the comparisons between different TCs within the same LC.
Table 12 Comparisons within the same LC
#    TC comparison          p-value   Significance   Test
21   {TS} vs {ML}           < 0.001   Y              U
22   {TS} vs {PR}           0.718     N              U
23   {ML} vs {PR}           0.004     Y              U
24   {TS, ML} vs {TS, PR}   < 0.001   Y              U
25   {TS, ML} vs {ML, PR}   0.105     N              M
26   {TS, PR} vs {ML, PR}   < 0.001   Y              M

The third and last set of comparisons reveals that, within the same LC, the selected TC does make a
difference, i.e. different TCs lead to different expected performance.
XXI.iii. Decision Model: Interpretation of Results
In order to provide an interpretation of the results in Table 10 and Table 11, we propose the
decision model depicted in Figure 59; a state represents a TC and an edge represents a
statistically significant difference between source and destination. The direction of the edge
indicates the direction of the difference, with the arrow pointing at the higher performance.

Figure B7 State-ased decision model

An inspection planner can easily leverage the proposed model during inspection planning. In
fact, the usual decision methodology can be applied to set the team size, meeting length and
page rate of the inspection to undertake. Afterward, based on the values proposed for the
attributes and according to the heuristics, the initial state in the decision model can be
identified. For example, given an inspection of a requirements document of 30 pages, for which
the inspection planner proposes to allocate 3 people and 2 hours, the initial state, according to
Table 6, would be TC={ML, PR} (see Figure 60 and Figure 61).
Once the initial state has been identified, the results of our analysis can be immediately
exploited by exploring all outgoing edges of the initial state: if there are no outgoing edges,
then the current inspection attribute configuration maximizes the expected performance. If
there are outgoing edges, then any of them can be selected (e.g. see Figure 61); selecting an
edge corresponds, in practice, to modifying the inspection configuration by changing one or
more attribute values that are not compliant with the heuristics. Changing such attribute
values to be compliant with the corresponding heuristic takes the inspection configuration to a
new state that, on average, corresponds to significantly better performance. From this new
state, the same reasoning can be applied to consider further configuration improvements, as
long as the decision maker is willing to invest more resources in order to increase the
likelihood of achieving higher performance.
Returning to the previous example, there is one outgoing edge the inspection planner can
consider; this edge goes from TC={ML, PR} to TC={TS, ML, PR}: in practical terms, complying
with the heuristic on TS by including at least two more inspectors (from 4 to 6) leads to a
higher expected number of detected defects for the given inspection.

Figure 60: Red-circled state is the identified initial state
Figure 61: Red-circled edge identifies a path for increasing the inspection performance
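The edge-following procedure just described can be sketched as a small graph walk. The states and edges below are a reduced, illustrative subset, not the full model of Figure 59.

```python
from typing import Dict, FrozenSet, Set

State = FrozenSet[str]

def s(*attrs: str) -> State:
    """Build a state from the heuristics it complies with."""
    return frozenset(attrs)

# edges[src] holds the destinations whose expected performance is
# significantly higher than src's (the arrow direction in the model).
edges: Dict[State, Set[State]] = {
    s("ML", "PR"): {s("TS", "ML", "PR")},  # comply with TS too
    s("TS"): {s("TS", "ML")},              # comply with ML too
    s("TS", "ML", "PR"): set(),            # no outgoing edge: nothing better
}

def improvement_options(state: State) -> Set[State]:
    """Configurations reachable from `state` with significantly better
    expected inspection performance; empty if `state` is locally optimal."""
    return edges.get(state, set())

initial = s("ML", "PR")  # e.g. the 30-page, 3-person, 2-hour inspection
for target in improvement_options(initial):
    print("improve by complying with:", ", ".join(sorted(target - initial)))
```

Following an edge simply swaps the current compliance set for a better one; repeating the lookup from the new state reproduces the iterative improvement described above.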

Our model can also be useful when constraints exist on the resources available to the
inspection planner. For example, given a constraint on the values of the attributes, if the
admissible values are all non-compliant with the heuristics given for the artifact selected for
inspection, then all states including compliance with that attribute should not be reached;
in other words, the original decision model can be immediately tailored to the context and
reduced to fewer states and edges. Complementarily, in case the constraint forces the
attribute value to be compliant with the heuristic, all states not requiring such compliance
should be excluded from the tailored model.
For example, if the number of inspectors available for the previous example is 5, then the
state-based model to be used is the one in Figure 62. In this case, the configuration proposed
by the inspection planner is the one that maximizes the expected number of detected defects.
XXI.ix. Threats
According to the classification of validity threats in [98], a conclusion validity threat we need to
account for is the random heterogeneity of subjects. In fact, inspections were performed in
different centers, thus different people with potentially different skills and capabilities may
have produced the inspection data we investigated. However, based on the results reported in
[97] regarding an analysis of the same dataset, we have found no systematic bias according to
the center.
Internal validity of the experimental analysis is threatened by the fact that many variables can
influence the inspection performance, but we investigated only some of them. As already
mentioned in section 1, many previous investigations identified team size, meeting length, page
rate and artifact type as influential factors of an inspection, i.e. variables that strongly correlate
with the inspection performance. Additional variables can also be relevant, e.g. the language of
the artifact, the expertise of the inspectors, the quality of the artifact, the domain of the system
under analysis and so on. When the number of variables increases, the complexity of the study
becomes problematic: even in the presence of large datasets, such as the one available to us and
used for the present study, an experimental investigation accounting for all potentially
influential variables would become unfeasible and statistically (and practically) unreliable.
Therefore, the way out we opted for was to pick the most influential variables and focus the
investigation on them.
Figure 62: Reduced model in case of a constraint on the number of available inspectors

Concerning external validity, it is obvious that narrowing the domain limits the exportability
of the results to other contexts. However, it is remarkable that, even though the results are not
applicable to contexts different from Impact-One, the applied methodology is totally reusable:
replacing the dataset enables teams in diverse contexts to build a state-based model similar
to the one represented in Figure 59 to support inspection planning. In particular, such a new
model will be based on different heuristics (according to context-specific guidelines and
procedures) and will have different edges as a result of the statistical analysis; nevertheless, its
creation and use will not change.
Chapter XXII. Overall Experimentation Threats
In the previous chapters, we reported on several empirical investigations we performed to
evaluate the impact vector model. As we stated, our abstract and synthetic company is a way
of putting together our investigations and creating a leitmotiv for showing the potential of our
model. Nevertheless, even though we reported validity threats on a per-experiment basis,
there are some more threats we need to report concerning our general approach to validation.
First of all, we did not actually have the chance to validate the impact vector model in its
entirety in a single real company. This may trigger some doubts about the applicability of the
model in real contexts. However, as we have shown, all parts of the model seem sustainable and
feasible in practice when used in isolation; furthermore, we made plans for the extensive
application of our model over the next few years in some of those contexts, and most of these
plans were positively received by our partners. Therefore, we expect to be able to report on
them to further mitigate the doubts about an encompassing and pervasive application of the
impact vector model.
A second threat applies in general to our experimentation: each part of the model has been
validated in at most two different situations. Actually, only leveraging the impact vectors was
validated twice (once for testing improvement and once for inspection tuning), while the
remaining parts of our framework were the objects of single investigations. Obviously, this
represents a limitation of our experimentation. However, such types of empirical studies are
expensive and risky for the organizations running them; therefore, they require a lot of
preparation and significant investments. We struggled and worked hard to be able to
experiment with every part of the model, but replication has rarely been an option for us.
A third threat, which correlates with the second, concerns the domains of application: different
parts of our model were validated in different domains. Even though, on one side, this
represents a pro, as it is an indicator of the flexibility of the model, on the other side it
represents a threat, because we cannot be sure that every part of the model is feasible and
reasonable in contexts other than the one where that part was validated. However, in reporting
the empirical investigations, we did not need to precisely describe every single real domain
and context, but we were able to contextualize the experimentation within our abstract
company. This is a good indicator of the fact that the model, though flexible and tailorable,
does not need to be modified in its foundation to accommodate context-specific needs;
consequently, the same approach, the same process, the same tools and the same concepts
can be totally reused across different domains and contexts.
Summarizing, we recognize that the validation of our impact vector model is still at its
beginning, but we can also affirm that we have earned the confidence of some of our partners
and we will be able to further investigate the potential of our approach in the field.





Section VI: Final Thoughts and Future Work


Chapter XXIII. Final Thoughts
In this work, we have defined, explored and investigated an approach to managing software
development. This approach is named the Impact Vector Model and it is part of the Impact
Vector Framework, whose definition is also included in this Ph.D. thesis. Our approach is
evidence-based, value-based and goal-driven. Our target audience consists of decision makers
at all levels, as the approach enables dealing with both business and technical aspects of
software development, and it actually tries to integrate these two perspectives to make
decisions both technically and economically sound and convenient. The goal of our model is to
support decision makers during their management activities, e.g. planning, monitoring,
measuring, predicting, correcting expectations, acquiring or freeing resources, prioritizing,
selecting, evaluating, assessing and so on. Besides making the model practically usable, we
outlined a theoretical foundation for it, necessary to give a rigorous and clear structure to the
experimental data the model deals with.
In conformance with an engineered, scientific, mid-term work, we went through many
iterations of definition-experimentation-refinement, with the collaboration of industry and
academia, to come up with well-founded and concretely useful artifacts and knowledge. The
results encourage us to proceed on our path and to expand the boundaries of our work to
include more domains, more processes, more partners, more data and more models. In the last
few pages of this work, we will sketch out some future and current work that has to be
performed in order to transform the Impact Vector Model into a reliable approach to software
development measurement and management.
Chapter XXIV. Future Work
XXIV.i.1. Consolidating the Impact Vector Framework
As shown in Section IV, we were able to propose a tool chain to support the implementation of
Impact Vectors in practice. However, the tools of the chain are somewhat disconnected, and
only a wise user willing to leverage the impact vector model can actually find a path through
so many different tools. Therefore, one of the first goals we plan for the near future is to
consolidate the tool chain by providing a set of bridges and connectors to facilitate the
collaboration among those tools. Concretely, this translates into some technical refinements,
e.g. the migration of IV-Schema from MySQL to DB2 in order to directly support its integration
with Cognos Insight, but it also requires some more substantial interventions. Probably the
hardest and most expensive task will be the extension of Rational Method Composer to support
the use of Impact Vectors in the IBM vision of software development process management.
Since that tool is commercial and continuously evolving, the realization of such an extension
very likely requires the contribution of the vendor itself; with this aim, we have already begun
to explore the feasibility of this task and have received some positive feedback.
XXIV.i.2. Impact Vector Model and Empirical Software Engineering 2.0
For many years, empirical software engineering, by leveraging a number of different analysis
and synthesis techniques, has been supporting practitioners and researchers in assessing,
verifying, qualifying, and improving software processes, products and technologies.
Aggregating data from many experiments, performed in different contexts, with different
methods and objects, and by different people, has always been a challenge. Examples are
CeBase (no longer ongoing) and Promise, repositories of Empirical Software Engineering data;
the goal of such repositories was to enable global sharing of data to explore and analyze, in
order to monitor, plan and predict SE practices. Even though the importance of empirical data
is widely recognized, many times they are not shared and sometimes not even exploited; there
might be many reasons why such global approaches may not work, e.g. the sensitivity of
industrial data or the lack of incentives to stimulate data sharing. However, maybe pushed by
some romantic vision, we believe that sharing data is a moral duty of companies operating
world-wide and advancing the state of the practice with their everyday work; also, researchers,
in the same spirit, should be natively encouraged to share their results and provide more and
deeper insights into the research branches they investigate.
Based on this, and on the path lighted by Tim Menzies' Empirical Software Engineering 2.0, our
contribution is to provide a flexible, extensible, tailorable, formally well-founded structure for
the data to be shared among researchers and practitioners. As depicted in Section IV, we are
building a platform to encourage and guide researchers and practitioners to contribute to
advancing Empirical Software Engineering by using the Impact Vector Model to organize and
share data in a reusable and valuable way. In the near future, we plan to deploy IV-Schema,
the support tool we are building, and enable people to contribute to advancing the state of
Software Engineering.
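As a purely hypothetical illustration of what a shareable record might look like (the field names below are invented for this sketch and are not the actual IV-Schema), an impact vector could be serialized along these lines:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ImpactVector:
    """Hypothetical shareable record; field names are illustrative only."""
    activity: str      # the process activity being characterized
    context: dict      # domain, artifact and team metadata
    attributes: dict   # controllable process factors
    impacts: dict      # measured outcomes

vector = ImpactVector(
    activity="requirements inspection",
    context={"domain": "aerospace", "artifact_pages": 30},
    attributes={"team_size": 3, "meeting_hours": 2},
    impacts={"defects_detected": 14},
)
print(json.dumps(asdict(vector), sort_keys=True))
```

Separating context, controllable attributes and measured impacts is what would let records from different organizations be aggregated without forcing a single fixed set of variables on every contributor.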
XXIV.i.3. Business Analytics and the Impact Vector Framework
While gaining confidence with IBM Cognos Insight and its capabilities, we found that in the
area of Business Analytics and, more specifically, in the branch of Business Intelligence, many
people are elaborating and working on a huge amount of economic data. Their tasks often
require tuning and optimization of business processes as the result of quantitative and
qualitative analyses of the available data. Often, their goal ends up being the maximization of
profit, but inherently it consists of a number of sub-goals that can be brought back to
something similar to the ones faced by Software Engineers during process planning and
software development process performance enhancement. We believe that a careful
investigation of similarities and differences between the two areas (business analytics and
software development process enhancement) might give birth to interesting synergies, as well
as an exchange of opportunities, ideas and methods in both directions. What we also consider
an indicator of this potential is the field of Business Process Modeling (BPM), which is currently
receiving attention from the OMG (as an example, see the standard on BPMN). BPM aims at
modeling business processes not strictly related to software. However, looking at BPM
representations (e.g. BPMN) and the process through which BPM is applied, it is evident to any
decent Software Engineer that there are similarities, beyond an obvious difference in focus
between BPM and software development processes (trivially, the former focuses on business
management, the latter on software development). Just to mention one, in both cases some
requirements (business or software) are identified and defined, then graphically represented
(e.g. in BPMN or UML), and then transformed into something executable; one more interesting
aspect is the existence of UML profiles for BPMN, as well as the interest of the same
computer-related entity, i.e. the OMG, in both themes. Such clues are valuable to us and
encourage us to explore also the BPM field, in order to detect potential synergies that may lead
to mutual benefits.
XXIV.i.4. Further Contexts of Application of the Impact Vector Framework
In the present work, we mostly reported on verification tasks. This was not entirely a choice;
rather, we were driven to it by our industrial partners, their needs and our current personal
fields of interest. Nevertheless, we also sketched out something concerning tasks other than
validation-related ones (see Section IV). In the near future, we plan to expand the field of
application and, in particular, of experimentation of Impact Vectors to include many more
activities. Still within the group of software verification and validation activities, assurance
cases and the broad sub-branch of more formal methods represent a good challenge and an
interesting path to explore. Outside verification and validation, instead, risk management and
requirements management activities are our next targets: which attributes should be used to
accurately characterize the sub-process of requirements management? Which ones for risk
management and mitigation?
Also, do we need to refine the Impact Vector Model to accommodate the needs of peculiar
processes, e.g. the development of unique pieces of software, like flight software? How can we
leverage the Impact Vector Model, if at all feasible, to support the work of teams with specific
tasks or using particular processes, e.g. Quality Assurance people or Agile teams?
These are open questions that will probably and hopefully give us much to work on for the
next few years.



References
References in Alphabetical Order
X>;Y "delnai# C!# -antone# ,!# -iolkowski# M!# 5omach# '!D! -omparing -ode 5eading
&echniGues "pplied to 2)ect-2riented Software Frameworks with 5egard to Effectiveness and
Defect Detection 5ate! $n Proceedings of $SESE 9::A# 9>7-9A4# 9::A!
X;BY "ishman B!# 'ormone 5efractory Prostate -ancer - -omined 'ormonal
"lationJ-hemotherapy for 'igh 5isk Prostate -ancer!
http:JJwww!hrpca!orgJchemowaearly!htm# 9::9!
X<BY "ndrews Z! '!# Briand 6! -!# 6aiche (!! $s mutation an appropriate tool for testing
e*perimentsL Proceedings of the 9<th international conference on Software engineering# 9::B!
XB7Y Bansal P!# Saharwal S!# Sidhu P!# "n investigation of strategies for finding test order during
$ntegration testing of o)ect 2riented applications! $n Proceeding of $nternational -onference
on Methods and Models in -omputer Science# 9::7!
X<7Y Basili 0!5!# @eiss D!: " methodology for collecting valid software engineering data! $EEE
&ransactions on Software Engineering 3:.;/# 374A!
X3;Y Basili# 0!5!# Boehm# B!# -eB"SE Software Defect 5eduction &op 3:-6ist! $n Software
Engineering# y Sely# 5!@# 9::<!
X<Y Basili# 0!5!# -aldiera ,!# -antone ,!# " 5eference "rchitecture for the -omponent Factory#
"-M &ransactions on Software Engineering and Methodology .&2SEM/# vol! 3.3/: B>-4:# 3779!
X4Y Basili# 0!5!# -aldiera ,!# 5omach '!D!# &he E*perience Factory# Encyclopedia of Software
Engineering# pp! A;7-A<;# Zohn @iley \ Sons# $nc!# 377A!
X9BY Basili# 0!5!# ,reen# S!# 6aitenerger# 2!# 6anuile# F!# Shull# F!# Serumgfrd# S!# CelkowitE#
M!0!# &he empirical investigation of Perspective-Based 5eading! $n Empirical Software
Engineering! 3# 9# 3>>-3;A# 377;!
X>4Y Basili# 0!5!# Sely# 5!@!# -omparing the Effectiveness of Software &esting Strategies! $n $EEE
&ransactions on Software Engineering!3>#39#39<4-397;# 374<!
X;Y Basili# 0!5!# &he E*perience Factory and its 5elationship to 2ther $mprovement Paradigms# in
6ecture %otes in -omputer Science <3<# Proceedings of ESE-_7># Springer-0erlag# 377>!
'2, | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

X7;Y Basili# 0!5!: =uantitative evaluation of software engineering methodology! Proc! First Pan
Pacigc -omputer -onf!# Melourne# "ustralia# 3:-3> Septemer# 374B!
XA3Y BeiEer# B!# Black-Bo* &esting: &echniGues for Functional &esting of Software and Systems#
377B!
X97Y Biffl# S!# 'alling# M!# Software product improvement with inspection! " large-scale
e*periment on the influence of inspection processes on defect detection in software
reGuirements documents! $n Proceedings of the 9;th Euromicro -onference# 9:::!
X7BY Boehm B!@!: 0alue-ased software engineering: 2verview and agenda! &ech report# 1S--
-SE-9::B-B:A# 1niversity of Southern -alifornia# Park -ampus! 6os "ngeles# -"# 1S"# 9::B!
X93Y Briand# 6!-!# 1sing Simulation to Empirically $nvestigate &est -overage -riteria Based on
Statechart! $n Proceedings of the $nternational -onference on Software Engineering 9::A!
X3<Y Bush M!: $mproving software Guality - &he use of formal inspections at the Zet Propulsion
6aoratory! %"S" &53773::>B3B7 377:! http:JJntrs!nasa!govJsearch!)spL5M3773::>B3B7!
XAY -antone ,!# H" Measure-driven Process and "rchitecture for the Empirical $nvestigation of
Software &echnologyI# Zournal of Software Maintenance# 9:::!
X>Y -antone ,!# DonEelli P! H,oal-oriented Software Measurement ModelsI# in HPro)ect -ontrol
for Software =ualityI# Editors# 5! Dusters# "! -owderoy# F! 'eemstra# and E! van 0eenendaal#
Shaker Pulishing# 3777!
X7AY -antone ,!# DonEelli P!: Production and Maintenance of Software Measurement Models!
Zournal of Software Engineering and Dnowledge Engineering# 0ol! B# 3774!
X79Y -antone ,!# DonEelli P!: Software Measurements: from -oncepts to Production# &!5! $ntl!
Software Engineering research %etwork# $SE5% &!5! 7<-9<# 377<!
X><Y -antone# ,!# "dulnai# C!"!# 6omartire# "!# -alavaro# ,! Effectiveness of code reading and
functional testing with event-driven o)ect-oriented software! $n 6ecture %otes in -omputer
Science! 9<;B# 3;;-379# 9::>!
X74Y -hillarege# 5!# Bhandari# $!S!# -haar# Z!D!# 'alliday# M!Z!# Moeus# D!S!# 5ay# B!D!# @ong# M!-
(!# 2rthogonal Defect -lassification: " concept for in-Process Measurements! $n $EEE
&ransactions on Software Engineering! 34# 33# 7A>-7B;# 3779!
X7Y -MM$: http:JJwww!sei!cmu!eduJcmmiJinde*!cfm
X;;Y -raighead @!E!# -raighead 6!@!# Manual-Based &reatments: Suggestions for $mproving
&heir -linical 1tility and "cceptaility! -linical Psychology: Science and Practice! B# ># A:>KA:<#
3774!
'2: | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

X73Y Dingsor "!: @riting a 'yrid Moile "pllication with Phone,ap and the Do)o &oolkit# $BM!
http:JJpulic!dhe!im!comJsoftwareJdwJwe9moileJ:<:<9:33hdingsorJhellohyrid!pdf#
9:33!
X<;Y Eclipse 'elios: http:JJwww!eclipse!orgJheliosJ
XA<Y ElerEhager F!# Minch Z!# 5omach '! D!# Freimut B!,!# 2ptimiEing cost and Guality y
integrating inspection and test processes! $nternational -onference on Software and System
Process# 9:33!
X;9Y ElerEhager F!# Minch Z!# %goc %ha 0!# " systematic mapping study on the comination of
static and dynamic Guality assurance techniGues! $nformation \ Software &echnology! BA.3/# 3-
3B# 9:33!
X;>Y ElerEhager F!# 5osach "!# Minch Z!# Eschach 5# $nspection and &est Process $ntegration
Based on E*plicit &est PrioritiEation Strategies! $nternational Software =uality Days -onference!
343-379# 9:39!
X43Y Fling B!: Moile Design and Development: Practical -oncepts and &echniGues for -reating
Moile Sites and @e "pps# 2_5eilly Media# pp! 3>-9<# $SB% 7<4-:-B7;-3BBAA-B# 9::7!
X;4Y ,aard ,! 2! &reatment of Posttraumatic disorder! "merican Psychiatric Pulishing $nc#
9::<!
X4;Y ,oogle $J2: '&M6B versus "ndroid: "pps or @e for Moile DevelopmentL
http:JJwww!google!comJeventsJioJ9:33JsessionsJhtmlB-versus-android-apps-or-we-for-
moile-development!html
X;AY ,upta "!# Zalote P!# &est $nspected 1nit or $nspect 1nit &ested -odeL# $nternational
Symposium on Empirical Software Engineering and Measurement# pp!B3-;:# 9::<!
XA9Y 'alling# M!# Biffl# S!# ,runacher# P!# "n e*periment family to investigate the defect
detection effect of tool-support for reGuirements inspection! $n Proceedings of the %inth
$nternational Software Metrics Symposium# 9::>!
X37Y 'e 6!# Shull F!: " 5eference Model for Software and System $nspections! %"S"
&5::7::A9<74 9::7: http:JJntrs!nasa!govJsearch!)spL5M9::7::A9<74!
X>7Y 'etEel# @!-!# "n e*perimental analysis of program verification methods! $n 1niversity of
South -arolina! Doctoral Dissertation# 37<;!
XA9Y 'ovemeyer# D!# Pugh#@! Finding concurrency ugs in Zava! $n Proceedings of the 9>rd
"nnual "-M S$,"-&-S$,2PS Symposium on Principles of Distriuted -omputing# 9::A!
X39Y $BM -ognos: http:JJwww-:3!im!comJsoftwareJanalyticsJcognosJ
'9( | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

X3>Y $BM SE 'armony: http:JJwww-:3!im!comJsoftwareJrationalJservicesJharmonyJ
X3BY $EEE Standard for Software 1nit &esting# 377;
X3AY $EEE# $EEE Standard ,lossary of Software# 377:!
X3:Y $S2J$E- 3BB:A-B:9:39 $nformation technology -- Process assessment -- Part B: "n e*emplar
software life cycle process assessment model
http:JJwww!iso!orgJisoJhomeJstoreJcataloguehtcJcataloguehdetail!htmLcsnumerM;:BBB
X>9Y Zalote P!# 'aragopal M!# 2vercoming the nah syndrome for inspection deployment! 9:th
$nternational -onference on Software Engineering $-SE# pages ><3K><4# 3774!
X<<Y Zava EE ; SDD http:JJwww!oracle!comJtechnetworkJ)ava J)avaeeJoverviewJinde*!html!
XB3Y Zuristo# %!# 0egas# S! 9::>! Functional &esting# Structural &esting# and -ode 5eading: @hat
Fault &ype Do &hey Each DetectL $n 6ecture %otes in -omputer Science! 9<;B# 9:4-9>9# 9::>!
X>>Y Damsties E!# 6ott -! M!# "n empirical evaluation of three defect-detection techniGues! Fifth
European Software Engineering -onference# 377B!
X<3Y Dan S! '!# Metrics and Models in Software =uality Engineering! "ddison-@elsey# 9::9!
X7>Y Diry 5oion -!# Sieenmann 6aurence -!: Foundational Essays on &opological Manifolds!
Smoothings# and &riangulations! Princeton 1niversity Press# 37<<!
X;:Y Ditchenham B!# 6inkman S! 0alidation# verification# and testing: diversity rules! $n $EEE
Software# 3B# A# A;-A7# 3774!
XB4Y 6aitenerger 2!# Freimut B!# Schlich M!# &he comination of inspection and testing
technologies! $n Fraunhofer $ESE! &echnical 5eport :<:!:9JE# 9::9!
XABY 6aitenerger 2!# Freimut B!# Schlich M!# &he comination of inspection and testing
technologies! &!5! :<:!:9JE# 9::9!
XB<Y 6aitenerger 2!# Studying the effects of code inspection and structural testing on software
Guality! $n Proceedings of the %inth $nternational Symposium on Software 5eliaility
Engineeringm 3774!
X4<Y 6ennon Z!: ,et started with Do)o Moile 3!; - "nd get a peek at new features coming in 3!<#
$BM developer@orks! http:JJwww!im!comJdeveloperworksJweJliraryJwa-
do)omoileJinde*!html
X;<Y 6i S!# @aters 5! %ucleotide level detection of cycloutane pyrimidine dimers using
oligonucleotides and magnetic eads to facilitate laelling of D%" fragments incised# at the
dimers and chemical seGuencing reference ladders! -arcinogenesis! 3<# 4# 3BA7-3BB9# 377;!
'9' | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

XA>Y 6ian (u# Zun Chou# (ue (i# Zianchu Fan# =ian*iang @ang! " 'yrid "pproach to Detecting
Security Defects in Programs! $n $nternational -onference on =uality Software# 9::7!
XAAY 6ong# B!# Strooper# P!# 'offman# D!# &ool support for &esting -oncurrent Zava -omponents!
$n -oncurrency and -omputation#Practice and E*perience# 9::;!
X47Y 6ukasavage &!: 5eview: "ppcelerator vs! Phone,ap vs! "doe "ir!
http:JJsavagelook!comJlogJportfolioJappcelerator-vs-phonegap-vs-adoe-air!
X9<Y Masi E!# -antone ,!# Mastrofini M!# -alavaro ,!# Suiaco P!# Moile "pps Development: "
Framework for &echnology Decision Making! Ath $nternational -onference on Moile
-omputing# "pplications and Services# 9:39!
X9;Y Mastrofini M!# -antone ,!# Shull F!# Diep M!# Seaman -!B!# Falessi D!# Enhancing the System
Development Process Performance: a 0alue-Based "pproach! $%-2SE $nternational Symposium
9:39!
X34Y Mastrofini# M!# Diep# M!# Shull# F!# Seaman# -!# ,odfrey# S!# "ssessing the =uality of a
Systems $nspection Process! $n Proceedings of the 3Ath %D$" "nnual Systems Engineering
-onference# 9:3:!
X<>Y Mc-onnell S!# ,auging software readiness with defect tracking! $n $EEE Software! 3A# >#
3>B-3>;# 377<!
X3Y MenEies# &!# Benson M!# -ostello D!# Moats -!# %orthey M!# and 5ichardson Z!! 6earning
Better $0 \ 0 Practices! $nnovations in Systems and Software Engineering A.9/: 3;7K34># 9::4!
X;7Y Miller '!5!# SeGuencing and prior information in linear programmed instruction!
Educational &echnology 5esearch and Development! 3<# 3# ;>-<;# 37B>!
XBY Morisio M!# Stamelos $!# &soukijs "!# Software Product "nd Process "ssessment &hrough
Profile-Based Evaluation# $nternational Zournal of Software Engineering and Dnowledge
Engineering# 9::>!
X9Y Morisio M!# &soukijs "!# $us@are: a methodology for the evaluation and selection of
software products# $EE Proceedings Software Engineering# 3;9-3<A# 377<!
XB>Y Musuvathi# M!# Engler# D!# Some 6essons from 1sing Static "nalysis and Software Model
-hecking for Bug Finding! $n Electronic %otes in &heoretical -omputer Science! 47# ># 9::>!
XA:Y Myers# ,!Z!# " controlled e*periment in program testing and code walkthroughsJinspection!
$n -ommunications of the "-M# 37<4!
X94Y %"S"-S&D 99:9-7># 377>! http:JJwww!hG!nasa!govJofficeJcodeGJdoctreeJ99:97>!htm!
'92 | S o f t wa r e D e v e l o p me n t P r o c e s s E n h a n c e me n t : a F r a me wo r k f o r
Me a s u r e me n t - B a s e d D e c i s i o n - Ma k i n g

[74] Neufelder A. M., Current Defect Density Statistics. www.softrel.com/Current defect density statistics.pdf, 2007.
[70] Oman P., Pfleeger S. L., Applying software metrics. Wiley-IEEE Computer Society Press, 1997.
[30] Osterweil, L., Strategic directions in software quality. In ACM Computing Surveys. 28, 4, 1996.
[50] Owen, D., Desovski, D., Cukic, B., Effectively Combining Software Verification Strategies: Understanding Different Assumptions. In International Symposium on Software Reliability Engineering, 2006.
[61] Porter A.A., Votta L.G., Basili V.R., Comparing detection methods for software requirements inspections: a replicated experiment. In IEEE Transactions on Software Engineering. 21, 6, 563-575, 1995.
[85] Rowberg J.: Comparison: App Inventor, DroidDraw, Rhomobile, PhoneGap, Appcelerator, WebView, and AML. http://www.amlcode.com/2010/07/16/comparison-appinventor-rhomobile-phonegap-appcelerator-webview-and-aml
[72] Shah S.M.A., Morisio M., Torchiano M., An Overview of Software Defect Density: A Scoping Study. APSEC 2012: 406-415, 2012.
[24] Shell E., Shull F.: GSFC FSB Application of Perspective-Based Inspections. NASA TR20060022176, 2004: http://ntrs.nasa.gov/search.jsp?R=20060022176.
[20] Shull F., Seaman C. B., Feldmann R. L., Haingaertner R., Regardie M.: Interim Report on Heuristics about Inspection Parameters: Updates to Heuristics Resulting from Refinement on Projects, 2008.
[97] Shull, F., Seaman C.B., Diep M., Hughes A., Analyzing Inspection Data for Heuristic Effectiveness. In Proceedings of Empirical Software Engineering and Measurement, 2012.
[31] Shull, F., Basili, V.R., Boehm, B., Brown, A.W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., Zelkowitz, M., What we have learned about fighting defects. In Proceedings of the Eighth International Symposium on Software Metrics, 2002.
[23] Sprugnoli R., An introduction to mathematical methods in combinatorics. http://www.dsi.unifi.it/~resp/Handbook.pdf, 2006.
[22] Strooper, P., Wojcicki, M.A., Selecting V&V Technology Combinations: How to Pick a Winner? In Proceedings of the 12th IEEE International Conference on Engineering Complex Computer Systems, 2007.
[11] Tennant G., SPC and TQM in Manufacturing and Services - Quality Returns to America. Gower Publishing, Ltd, 2001.
[34] Thelin T., Andersson C., Runeson P., Dzamashvili-Fogelström N., A replicated experiment of usage-based and checklist-based reading. 10th International Symposium on Software Metrics, 2004.
[35] Thelin T., Runeson P., Wohlin C., An experimental comparison of usage-based and checklist-based reading. IEEE Trans. Softw. Eng., 29(8):687-704, 2003.
[90] Vilches A.: PhoneGap vs. Flex vs. Appcelerator vs. Corona: nuevas conclusiones. http://www.xydo.com/toolbar/27165658-phonegap_vs_flex_vs_appcelerator_vs_corona_nuevas_conclusiones
[83] VisionMobile: Developer Economics 2010 and Beyond. http://www.visionmobile.com/blog/2010/07/developer-economics-2010-the-role-of-networks-in-a-developer-world/
[84] Wasserman A.I.: Software engineering issues for mobile application development. ACM SIGSOFT FoSER 2010 (2010). http://www.cmu.edu/silicon-valley/wmse/wasserman-foser2010.pdf
[46] Weller E. F., Lessons from three years of inspection data. IEEE Software, 10(5):38-45, 1993.
[80] Wohlin C., Runeson P., Höst M., Ohlsson M.C., Regnell B., Wesslén A.: Experimentation in Software Engineering. An Introduction. Kluwer Academic Publishers, 2000.
[49] Wojcicki M.A., Strooper P., An Iterative Empirical Strategy for the Systematic Selection of a Combination of Verification and Validation Technologies. In Fifth International Workshop on Software Quality, 2007.
[54] Wojcicki M.A., Strooper P., Maximising the Information Gained from Experimental Analysis of Code Inspection and Static Analysis for Concurrent Java Components. In Fifth ACM-IEEE International Symposium on Empirical Software Engineering, 2006.
[55] Wood M., Roper M., Brooks A., Miller J., Comparing and combining software defect detection techniques: A replicated study. In Proceedings of the Sixth European Software Engineering Conference, 1997.
[82] WorkLight Webinar Series: Native Web or Hybrid Mobile App Development. http://www.worklight.com/assets/files/Native-Web-Hybrid-Mobile-App-Dev-Webinar.pdf
[88] Yao M.: What are the pros and cons of using PhoneGap to build native apps? http://www.quora.com/What-are-the-pros-and-cons-of-using-PhoneGap-to-build-native-apps.
[48] Yoo S., Harman M., Regression testing minimization, selection and prioritization: a survey. In Software Testing, Verification and Reliability, 22, 67-120, 2012.
[56] Yuting Chen, Shaoying Liu, Wong, W.E., A Method Combining Review and Testing for Verifying Software Systems. In International Conference on BioMedical Engineering and Informatics, 2008.
[78] Zelkowitz M.V., Wallace D.R.: Experimental Models for Validating Technology, IEEE Computer 31, 5, 1998.
References in Appearance Order
[1] Menzies, T., Benson M., Costello K., Moats C., Northey M., and Richardson J., Learning Better IV & V Practices. Innovations in Systems and Software Engineering 4(2): 169–183, 2008.
[2] Morisio M., Tsoukiàs A., IusWare: a methodology for the evaluation and selection of software products, IEE Proceedings Software Engineering, 162-174, 1997.
[3] Cantone G., Donzelli P., "Goal-oriented Software Measurement Models", in "Project Control for Software Quality", Editors, R. Kusters, A. Cowderoy, F. Heemstra, and E. van Veenendaal, Shaker Publishing, 1999.
[4] Cantone G., "A Measure-driven Process and Architecture for the Empirical Investigation of Software Technology", Journal of Software Maintenance, 2000.
[5] Morisio M., Stamelos I., Tsoukiàs A., Software Product And Process Assessment Through Profile-Based Evaluation, International Journal of Software Engineering and Knowledge Engineering, 2003.
[6] Basili, V.R., The Experience Factory and its Relationship to Other Improvement Paradigms, in Lecture Notes in Computer Science 717, Proceedings of ESEC'93, Springer-Verlag, 1993.
[7] Basili, V.R., Caldiera G., Cantone G., A Reference Architecture for the Component Factory, ACM Transactions on Software Engineering and Methodology (TOSEM), vol. 1(1): 53-80, 1992.
[8] Basili, V.R., Caldiera G., Rombach H.D., The Experience Factory, Encyclopedia of Software Engineering, pp. 469-476, John Wiley & Sons, Inc., 1994.
[9] CMMI: http://www.sei.cmu.edu/cmmi/index.cfm
[10] ISO/IEC 15504-5:2012, Information technology -- Process assessment -- Part 5: An exemplar software life cycle process assessment model. http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=60555
[11] Tennant G., SPC and TQM in Manufacturing and Services - Quality Returns to America. Gower Publishing, Ltd, 2001.
[12] IBM Cognos: http://www-01.ibm.com/software/analytics/cognos/
[13] IBM SE Harmony: http://www-01.ibm.com/software/rational/services/harmony/
[14] IEEE, IEEE Standard Glossary of Software, 1990.
[15] IEEE Standard for Software Unit Testing, 1996.
[16] Basili, V.R., Boehm, B., CeBASE Software Defect Reduction Top 10-List. In Software Engineering, by Selby, R.W., 2007.
[17] Bush M.: Improving software quality - The use of formal inspections at the Jet Propulsion Laboratory. NASA TR19910035159, 1990. http://ntrs.nasa.gov/search.jsp?R=19910035159.
[18] Mastrofini, M., Diep, M., Shull, F., Seaman, C., Godfrey, S., Assessing the Quality of a Systems Inspection Process. In Proceedings of the 14th NDIA Annual Systems Engineering Conference, 2010.
[19] He L., Shull F.: A Reference Model for Software and System Inspections. NASA TR20090042798, 2009: http://ntrs.nasa.gov/search.jsp?R=20090042798.
[20] Shull F., Seaman C. B., Feldmann R. L., Haingaertner R., Regardie M.: Interim Report on Heuristics about Inspection Parameters: Updates to Heuristics Resulting from Refinement on Projects, 2008.
[21] Briand, L.C., Using Simulation to Empirically Investigate Test Coverage Criteria Based on Statechart. In Proceedings of the International Conference on Software Engineering, 2004.
[22] Strooper, P., Wojcicki, M.A., Selecting V&V Technology Combinations: How to Pick a Winner? In Proceedings of the 12th IEEE International Conference on Engineering Complex Computer Systems, 2007.
[23] Sprugnoli R., An introduction to mathematical methods in combinatorics. http://www.dsi.unifi.it/~resp/Handbook.pdf, 2006.
[24] Shell E., Shull F.: GSFC FSB Application of Perspective-Based Inspections. NASA TR20060022176, 2004: http://ntrs.nasa.gov/search.jsp?R=20060022176.
[25] Basili, V.R., Green, S., Laitenberger, O., Lanubile, F., Shull, F., Sørumgård, S., Zelkowitz, M.V., The empirical investigation of Perspective-Based Reading. In Empirical Software Engineering. 1, 2, 133-164, 1996.
[26] Mastrofini M., Cantone G., Shull F., Diep M., Seaman C.B., Falessi D., Enhancing the System Development Process Performance: a Value-Based Approach. INCOSE International Symposium, 2012.
[27] Masi E., Cantone G., Mastrofini M., Calavaro G., Subiaco P., Mobile Apps Development: A Framework for Technology Decision Making. 4th International Conference on Mobile Computing, Applications and Services, 2012.
[28] NASA-STD 2202-93, 1993. http://www.hq.nasa.gov/office/codeq/doctree/220293.htm.
[29] Biffl, S., Halling, M., Software product improvement with inspection. A large-scale experiment on the influence of inspection processes on defect detection in software requirements documents. In Proceedings of the 26th Euromicro Conference, 2000.
[30] Osterweil, L., Strategic directions in software quality. In ACM Computing Surveys. 28, 4, 1996.
[31] Shull, F., Basili, V.R., Boehm, B., Brown, A.W., Costa, P., Lindvall, M., Port, D., Rus, I., Tesoriero, R., Zelkowitz, M., What we have learned about fighting defects. In Proceedings of the Eighth International Symposium on Software Metrics, 2002.
[32] Jalote P., Haragopal M., Overcoming the NAH syndrome for inspection deployment. 20th International Conference on Software Engineering (ICSE), pages 371–378, 1998.
[33] Kamsties E., Lott C. M., An empirical evaluation of three defect-detection techniques. Fifth European Software Engineering Conference, 1995.
[34] Thelin T., Andersson C., Runeson P., Dzamashvili-Fogelström N., A replicated experiment of usage-based and checklist-based reading. 10th International Symposium on Software Metrics, 2004.
[35] Thelin T., Runeson P., Wohlin C., An experimental comparison of usage-based and checklist-based reading. IEEE Trans. Softw. Eng., 29(8):687-704, 2003.
[36] Abdelnabi, Z., Cantone, G., Ciolkowski, M., Rombach, H.D., Comparing Code Reading Techniques Applied to Object-Oriented Software Frameworks with Regard to Effectiveness and Defect Detection Rate. In Proceedings of ISESE 2004, 239-248, 2004.
[37] Cantone, G., Abdulnabi, Z.A., Lomartire, A., Calavaro, G., Effectiveness of code reading and functional testing with event-driven object-oriented software. In Lecture Notes in Computer Science. 2765, 166-192, 2003.
[38] Basili, V.R., Selby, R.W., Comparing the Effectiveness of Software Testing Strategies. In IEEE Transactions on Software Engineering. 13, 12, 1278-1296, 1987.
[39] Hetzel, W.C., An experimental analysis of program verification methods. Doctoral Dissertation, University of South Carolina, 1976.
[40] Myers, G.J., A controlled experiment in program testing and code walkthroughs/inspection. In Communications of the ACM, 1978.
[41] Beizer, B., Black-Box Testing: Techniques for Functional Testing of Software and Systems, 1995.
[42] Halling, M., Biffl, S., Grünbacher, P., An experiment family to investigate the defect detection effect of tool-support for requirements inspection. In Proceedings of the Ninth International Software Metrics Symposium, 2003.
[42] Hovemeyer, D., Pugh, W., Finding concurrency bugs in Java. In Proceedings of the 23rd Annual ACM SIGACT-SIGOPS Symposium on Principles of Distributed Computing, 2004.
[43] Lian Yu, Jun Zhou, Yue Yi, Jianchu Fan, Qianxiang Wang, A Hybrid Approach to Detecting Security Defects in Programs. In International Conference on Quality Software, 2009.
[44] Long, B., Strooper, P., Hoffman, D., Tool support for Testing Concurrent Java Components. In Concurrency and Computation: Practice and Experience, 2006.
[45] Laitenberger O., Freimut B., Schlich M., The combination of inspection and testing technologies. T.R. 070.02/E, 2002.
[46] Weller E. F., Lessons from three years of inspection data. IEEE Software, 10(5):38-45, 1993.
[47] Elberzhager F., Münch J., Rombach H. D., Freimut B.G., Optimizing cost and quality by integrating inspection and test processes. International Conference on Software and System Process, 2011.
[48] Yoo S., Harman M., Regression testing minimization, selection and prioritization: a survey. In Software Testing, Verification and Reliability, 22, 67-120, 2012.
[49] Wojcicki M.A., Strooper P., An Iterative Empirical Strategy for the Systematic Selection of a Combination of Verification and Validation Technologies. In Fifth International Workshop on Software Quality, 2007.
[50] Owen, D., Desovski, D., Cukic, B., Effectively Combining Software Verification Strategies: Understanding Different Assumptions. In International Symposium on Software Reliability Engineering, 2006.
[51] Juristo, N., Vegas, S., Functional Testing, Structural Testing, and Code Reading: What Fault Type Do They Each Detect? In Lecture Notes in Computer Science. 2765, 208-232, 2003.
[53] Musuvathi, M., Engler, D., Some Lessons from Using Static Analysis and Software Model Checking for Bug Finding. In Electronic Notes in Theoretical Computer Science. 89, 3, 2003.
[54] Wojcicki M.A., Strooper P., Maximising the Information Gained from Experimental Analysis of Code Inspection and Static Analysis for Concurrent Java Components. In Fifth ACM-IEEE International Symposium on Empirical Software Engineering, 2006.
[55] Wood M., Roper M., Brooks A., Miller J., Comparing and combining software defect detection techniques: A replicated study. In Proceedings of the Sixth European Software Engineering Conference, 1997.
[56] Yuting Chen, Shaoying Liu, Wong, W.E., A Method Combining Review and Testing for Verifying Software Systems. In International Conference on BioMedical Engineering and Informatics, 2008.
[57] Laitenberger O., Studying the effects of code inspection and structural testing on software quality. In Proceedings of the Ninth International Symposium on Software Reliability Engineering, 1998.
[58] Laitenberger O., Freimut B., Schlich M., The combination of inspection and testing technologies. In Fraunhofer IESE. Technical Report 070.02/E, 2002.
[59] Bansal P., Sabharwal S., Sidhu P., An investigation of strategies for finding test order during Integration testing of Object Oriented applications. In Proceeding of International Conference on Methods and Models in Computer Science, 2009.
[60] Kitchenham B., Linkman S., Validation, verification, and testing: diversity rules. In IEEE Software, 15, 4, 46-49, 1998.
[61] Porter A.A., Votta L.G., Basili V.R., Comparing detection methods for software requirements inspections: a replicated experiment. In IEEE Transactions on Software Engineering. 21, 6, 563-575, 1995.
[62] Elberzhager F., Münch J., Ngoc Nha V., A systematic mapping study on the combination of static and dynamic quality assurance techniques. Information & Software Technology. 54(1), 1-15, 2011.
[63] Elberzhager F., Rosbach A., Münch J., Eschbach R., Inspection and Test Process Integration Based on Explicit Test Prioritization Strategies. International Software Quality Days Conference. 181-192, 2012.
[64] Gupta A., Jalote P., Test Inspected Unit or Inspect Unit Tested Code?, International Symposium on Empirical Software Engineering and Measurement, pp. 51-60, 2007.
[65] Aishman B., Hormone Refractory Prostate Cancer - Combined Hormonal Ablation/Chemotherapy for High Risk Prostate Cancer. http://www.hrpca.org/chemowaearly.htm, 2002.
[66] Craighead W.E., Craighead L.W., Manual-Based Treatments: Suggestions for Improving Their Clinical Utility and Acceptability. Clinical Psychology: Science and Practice. 5, 3, 403–407, 1998.
[67] Li S., Waters R., Nucleotide level detection of cyclobutane pyrimidine dimers using oligonucleotides and magnetic beads to facilitate labelling of DNA fragments incised at the dimers and chemical sequencing reference ladders. Carcinogenesis. 17, 8, 1549-1552, 1996.
[68] Gabbard G. O., Treatment of Posttraumatic disorder. American Psychiatric Publishing Inc, 2007.
[69] Miller H.R., Sequencing and prior information in linear programmed instruction. Educational Technology Research and Development. 17, 1, 63-76, 1953.
[70] Oman P., Pfleeger S. L., Applying software metrics. Wiley-IEEE Computer Society Press, 1997.
[71] Kan S. H., Metrics and Models in Software Quality Engineering. Addison-Wesley, 2002.
[72] Shah S.M.A., Morisio M., Torchiano M., An Overview of Software Defect Density: A Scoping Study. APSEC 2012: 406-415, 2012.
[73] McConnell S., Gauging software readiness with defect tracking. In IEEE Software. 14, 3, 135-136, 1997.
[74] Neufelder A. M., Current Defect Density Statistics. www.softrel.com/Current defect density statistics.pdf, 2007.
[75] Andrews J. H., Briand L. C., Labiche Y., Is mutation an appropriate tool for testing experiments? Proceedings of the 27th International Conference on Software Engineering, 2005.
[76] Eclipse Helios: http://www.eclipse.org/helios/
[77] Java EE 6 SDK: http://www.oracle.com/technetwork/java/javaee/overview/index.html.
[78] Zelkowitz M.V., Wallace D.R.: Experimental Models for Validating Technology, IEEE Computer 31, 5, 1998.
[79] Basili V.R., Weiss D.: A methodology for collecting valid software engineering data. IEEE Transactions on Software Engineering 10(6), 1984.
[80] Wohlin C., Runeson P., Höst M., Ohlsson M.C., Regnell B., Wesslén A.: Experimentation in Software Engineering. An Introduction. Kluwer Academic Publishers, 2000.
[81] Fling B.: Mobile Design and Development: Practical Concepts and Techniques for Creating Mobile Sites and Web Apps, O'Reilly Media, pp. 13-27, ISBN 978-0-596-15544-5, 2009.
[82] WorkLight Webinar Series: Native Web or Hybrid Mobile App Development. http://www.worklight.com/assets/files/Native-Web-Hybrid-Mobile-App-Dev-Webinar.pdf
[83] VisionMobile: Developer Economics 2010 and Beyond. http://www.visionmobile.com/blog/2010/07/developer-economics-2010-the-role-of-networks-in-a-developer-world/
[84] Wasserman A.I.: Software engineering issues for mobile application development. ACM SIGSOFT FoSER 2010 (2010). http://www.cmu.edu/silicon-valley/wmse/wasserman-foser2010.pdf
[85] Rowberg J.: Comparison: App Inventor, DroidDraw, Rhomobile, PhoneGap, Appcelerator, WebView, and AML. http://www.amlcode.com/2010/07/16/comparison-appinventor-rhomobile-phonegap-appcelerator-webview-and-aml
[86] Google I/O: HTML5 versus Android: Apps or Web for Mobile Development? http://www.google.com/events/io/2011/sessions/html5-versus-android-apps-or-web-for-mobile-development.html
[87] Lennon J.: Get started with Dojo Mobile 1.6 - And get a peek at new features coming in 1.7, IBM developerWorks. http://www.ibm.com/developerworks/web/library/wa-dojomobile/index.html
[88] Yao M.: What are the pros and cons of using PhoneGap to build native apps? http://www.quora.com/What-are-the-pros-and-cons-of-using-PhoneGap-to-build-native-apps.
[89] Lukasavage T.: Review: Appcelerator vs. PhoneGap vs. Adobe Air. http://savagelook.com/blog/portfolio/appcelerator-vs-phonegap-vs-adobe-air.
[90] Vilches A.: PhoneGap vs. Flex vs. Appcelerator vs. Corona: nuevas conclusiones. http://www.xydo.com/toolbar/27165658-phonegap_vs_flex_vs_appcelerator_vs_corona_nuevas_conclusiones
[91] Dingsor A.: Writing a Hybrid Mobile Application with PhoneGap and the Dojo Toolkit, IBM. http://public.dhe.ibm.com/software/dw/web2mobile/07072011_dingsor/hello_hybrid.pdf, 2011.
[92] Cantone G., Donzelli P.: Software Measurements: from Concepts to Production, T.R. Intl. Software Engineering Research Network, ISERN T.R. 97-27, 1997.
[93] Kirby Robion C., Siebenmann Laurence C.: Foundational Essays on Topological Manifolds, Smoothings, and Triangulations. Princeton University Press, 1977.
[94] Cantone G., Donzelli P.: Production and Maintenance of Software Measurement Models. Journal of Software Engineering and Knowledge Engineering, Vol. 5, 1998.
[95] Boehm B.W.: Value-based software engineering: Overview and agenda. Tech report, USC-CSE-2005-504, University of Southern California, Park Campus. Los Angeles, CA, USA, 2005.
[96] Basili, V.R.: Quantitative evaluation of software engineering methodology. Proc. First Pan Pacific Computer Conf., Melbourne, Australia, 10-13 September, 1985.
[97] Shull, F., Seaman C.B., Diep M., Hughes A., Analyzing Inspection Data for Heuristic Effectiveness. In Proceedings of Empirical Software Engineering and Measurement, 2012.
[98] Chillarege, R., Bhandari, I.S., Chaar, J.K., Halliday, M.J., Moebus, D.S., Ray, B.K., Wong, M.-Y., Orthogonal Defect Classification: A Concept for In-Process Measurements. In IEEE Transactions on Software Engineering. 18, 11, 943-956, 1992.
List of Papers Investigated during Testing Literature Survey (only titles)

Notice that the following list includes only papers already investigated. A few hundred papers are still being studied and are not included in this list.
A clustering approach to improving test case prioritization: An industrial case study
A combinatorial testing strategy for concurrent programs
A Comparative Study of Software Model Checkers as Unit Testing Tools: An Industrial Case Study
A Comparison of Tabular Expression-Based Testing Strategies
A concept analysis inspired greedy algorithm for test suite minimization
A Controlled Experiment Assessing Test Case Prioritization Techniques via Mutation Faults
A history-based cost-cognizant test case prioritization technique in regression testing
A lightweight process for change identification and regression test selection in using COTS components
A Metaheuristic Approach to Test Sequence Generation for Applications with a GUI
A novel approach to regression test selection for J2EE applications
A novel co-evolutionary approach to automatic software bug fixing
A platform for search-based testing of concurrent software
A prioritization approach for software test cases based on Bayesian networks
A relation-based method combining functional and structural testing for test case generation
A Safe Regression Test Selection Technique for Database-Driven Applications
A stateful approach to testing monitors in multithreaded programs
A Theoretical and Empirical Analysis of the Role of Test Sequence Length in Software Testing for Structural Coverage
A Theoretical and Empirical Study of Search-Based Testing: Local, Global, and Hybrid Search
A uniform random test data generator for path testing
Adaptive Random Test Case Prioritization
Adaptive random testing based on distribution metrics
An approach for QoS-aware service composition based on genetic algorithms
An Approach to Test Data Generation for Killing Multiple Mutants
An automated approach to reducing test suites for testing retargeted C compilers for embedded systems
An Empirical Comparison of Automated Generation and Classification Techniques for Object-Oriented Unit Testing
An Empirical Comparison of Test Suite Reduction Techniques for User-Session-Based Testing of Web Applications
An empirical study of incorporating cost into test suite reduction and prioritization
An empirical study of regression test application frequency
An Empirical Study of Regression Testing Techniques Incorporating Context and Lifecycle Factors and Improved Cost-Benefits Models
An Empirical Study of Test Case Filtering Techniques Based on Exercising Information Flows
An Empirical Study of the Effect of Time Constraints on the Cost-Benefits of Regression Testing
An empirical, path-oriented approach to software analysis and testing
An Enhanced Test Case Selection Approach for Model-Based Testing: An Industrial Case Study
An experimental evaluation of weak-branch criterion for class testing
An experimental study of adaptive testing for software reliability assessment
An improved meta-heuristic search for constrained interaction testing
Application of system models in regression test suite prioritization
Applying Interface-Contract Mutation in Regression Testing of Component-Based Software
Assessing and Improving State-Based Class Testing: A Series of Experiments
Assessing, Comparing, and Combining State Machine-Based Testing and Structural Testing: A Series of Experiments
AUSTIN: A tool for Search Based Software Testing for the C Language and its Evaluation on Deployed Automotive Systems
Automated Concolic Testing of Smartphone Apps
Automated generation of test suites from formal specifications of real-time reactive systems
Automated Security Testing of Web Widget Interactions
Automatic generation of test cases from Boolean specifications using the MUMCUT strategy
Automatic Test Data Generation Using Genetic Algorithm and Program Dependence Graphs
Automatic Test Generation From GUI Applications For Testing Web Services
Automatic, evolutionary test data generation for dynamic software testing
Black-box system testing of real-time embedded systems using random and search-based testing
Call Stack Coverage for Test Suite Reduction
Call-stack coverage for GUI test suite reduction
Carving and Replaying Differential Unit Test Cases from System Test Cases
Carving Differential Unit Test Cases from System Test Cases
Checking Inside the Black Box: Regression Testing Based on Value Spectra Differences
Checking Inside the Black Box: Regression Testing by Comparing Value Spectra
Code coverage-based regression test selection and prioritization in WebKit
Combinatorial Interaction Regression Testing: A Study of Test Case Generation and Prioritization
Combining Search-based and Adaptive Random Testing Strategies for Environment Model-based Testing of Real-time Embedded Systems
Configuration selection using code change impact analysis for regression testing
Constraint based structural testing criteria
Contract-Based Mutation for Testing Components
CUTE: a concolic unit testing engine for C
Data Flow Testing of Service Choreography
Decreasing the Cost of Mutation Testing with Second Order Mutants
Design and Analysis of Cost-Cognizant Test Case Prioritization Using Genetic Algorithm with Test History
Design and analysis of GUI test-case prioritization using weight-based methods
Designing and comparing automated test oracles for GUI-based software applications
Developing a Single Model and Test Prioritization Strategies for Event-Driven Software
Differential Testing - A New Approach to Change Detection
Directed Test Generation using Symbolic Grammars
Directed Test Suite Augmentation: Techniques and Tradeoffs
Distributing test cases more evenly in adaptive random testing
Efficient Software Verification: Statistical Testing Using Automated Search
Efficient unit test case minimization
Eliminating Harmful Redundancy for Testing-Based Fault Localization Using Test Suite Reduction: An Experimental Study
Empirical evaluation of optimization algorithms when used in goal-oriented automated test data generation techniques
Empirical Evaluation of the Fault-Detection Effectiveness of Smoke Regression Test Cases for GUI-Based Software
Empirical studies of a decentralized regression test selection framework for web services
Empirically Studying the Role of Selection Operators during Search-Based Test Suite Prioritization
Evaluating improvements to a meta-heuristic search for constrained interaction testing
Evolutionary generation of test data for many paths coverage based on grouping
Evolutionary Test Data Generation: A Comparison of Fitness Functions
Experimental assessment of manual versus tool-based maintenance of GUI-directed test scripts
Experiments with test case prioritization using relevant slices
Fault coverage of Constrained Random Test Selection for access control: A formal analysis
Finding Bugs by Isolating Unit Tests
Genetic Algorithms for Randomized Unit Testing
Grouping target paths for evolutionary generation of test data in parallel
Guided test generation for coverage criteria
Handling over-fitting in test cost-sensitive decision tree learning by feature selection, smoothing and pruning
Highly Scalable Multi Objective Test Suite Minimisation Using Graphics Cards
HOTTest: A model-based test design technique for enhanced testing of domain-specific applications
How Does Program Structure Impact the Effectiveness of the Crossover Operator in Evolutionary Testing
Improved Multithreaded Unit Testing
Improving Effectiveness of Automated Software Testing in the Absence of Specifications
Improving evolutionary class testing in the presence of non-public methods
Improving Fault Detection Capability by Selectively Retaining Test Cases during Test Suite Reduction
Improving Structural Testing of Object-Oriented Programs via Integrating Evolutionary Testing and Symbolic Execution
Improving Test Case Generation for Web Applications Using Automated Interface Discovery
Incremental Test Generation for Software Product Lines
Industrial experiences with automated regression testing of a legacy database application
Industrial Real-Time Regression Testing and Analysis Using Firewalls
Integrating Techniques and Tools for Testing Automation
Inter-Context Control-Flow and Data-Flow Test Adequacy Criteria for nesC Applications
Iterative context bounding for systematic testing of multithreaded programs
Model-based test prioritization heuristic methods and their evaluation
Model-Based Testing of Community-Driven Open-Source GUI Applications
MSeqGen: Object-Oriented Unit-Test Generation via Mining Source Code
Mutation Operators for Spreadsheets
Mutation-Driven Generation of Unit Tests and Oracles
On test suite composition and cost-effective regression testing
On the Effectiveness of the Test-First Approach to Programming
On the Use of Mutation Faults in Empirical Assessments of Test Case Prioritization Techniques
Ordering Broken Unit Tests for Focused Debugging
PAT: A pattern classification approach to automatic reference oracles for the testing of mesh simplification programs
Perturbation-based user-input-validation testing of web applications
Productivity of Test Driven Development: A Controlled Experiment with Professionals
Prioritizing component compatibility tests via user preferences
Prioritizing JUnit test cases in absence of coverage information
Prioritizing JUnit Test Cases: An Empirical Assessment and Cost-Benefits Analysis
Prioritizing tests for fault localization through ambiguity group reduction
Quasi-random testing
Quota-constrained test-case prioritization for regression testing of service-centric systems
Random Testing: Theoretical Results and Practical Implications
Rapid "Crash Testing" for Continuously Evolving GUI-Based Software Applications
Reachability graph-based test sequence generation for concurrent programs
Reachability testing of concurrent programs
Regression testing for component-based software systems by enhancing change information
Regression testing in Software as a Service: An industrial case study
Regression Testing UML Designs
Representation Dependence Testing using Program Inversion
Results from introducing component-level test automation and Test-Driven Development
Saturation-based Testing of Concurrent Programs
Scalable and Effective Test Generation for Role-Based Access Control Systems
Scalable Test Data Generation from Multidimensional Models
Scaling Regression Testing to Large Software Systems
Search Algorithms for Regression Test Case Prioritization
Selecting a cost-effective test case prioritization technique
Should software testers use mutation analysis to augment a test set?
Size-Constrained Regression Test Case Selection Using Multicriteria Optimization
Stress Testing Real-Time Systems with Genetic Algorithms
Strong Higher Order Mutation-Based Test Data Generation
Studying the Fault-Detection Effectiveness of GUI Test Cases for Rapidly Evolving Software
Test case prioritization using relevant slices
Test coverage optimization for large code problems
Test generation via Dynamic Symbolic Execution for mutation testing
Test Input Generation Using Dynamic Programming
Test Prioritization Using System Models
Test suite reduction and prioritization with call trees
Test Suite Reduction with Selective Redundancy
Test-case prioritization with model-checkers
Testing Context-Aware Middleware-Centric Programs: A Data Flow Approach and an RFID-Based Experimentation
Testing input validation in Web applications through automated model recovery
The Effects of Time Constraints on Test Case Prioritization: A Series of Controlled Experiments
The Impact of Input Domain Reduction on Search-Based Test Data Generation
The State Problem for Test Generation in Simulink
Towards a distributed execution framework for JUnit test cases
Towards the prioritization of regression test suites with data flow information
Traffic-aware stress testing of distributed real-time systems based on UML models using genetic algorithms
Unit-level test adequacy criteria for visual dataflow languages and a testing methodology
Using Artificial Life Techniques to Generate Test Cases for Combinatorial Testing
Using hybrid algorithm for Pareto efficient multi-objective test suite minimization
Using Mutation Analysis for Assessing and Comparing Testing Coverage Criteria
Using random test selection to gain confidence in modified software
Using the Case-Based Ranking Methodology for Test Case Prioritization
Utilization of Extended Firewall for Object-Oriented Regression Testing
XML-manipulating test case prioritization for XML-manipulating services