
Verification & Validation

Am I building the thing right?
Am I building the right thing?

by John W. Brand
4/6/2009 IIBA - Pittsburgh Chapter 1
Agenda

• Define the terms
• Discuss useful techniques
• Outline risks


What’s the difference?



Interchangeable?

• Verification and Validation have been used interchangeably by many people.
• However, they are distinctly different.


According to the IEEE

• Verification is, “The process of evaluating a system or component to determine whether the products of a given development phase satisfy the conditions imposed at the start of that phase.”
• Validation is, “The process of evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.”


According to Karl Wiegers

• Verification determines whether the product of a development activity meets the requirements established for it.
• Validation assesses whether a product actually satisfies the customer needs.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


According to IIBA (1.6)

• Verification ensures that requirements are defined clearly enough to allow solution design and implementation to begin.
• Validation ensures that the stated requirements correctly and fully implement the business requirements.

IIBA, 2006, BABOK, Release 1.6


According to IIBA (2.0)

• Requirements verification involves evaluating requirements to verify that they meet quality specifications.
• Requirements validation ensures that all requirements support the delivery of value to the business, fulfill its goals and objectives, and/or meet a stakeholder need.

IIBA, 2008, BABOK, Release 2.0 (draft)


Verification

• Ensure the requirements have been defined correctly.
• Determine that the requirements analysis has been correctly performed.
• Determine that the requirements provide all the information needed to develop the solution.
• Determine that the requirements are ready for formal review and validation by the customer and users.

IIBA, 2008, BABOK, Release 2.0 (draft)


Why Verify?

Requirements that seem fine when you read them in the requirements document might turn out to have problems when developers try to work with them.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Validation

Determines whether the requirements:
• correctly align to the needs of the customer;
• are accurate;
• have the appropriate level of detail.

IIBA, 2008, BABOK, Release 2.0 (draft)


Differences

• Requirements verification focuses on quality:
  – Completeness,
  – Correctness,
  – Usability.
• Requirements validation focuses on:
  – Supporting business goals/objectives,
  – Aligning with business goals/objectives,
  – Meeting stakeholder needs.

IIBA, 2008, BABOK, Release 2.0 (draft)


Differences

• Requirements verification is building the thing right.
• Requirements validation is building the right thing.


How do we do it?



Expected Outcome

Documented requirements that are:
• unique,
• well written,
• unambiguous.

They are “good quality” requirements.

IIBA, 2006, BABOK, Release 1.6


Good Quality Characteristics

• Complete
• Consistent
• Correct
• Does not say “how”
• Feasible
• Mandatory/Necessary
• Prioritized
• Traceable
• Unambiguous / Understandable
• Testable

IIBA, 2006, BABOK, Release 1.6


Verification



Verification

• Karl Wiegers has stated, “If a requirement isn’t verifiable, determining whether it was correctly implemented becomes a matter of opinion, not objective analysis.”


Verification Process

• Select one or more techniques.
• Identify participants.
• Execute the selected technique(s).
• Update the requirements documentation.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger


Verification Techniques

• Reviews
  – Desk Check
  – Pass Around
  – Walkthrough
  – Inspection
• Checklists
• Conceptual Testing


Verification - Reviews



Reviews

• The problem is that people interpret requirements in different ways.
• To prevent this, you need to address two different issues:
  – Do we all interpret the requirement the same way?
  – Are the requirements complete?

Cook, David A., 2002, “Verification and Validation of Requirements”, Presentation to USAF/STSC


Desk Check

• An informal process.
• A single colleague looks over the document and models.
• Comments are returned to the requirements author.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Pass Around

• An informal process.
• Several colleagues look over the document and models.
• The colleagues work concurrently and independently.
• Comments are returned to the requirements author.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Walkthrough

• An informal process.
• A group of colleagues is assembled in a meeting.
• Typically, the requirements author leads the discussion.
• The requirements document is reviewed with the assembled group and comments are solicited.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Inspection

• May be an informal or a formal process.
  – Informal – requirements are inspected after each JAD / Focus Group.
  – Formal – the entire requirements document is inspected after it has been written.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Inspection

• Small teams are assembled to inspect the requirements:
  – Business Analyst
  – Developer
  – Tester
• Include testers to make sure the requirements are verifiable and can serve as the basis for System Testing.
• Team members inspect the document prior to a formal meeting.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Inspection

• Each requirement is read aloud in a paraphrased form.
• Potential defects and issues are pointed out by meeting participants.
• Defects and issues are formally recorded and presented to the requirements author.
• Meeting Roles:
  – Author
  – Moderator
  – Reader (an inspector)
  – Recorder
  – Inspectors

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Checklists

• Author’s Pre-Review Checklist
• Requirements Checklist
• Pre-Signoff Checklist


Author’s Pre-Review Checklist

• The standard template was used for all requirements documentation.
• The document is written in business language.
• All requirements have been prioritized.
• The documents were spell checked.
• The document layout is neat and orderly.
• Line numbering is active prior to printing.
• All model components and lines are labeled.
• Continuity among all models is intact:
  – Elements mentioned in one model are mentioned in the others;
  – Components referenced in multiple models use consistent terms.
• All open issues are marked as TBD.


Author’s Pre-Review Checklist

• All implied requirements (performance, security) are stated.
• All information to be displayed to the user is listed in the requirements.
• All triggers and outcomes have been accounted for.
• All requirements are within Project Scope.
• Future issues (upgrades, planned migration, long-term use) have been addressed in the requirements.


Requirements Checklist
Organization and Completeness
• Are all internal cross-references to other requirements correct?
• Are all requirements written at a consistent and appropriate level of detail?
• Do the requirements provide an adequate basis for design?
• Is the implementation priority of each requirement included?
• Are all external hardware, software, and communication interfaces defined?
• Are algorithms intrinsic to the functional requirements defined?
• Does the SRS include all the known customer or system needs?
• Is any necessary information missing from a requirement? If so, is it identified as a TBD?
• Is the expected behavior documented for all anticipated error conditions?

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Requirements Checklist
Correctness
• Do any requirements conflict with or duplicate other requirements?
• Is each requirement written in clear, concise, and unambiguous language?
• Is each requirement verifiable by testing, demonstration, review, or analysis?
• Is each requirement in scope for the project?
• Is each requirement free from content and grammatical errors?
• Can all the requirements be implemented within known constraints?
• Are all specified error messages unique and meaningful?

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.
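A few of these checklist items — ambiguous wording, unresolved TBDs — can be partially automated as a pre-screen before the human review. A minimal sketch in Python; the weak-word list and sample requirements are illustrative assumptions, not taken from any cited checklist:

```python
# Hedged sketch: automated pre-screen for two mechanical checklist items.
# WEAK_WORDS is an illustrative assumption, not an official standard.
import re

WEAK_WORDS = ["fast", "user-friendly", "easy", "flexible",
              "as appropriate", "if possible", "should", "and/or"]

def screen_requirement(req_id, text):
    """Return a list of (req_id, issue) findings for one requirement."""
    findings = []
    lowered = text.lower()
    for word in WEAK_WORDS:
        if re.search(r"\b" + re.escape(word) + r"\b", lowered):
            findings.append((req_id, f"ambiguous term: '{word}'"))
    if "tbd" in lowered:
        findings.append((req_id, "contains an unresolved TBD"))
    return findings

# Hypothetical requirements, purely for illustration:
reqs = {
    "REQ-001": "The system shall respond to a search within 2 seconds.",
    "REQ-002": "The interface should be fast and user-friendly.",
    "REQ-003": "Export formats: CSV, PDF, TBD.",
}

for rid, text in reqs.items():
    for _, issue in screen_requirement(rid, text):
        print(f"{rid}: {issue}")
```

A screen like this never replaces the review — it only clears mechanical noise so reviewers can focus on meaning, scope, and completeness.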


Requirements Checklist
Quality Attributes
• Are all performance objectives properly specified?
• Are all security and safety considerations properly specified?
• Are other pertinent quality attribute goals explicitly documented and quantified, with the acceptable trade-offs specified?

Traceability
• Is each requirement uniquely and correctly identified?
• Is each software functional requirement traced to a high-level requirement?

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Requirements Checklist
Special Issues
• Are all requirements actually requirements, not design or implementation solutions?
• Are the time-critical functions identified and their timing criteria specified?
• Have internationalization issues been adequately addressed?

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Pre-Signoff Checklist
• Are the requirements complete? Do they cover all of the new features needed in this release?
• Are all the requirements SMART (specific, measurable, achievable, relevant, time-bound)?
• Are all requirements actual requirements, not design or implementation solutions?
• Are there any conflicts between requirements?
• Are there duplicate requirements?
• Have all the requirements regarding the external hardware and software and minimum system requirements been defined?
• Have all the issues regarding internationalization been properly addressed?
• Have all non-functional requirements (quality attributes) been properly specified?

Gantthead.com, 2007, Requirements Verification Checklist


Pre-Signoff Checklist

• Are there any requirements that specify how errors and/or anticipated failure paths should be handled?
• Are the non-requirements clearly identified? Does the document clearly indicate which proposed requirements were dropped during the review process and are not part of this release?
• Are the requirements uniquely identified for traceability purposes?
• Are the requirements prioritized?
• Are all the internal and external references accurate?
• Is the document free from grammatical errors?
• Have all the TBDs been addressed?

Gantthead.com, 2007, Requirements Verification Checklist


Verification - Conceptual Testing


Conceptual Testing

According to Boris Beizer, “The simple act of designing test cases will reveal many problems with the requirements even if you don’t execute the tests on an operational system.”

Beizer, Boris, 1990, Software Testing Techniques, 2nd Ed.


Conceptual Testing

Karl Wiegers states, “Testing and requirements have a synergistic relationship because they represent complementary views of the system.”

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


V Model

[Diagram: the V model, with development phases descending the left (Verification) side and their associated testing phases ascending the right (Validation) side. Acceptance Testing is associated with Requirements Development.]

The V model is a software development model based on the relationship between each phase of the development life cycle, as described in a typical Waterfall model of software development, and its associated phase of testing.


Conceptual Testing

• Plan your testing activities and begin developing preliminary test cases during the corresponding development phase.
• Conceptual (implementation-independent) test cases based on the requirements will reveal errors, ambiguities and omissions in your requirements and analysis models.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Conceptual Testing

• You can begin deriving conceptual test cases from:
  – Use Cases
  – Requirements:
    – Business
    – User
    – Functional
    – Non-Functional
• You can use these test cases to evaluate textual requirements, analysis models and prototypes.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Conceptual Testing

• Writing test cases from the requirements:
  – reveals ambiguities,
  – reveals vagueness,
  – verifies correctness.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.
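One way to make this concrete is to express each conceptual test case as implementation-independent data — a condition, an action, and an expected outcome — and check the cases against a stated reading of the requirement. A minimal Python sketch; the withdrawal requirement and all the cases are hypothetical illustrations:

```python
# Hedged sketch: conceptual (implementation-independent) test cases for a
# hypothetical requirement: "A withdrawal must not exceed the account balance."
# No system code is executed; the cases exist to probe the requirement itself.

conceptual_cases = [
    {"balance": 100, "withdraw": 50,  "expected": "approved"},
    {"balance": 100, "withdraw": 100, "expected": "approved"},   # boundary case
    {"balance": 100, "withdraw": 101, "expected": "rejected"},
    # Writing this case exposed a gap: the requirement says nothing about
    # zero or negative amounts -- an ambiguity to raise with the author.
    {"balance": 100, "withdraw": 0,   "expected": "unspecified"},
]

def expected_outcome(balance, withdraw):
    """The analyst's reading of the requirement, used to check the cases."""
    if withdraw <= 0:
        return "unspecified"   # the requirement is silent here
    return "approved" if withdraw <= balance else "rejected"

for case in conceptual_cases:
    assert expected_outcome(case["balance"], case["withdraw"]) == case["expected"]
print("all conceptual cases are consistent with the stated reading")
```

The value is in the act of writing the cases: the "unspecified" row is exactly the kind of omission conceptual testing is meant to surface.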


Conceptual Testing

• Writing black box test cases crystallizes your vision of how the system should behave under certain conditions.
• Vague and ambiguous requirements will jump out at you because you won’t be able to describe the expected system response.
• These conceptual (abstract) test cases are independent of implementation.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Validation



Validation

• Validation typically occurs twice:
  – During requirements development, but after verification,
  – After coding.
• The best way to validate requirements is to involve customers in the process.
  – End-users are the most effective.

Cook, David A., 2002, “Verification and Validation of Requirements”, Presentation to USAF/STSC


Validation

Look for these documentation principles:
• Requirements must be realistic,
• Functionality must be defined,
• Behavior must be represented.

Hass, Kathleen, et al., 2008, Getting It Right: Business Requirement Analysis Tools and Techniques


Validation Process

• Select one or more techniques.
• Identify participants.
• Execute the selected technique(s).
• Update the requirements documentation.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger


Validation Techniques

• Reviews
  – Walkthrough
  – Inspection
• Checklists
• Conceptual Testing
• Acceptance Criteria
• Model Walkthrough
• Prototyping


Validation - Reviews



Walkthrough

• An informal process.
• A group of colleagues is assembled in a meeting.
• The requirements document is reviewed with the assembled group and comments are solicited.
• Typically, the requirements author leads the discussion.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Inspection

• May be an informal or a formal process.
  – Informal – requirements are inspected after each JAD / Focus Group.
  – Formal – the entire requirements document is inspected after it has been written.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Inspection

• Small teams are assembled to inspect the requirements:
  – Customer
  – Subject Matter Experts
  – Developer
• Include end-users to make sure the requirements are verifiable and can serve as the basis for User Acceptance Testing.
• Team members inspect the document prior to a formal meeting.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Inspection

• Each requirement is read aloud in a paraphrased form.
• Potential defects and issues are pointed out by meeting participants.
• Defects and issues are formally recorded and presented to the requirements author.
• Meeting Roles:
  – Author
  – Moderator
  – Reader (an inspector)
  – Recorder
  – Inspectors

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Checklists

• Developed by the customer.
• Specific to the project.
• Lists of features and functions.


Checklists – General Characteristics
• All conditions under which a requirement applies are stated.
• Each requirement must accurately describe the functionality to be built.
• The requirement can be tested to ensure that it meets measures of success.
• The requirement is traceable to a goal stated in a project initiating document, such as:
  – Project Charter
  – Business Case
  – Project Scope
• The requirement is necessary to enable the solution to meet the business goals and objectives.
• The requirement is traceable back to a specific customer or user.
• The requirement, as stated, has only one interpretation.

Hass, Kathleen, et al., 2008, Getting It Right: Business Requirement Analysis Tools and Techniques


Validation – Conceptual Testing


Conceptual Testing

Karl Wiegers states, “Testing and requirements have a synergistic relationship because they represent complementary views of the system.”

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


V Model

[Diagram: the V model, with development phases descending the left (Verification) side and their associated testing phases ascending the right (Validation) side. Acceptance Testing is associated with Requirements Development.]

The V model is a software development model based on the relationship between each phase of the development life cycle, as described in a typical Waterfall model of software development, and its associated phase of testing.


Conceptual Testing

• You can begin deriving conceptual test cases from:
  – Use Cases
  – Requirements:
    – Business
    – User
    – Functional
• You can use these test cases to evaluate textual requirements, analysis models and prototypes.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Acceptance Criteria

• “Acceptance criteria – and hence acceptance testing – should evaluate whether the product satisfies its documented requirements and whether it is fit for use in the intended operating environment.”
• “It’s a shift in perspective from the requirements elicitation question of ‘What do you need to do with the system?’ to ‘How would you judge whether the system satisfies your needs?’.”
• “If the customer can’t express how they would evaluate the system’s satisfaction of a particular requirement, that requirement is not stated sufficiently clearly.”

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Acceptance Criteria

• “Define the conditions for accepting the system.”
• “Guides users to more explicitly describe how they expect the software to work.”
• Identifies:
  – Functionality (functional requirements),
  – Quality attributes (non-functional requirements),
  – Expected results from a defined set of inputs.
• “How you will judge whether the system satisfies your needs.”

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger


Acceptance Criteria

• User involvement is critical.
• Criteria are often based on:
  – the user’s ability to accomplish specific tasks and
  – the system’s ability to meet certain quality attributes.
• Every criterion should have one or more Acceptance Test Cases.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger
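In practice, a criterion and its derived test cases can be recorded as plain Given/When/Then data. A minimal Python sketch of that shape; the order-lookup criterion, its ID, and the timing are hypothetical illustrations, not from any cited source:

```python
# Hedged sketch: one acceptance criterion with two derived acceptance test
# cases, written Given/When/Then style as plain data. All specifics are
# hypothetical illustrations.

criterion = {
    "id": "AC-07",
    "text": "A clerk can locate any order by order number in under 5 seconds.",
}

acceptance_tests = [
    {
        "criterion": "AC-07",
        "given": "an order 'ORD-1001' exists",
        "when": "the clerk searches for 'ORD-1001'",
        "then": "the order detail appears within 5 seconds",
    },
    {
        "criterion": "AC-07",
        "given": "no order 'ORD-9999' exists",
        "when": "the clerk searches for 'ORD-9999'",
        "then": "a 'not found' message appears within 5 seconds",
    },
]

# Every criterion should have at least one acceptance test case:
covered = {t["criterion"] for t in acceptance_tests}
assert criterion["id"] in covered
print(f"{criterion['id']}: {len(acceptance_tests)} acceptance test case(s)")
```

Asking the user to supply the "then" clause for each case is one way to apply Wiegers' question: if they cannot state the expected response, the criterion is not yet clear enough.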


Acceptance Testing
• Sources:
  – Use Cases,
  – Requirements Analysis Models:
    – Use Case Models,
    – Business Process Models,
    – Data Flow Models.
• Focus on anticipated usage scenarios.
• Consider the:
  – most commonly used Use Cases (normal flow),
  – most important Use Cases.
• De-emphasizes:
  – Alternate Flows,
  – Exception Condition Handling.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Model Walkthrough

• Uses test simulations to step through analysis models.
• Simulates system operation without actually testing code.
• Why?
  – Find missing steps,
  – Find missing data,
  – Find missing business rules.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger


Model Walkthrough

• Identify and create test cases.
• Select the analysis model(s) to validate.
• Trace the test cases through the model(s) in a step-by-step manner.
• Correct the requirements model(s), as necessary.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger
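The trace step can be sketched mechanically: represent the process model as a transition table and step each test case through it, flagging any step the model does not cover. The order-handling model below is a hypothetical illustration, not from any cited source:

```python
# Hedged sketch: tracing test cases through a simple process model.
# The model is a transition table mapping (state, event) -> next state.
# All states and events are hypothetical illustrations.

model = {
    ("start",     "submit order"): "submitted",
    ("submitted", "approve"):      "approved",
    ("approved",  "ship"):         "shipped",
}

def walk(model, events, start="start"):
    """Step events through the model; return (final_state, missing_steps)."""
    state, missing = start, []
    for event in events:
        nxt = model.get((state, event))
        if nxt is None:
            missing.append((state, event))   # a gap found in the model
            break
        state = nxt
    return state, missing

# A happy-path test case traces cleanly through the model:
final, gaps = walk(model, ["submit order", "approve", "ship"])
assert final == "shipped" and not gaps

# A rejection test case reveals a missing transition -- exactly the kind of
# missing step or business rule the walkthrough is meant to surface:
final, gaps = walk(model, ["submit order", "reject"])
print("missing transitions:", gaps)
```

Here the second trace stops at the submitted state, showing the model never says what happens when an order is rejected — a finding to feed back into the requirements model.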


Validation – Prototyping



Prototyping - What

• A mock-up of a new system.
• Brings Use Cases to life.
• Closes gaps in understanding of requirements.
• Helps the stakeholders arrive at a shared understanding of the system’s requirements.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Prototyping - What

• Assesses the feasibility of quality attributes.
• Detects unnecessary functionality.
• Detects missing steps.
• Exercises interfaces.
• Reveals missing, erroneous and infeasible requirements.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger


Prototyping – How To

• Determine which requirements to validate using a prototype.
• Develop the prototype.
• Evaluate the prototype.

Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger


Validation Techniques - Summary

• Reviews
  – Walkthrough
  – Inspection
• Checklists
• Conceptual Testing
• Acceptance Criteria
• Model Walkthrough
• Prototyping


Validation

• Writing test cases from the requirements:
  – reveals ambiguities,
  – reveals vagueness,
  – verifies correctness.
• Executing prototypes:
  – reveals ambiguities,
  – reveals vagueness,
  – verifies correctness.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Validation

• Writing black box test cases crystallizes your vision of how the system should behave under certain conditions.
• Vague and ambiguous requirements will jump out at you because you won’t be able to describe the expected system response.
• These conceptual (abstract) test cases are independent of implementation.

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Risks



Risks

• “Studies have shown that it can cost approximately 100 times more to correct a customer-reported requirement defect than to correct an error found during requirements development.”
• “Any measures you can take to detect errors in the requirements specifications will save you substantial time and money.”

Wiegers, Karl E., 2003, Software Requirements, 2nd Ed.


Summary



According to IIBA (1.6)

• Verification ensures that requirements are defined clearly enough to allow solution design and implementation to begin.
• Validation ensures that the stated requirements correctly and fully implement the business requirements.

IIBA, 2006, BABOK, Release 1.6


Differences

• Requirements verification is building the thing right.
• Requirements validation is building the right thing.


Questions?



References
• Cook, David A., 2002, “Verification & Validation of Requirements”, Software Technology Support Center. Online at www.sstc-online.org/Proceedings/2002/SpkrPDFS/ThrTracs/p961.pdf (Accessed December 2008).
• Gantthead.com, 2007, “Requirements Verification Checklist”. Online at www.gantthead.com/checklists/Requirements-Verification-Checklist.html (Accessed December 2008).
• Gottesdiener, Ellen, 2005, The Software Requirements Memory Jogger. Salem, NH: GOAL/QPC.
• Hass, Kathleen B., Don J. Wessels, and Kevin Brennan, 2008, Getting It Right: Business Requirement Analysis Tools and Techniques. Vienna, VA: Management Concepts, Inc.
• International Institute of Business Analysis, 2006, A Guide to the Business Analysis Body of Knowledge, Version 1.6. Toronto, ON: International Institute of Business Analysis.
• International Institute of Business Analysis, 2008, The Guide to the Business Analysis Body of Knowledge, Version 2.0 – Draft for Public Review. Toronto, ON: International Institute of Business Analysis.
• Wahab, Sammy, “Can I Have My Requirements and Test Them Too?”, Business Analyst Times. Online at www.batimes.com/content/view/321/1/ (Accessed December 2008).
• Wiegers, Karl E., 2003, Software Requirements, Second Edition. Redmond, WA: Microsoft Press.