
Software Testing

By Prof. K. Adisesha, BE, MSc, M Tech, NET
Main Contents

Software Processes
What is software testing
Testing Levels
Testing Strategies
Activities of software testing

The software process
A structured set of activities required to develop a
software system
Specification;
Design;
Validation;
Evolution.
A software process model is an abstract representation of
a process. It presents a description of a process from some
particular perspective.

Generic software process models

The waterfall model
Separate and distinct phases of specification and development.
Evolutionary development
Specification, development and validation are interleaved.
Component-based software engineering
The system is assembled from existing components.
There are many variants of these models, e.g. formal
development, where a waterfall-like process is used but the
specification is a formal specification that is refined through
several stages to an implementable design.

Waterfall model
(Figure: the waterfall model, a cascade of phases in which requirements, design, implementation, testing and maintenance each feed into the next.)

Process activities
Software specification
Software design and implementation
Software validation
Software evolution

Software specification
The process of establishing what services are required
and the constraints on the system's operation and
development.
Requirements engineering process
Feasibility study;
Requirements elicitation and analysis;
Requirements specification (MS3);
Requirements verification (IA1, TA);
Requirements validation (Acceptance test)

The requirements engineering process
(Figure: the requirements engineering process, from feasibility study through requirements elicitation and analysis to requirements specification and validation, producing the requirements document.)

What is software testing
Software correctness
A program P is considered correct with respect to a specification S if and
only if:
For each valid input, the output of P is in accordance with the
specification S
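
Stated formally (a conventional rendering of the slide's definition; here D_S denotes the set of inputs that are valid under S):

```latex
\text{correct}(P, S) \iff \forall\, i \in D_S : \; P(i) \text{ is in accordance with } S
```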

Software is never correct, no matter which development
technique is used
Any software must therefore be verified and validated
(Verification and Validation)
Verification vs. validation
Verification: "Are we building the product right?"
The software should conform to its specification
Validation: "Are we building the right product?"
The software should do what the user really requires
V & V must be applied at each stage in the software
process
Two principal objectives
Discovery of defects in a system
Assessment of whether the system is usable in an operational
situation
Static and dynamic verification
Static: software inspections
Concerned with the analysis of the static system representation to
discover problems
May be supplemented by tool-based document and code analysis
Dynamic: software testing
Concerned with exercising and observing product behaviour
The system is executed with test data and its operational
behaviour is observed

Static and dynamic V&V
(Figure: static verification applies to the requirements specification, high-level design, formal specification, detailed design and program; dynamic validation exercises the prototype and the program.)
Program testing
Can reveal the presence of errors, but not their absence
A successful test is a test which discovers one or more errors
The only validation technique for non-functional requirements
Should be used in conjunction with static verification to provide
full V & V coverage

Types of testing
Defect testing
Tests designed to discover system defects.
A successful defect test is one which reveals the presence
of defects in a system.
Statistical testing
Tests designed to reflect the frequency of user inputs
Used for reliability estimation

V & V goals
Verification and validation should establish confidence that
the software is fit for purpose
This does not mean completely free of defects
Rather, it must be good enough for its intended use
The type of use will determine the degree of confidence that is
needed

Testing and debugging
Testing
Involves identifying bugs/errors/defects in the software without
correcting them.
Normally professionals with a quality assurance background are involved in
the identification of bugs.
Testing is performed in the testing phase.

Debugging
Involves identifying, isolating and fixing the problems/bugs.
Developers who code the software conduct debugging upon encountering an
error in the code.
Debugging is part of white box testing or unit testing.
Debugging can be performed in the development phase while conducting unit
testing, or in later phases while fixing reported bugs.

V & V planning
Careful planning is required to get the most out of testing
and inspection processes
Planning should start early in the development process
The plan should identify the balance between static
verification and testing
Test planning is about defining standards for the testing
process rather than describing product tests

The V-model of development
(Figure: the V-model. The descending branch runs from requirements specification through system specification, system design and detailed design to module and unit code and test; each stage drives a test plan (acceptance, system integration, sub-system integration), and the ascending branch runs sub-system integration test, system integration test, acceptance test and, finally, service.)
The structure of a software test plan
The testing process
Requirements traceability
Tested items
Testing schedule
Test recording procedures
Hardware and software requirements
Constraints

Testing Myths!
Testing is too expensive.
Testing is time consuming.
Testing cannot be started if the product is not fully developed.
Complete testing is possible.
If the software is tested, then it must be bug-free.
Missed defects are due to testers.
Testers should be responsible for the quality of a product.
Test automation should be used wherever possible, to reduce time.
Anyone can test a software application.
A tester's task is only to find bugs.
Testing Types
Manual Testing

Testing of the software manually, i.e. without using any automated tool or script.

Manual testing runs through the usual stages: unit testing, integration testing, system
testing and user acceptance testing.

Testers use test plans, test cases or test scenarios to ensure the completeness of testing.

Automation Testing

The tester writes scripts and uses other software (test automation tools) to test the software.

This automates what would otherwise be a manual testing process.

Automation testing is used to re-run, quickly and repeatedly, the test scenarios that were
performed manually.

It increases test coverage, improves accuracy, and saves time and money in comparison to
manual testing.
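
As a minimal sketch of what such an automated script looks like (using Python's standard unittest module; the login function is a stand-in defined here for illustration, not part of any real application):

```python
import unittest

def login(username, password):
    # Stand-in for the application logic under test
    # (an assumption for illustration only).
    return username == "alice" and password == "secret"

class LoginScenario(unittest.TestCase):
    """The same checks a manual tester would perform, now repeatable."""

    def test_valid_credentials_succeed(self):
        self.assertTrue(login("alice", "secret"))

    def test_invalid_credentials_fail(self):
        self.assertFalse(login("alice", "wrong"))

if __name__ == "__main__":
    unittest.main()
```

Once written, the whole scenario can be re-run after every change with a single command, which is where the coverage and time savings come from.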
Testing Methods
Black Box Testing

The tester interacts with the system's user interface, providing inputs and examining
outputs, without knowing how or where the inputs are processed.

Focuses on the functional requirements of the software.

White Box Testing

White box testing is the detailed investigation of the internal logic and structure of the code.

Also called glass box testing or open box testing.

The tester looks inside the source code to find out which unit or chunk of the code is
behaving inappropriately.

Grey Box Testing

A technique for testing the application with limited knowledge of its internal workings.

In software testing, "the more you know, the better" carries a lot of weight when testing
an application.
Comparison between the Three Testing Types

Black Box Testing
The internal workings of the application need not be known
Performed by end users and also by testers and developers
Testing is based on external expectations; the internal behaviour of the application is unknown
Also known as closed box testing, data-driven testing or functional testing

Grey Box Testing
Some knowledge of the internal workings is required
Performed by end users and also by testers and developers
Testing is done on the basis of high-level database diagrams and data flow diagrams
Also known as translucent testing, as the tester has limited knowledge of the insides of the application

White Box Testing
The tester has full knowledge of the internal workings of the application
Normally done by testers and developers
Internal workings are fully known, and the tester can design test data accordingly
Also known as clear box testing, structural testing or code-based testing
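
The black box/white box distinction is easiest to see on a concrete function. In this sketch (the shipping_cost function is illustrative, not from the slides), the black box check is chosen purely from the stated behaviour, while the white box checks are chosen by reading the code:

```python
def shipping_cost(weight_kg):
    """Flat rate below 10 kg, per-kg rate at or above it."""
    if weight_kg <= 0:
        raise ValueError("weight must be positive")
    if weight_kg < 10:
        return 5.0               # branch 1: flat rate
    return weight_kg * 0.75      # branch 2: per-kg rate

# Black box: derived from the specification alone, internals unseen.
assert shipping_cost(2) == 5.0

# White box: derived from the code, to hit the second branch and the
# boundary between the two branches.
assert shipping_cost(10) == 7.5
assert shipping_cost(20) == 15.0
```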
Levels of Testing
Functional Testing
Unit Testing.

Integration Testing
I. Bottom-Up Integration
II. Top-Down Integration

System Testing

System Integration Testing

Regression Testing

Acceptance Testing
I. Alpha Testing
II. Beta Testing

Non-Functional Testing
Performance Testing
I. Load Testing
II. Stress Testing
Usability Testing

Security Testing

Portability Testing
Unit Testing
Algorithms and logic
Data structures (global and local)
Interfaces
Independent paths
Boundary conditions
Error handling
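
A minimal sketch of a unit test exercising boundary conditions and error handling (unittest-based; the clamp function is illustrative, not from the slides):

```python
import unittest

def clamp(value, low, high):
    # Unit under test: restrict value to the range [low, high].
    if low > high:
        raise ValueError("low must not exceed high")
    return max(low, min(value, high))

class ClampTests(unittest.TestCase):
    def test_independent_path_inside_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_boundary_conditions(self):
        self.assertEqual(clamp(0, 0, 10), 0)    # lower bound
        self.assertEqual(clamp(10, 0, 10), 10)  # upper bound

    def test_error_handling(self):
        with self.assertRaises(ValueError):
            clamp(5, 10, 0)

if __name__ == "__main__":
    unittest.main()
```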
Why Integration Testing Is Necessary
One module can have an adverse effect on another
Subfunctions, when combined, may not produce the
desired major function
Individually acceptable imprecision in calculations may
be magnified to unacceptable levels
Interfacing errors not detected in unit testing may
appear
Timing problems (in real-time systems) are not
detectable by unit testing
Resource contention problems are not detectable by
unit testing
Top-Down Integration
1. The main control module is used as a driver, and stubs
are substituted for all modules directly subordinate to
the main module.
2. Depending on the integration approach selected
(depth or breadth first), subordinate stubs are replaced
by modules one at a time.
3. Tests are run as each individual module is integrated.
4. On the successful completion of a set of tests, another
stub is replaced with a real module.
5. Regression testing is performed to ensure that errors
have not developed as a result of integrating new
modules.
Top-down testing
(Figure: top-down test sequence. Module A is tested first with stubs in place of B and C; B and C are then integrated one at a time, and the growing test set T1 to T4 is re-run at each step.)
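
A tiny sketch of steps 1, 2 and 4 in code (illustrative Python; checkout and the tax functions are hypothetical):

```python
def tax_stub(amount):
    # Stub: stands in for the real subordinate module and returns a
    # fixed, known value.
    return 0.0

def checkout(total, tax_fn):
    # Main control module; the subordinate is passed in, so a stub can
    # be swapped for the real module one integration step at a time.
    return total + tax_fn(total)

assert checkout(100.0, tax_stub) == 100.0   # test driven from the top

def tax_real(amount):
    # Real module replacing the stub at the next integration step.
    return amount * 0.08

assert checkout(100.0, tax_real) == 108.0   # re-test after integration
```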
Problems with Top-Down Integration
Many times, calculations are performed in the modules at
the bottom of the hierarchy
Stubs typically do not pass data up to the higher modules
Delaying testing until lower-level modules are ready
usually results in integrating many modules at the same
time rather than one at a time
Developing stubs that can pass data up is almost as
much work as developing the actual module
Bottom-Up Integration
Integration begins with the lowest-level modules, which are
combined into clusters, or builds, that perform a specific
software subfunction
Drivers (control programs for testing) are written
to coordinate test case input and output
The cluster is tested
Drivers are removed and clusters are combined moving
upward in the program structure
Bottom-up testing
(Figure: bottom-up test sequence. Test drivers exercise the Level N modules first; the tested modules are then combined and new drivers exercise the Level N-1 modules above them.)
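
A tiny sketch of a driver in code (illustrative Python; parse_price is hypothetical):

```python
def parse_price(text):
    # Lowest-level module under test: converts "$12.50" to 12.5.
    return float(text.lstrip("$"))

def driver():
    # Driver: a temporary control program that feeds test input to the
    # low-level module and checks its output before any caller exists.
    cases = [("$12.50", 12.5), ("$0.99", 0.99)]
    for raw, expected in cases:
        result = parse_price(raw)
        assert result == expected, f"{raw!r}: got {result}"
    print("all low-level cases passed")

driver()
```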


Problems with Bottom-Up Integration
The whole program does not exist until the last module
is integrated
Timing and resource contention problems are not found
until late in the process
Validation Testing
Determine if the software meets all of the requirements
defined in the SRS
Having written requirements is essential
Regression testing is performed to determine if the
software still meets all of its requirements in light of
changes and modifications to the software
Regression testing involves selectively repeating existing
validation tests, not developing new tests
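
One common way to realise this selective re-running is to tag existing validation tests, for example with pytest markers (a sketch; validate_order is hypothetical, and the custom marker should be registered in pytest.ini to avoid warnings):

```python
import pytest

def validate_order(quantity):
    # Illustrative rule under test: 1 to 100 items per order.
    return 0 < quantity <= 100

@pytest.mark.regression
def test_quantity_upper_bound_still_enforced():
    # Existing validation test, selectively repeated after each change.
    assert not validate_order(101)

def test_new_feature_accepts_single_item():
    assert validate_order(1)

# Re-run only the tagged regression subset:
#   pytest -m regression
```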
Alpha and Beta Testing
It is best to provide customers with an outline of the
things you would like them to focus on, and with specific
test scenarios for them to execute.
Work with customers who are actively involved, and
commit to fixing the defects they discover.
Acceptance Testing
Similar to validation testing except that customers are
present or directly involved.
Usually the tests are developed by the customer
Test Methods
White box or glass box testing
Black box testing
Top-down and bottom-up for performing incremental
integration
ALAC (Act-like-a-customer)
Test Types
Functional tests
Algorithmic tests
Positive tests
Negative tests
Usability tests
Boundary tests
Startup/shutdown tests
Platform tests
Load/stress tests
Testing Documents

Testing Strategy

Test Plan

Test Scenario

Test Case

Traceability Matrix
Testing Documents - Testing Strategy
A test strategy is a statement of the overall approach to testing, written to meet
the business and test objectives

It is a plan-level document and has to be prepared in the requirements stage
of the project

It identifies the methods, techniques and tools to be used for testing

Components of the Test Strategy document:

I. Scope, objectives and critical success factors
II. Overall testing approach, management and other business initiatives
III. Test environment
IV. Test automation requirements
V. Testing measurements and metrics
VI. Defect reporting and tracking
VII. Project policies
VIII. Risk identification, mitigation and contingency plan
IX. Test deliverables
X. Specific document templates used in testing
XI. Change and configuration management
XII. Training plan
Testing Documents - Test Plan
The Test Plan is derived from the Product Description, the Software Requirement Specification (SRS), or Use Case
documents.

The focus of the document is to describe what to test, how to test, when to test, and who will do which test.

The Test Plan document is usually prepared by the Test Lead or Test Manager
Components of the Test Plan document:

I. Scope: features to be tested and features not to be tested
II. Test approach:
Test case identification/creation/modification
Test data preparation
Test environment, entrance and exit criteria
Test completion and validations
Test reporting
III. Testing objectives and goals
IV. Test phases: functional, regression, integration and business acceptance testing
V. Test environment
VI. Test schedules
VII. Resource requirements
VIII. Roles & responsibilities
IX. Project impact analysis: risks, assumptions and constraints
X. Test management and reporting
XI. Defect life cycle
XII. Test execution criteria: suspension, resumption and approval criteria
XIII. Test deliverables
Testing Documents - Test Scenario

Test scenarios are the high-level classification of test requirements, grouped according to
the functionality of a module

A test scenario represents a series of actions that are associated together

A test scenario may have one or more test cases associated with it


Testing Documents - Test Case
Test cases are the set of valid and invalid executable procedures of a test
scenario.
A test case with valid functionality is called a positive test case
A test case with invalid functionality is called a negative test case
A test scenario ensures that the whole business flow is covered and tested
end to end, while a test case ensures that a single step is covered
well

Typical Test Case format:

I. Test Case ID
II. Prerequisite (Test Data)
III. Test Case Description
IV. Expected Result
V. Actual Result
VI. Pass/Fail
VII. Comments
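
As a sketch, here is how one positive and one negative test case of the same scenario might look in code (unittest; the withdraw function is illustrative):

```python
import unittest

def withdraw(balance, amount):
    # Illustrative function under test.
    if amount <= 0 or amount > balance:
        raise ValueError("invalid amount")
    return balance - amount

class WithdrawalScenario(unittest.TestCase):
    """Scenario: cash withdrawal (groups the related test cases)."""

    def test_tc01_valid_withdrawal(self):
        # Positive test case. Prerequisite: balance of 100.
        # Expected result: 60 remains.
        self.assertEqual(withdraw(100, 40), 60)

    def test_tc02_overdraw_rejected(self):
        # Negative test case: invalid input must be rejected.
        with self.assertRaises(ValueError):
            withdraw(100, 150)

if __name__ == "__main__":
    unittest.main()
```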
Testing Documents - Traceability Matrix
The Requirements Traceability Matrix is a document that traces and maps user requirements
(usually requirement IDs from a requirement specification document) to test case IDs.

The purpose of this document is to make sure that every requirement is covered by test cases, so
that nothing is missed

Types of Traceability Matrices:

I. Forward Traceability: mapping of requirements to test cases.
II. Backward Traceability: mapping of test cases to requirements.
III. Bi-Directional Traceability: contains both forward and backward traceability.

Basic Traceability Matrix format:
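
An illustrative layout (the requirement and test case IDs below are hypothetical):

Requirement ID    Requirement Description    Test Case IDs     Status
REQ-001           User login                 TC-001, TC-002    Covered
REQ-002           Password reset             TC-003            Covered
REQ-003           Account lockout            (none yet)        Gap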


Software inspections
Involve people examining the source representation with
the aim of discovering anomalies and defects
Do not require execution of a system, so they
may be used before implementation
May be applied to any representation of the system
(requirements, design, test data, etc.)
A very effective technique for discovering errors
Many different defects may be discovered in a single
inspection
In testing, one defect may mask another, so several executions are required
Reuse of domain and programming knowledge
Reviewers are likely to have seen the types of error that commonly arise
Inspections and testing
Inspections and testing are complementary, not
opposing, verification techniques
Both should be used during the V & V process
Inspections can check conformance with a specification,
but not conformance with the customer's real requirements
Inspections cannot check non-functional characteristics
such as performance, usability, etc.

Program inspections
Formalised approach to document reviews
Intended explicitly for defect DETECTION (not
correction)
Defects may be
logical errors
anomalies in the code that might indicate an erroneous
condition (e.g. an uninitialized variable)
non-compliance with standards

Inspection checklists
A checklist of common errors should be used to
drive the inspection
The error checklist is programming-language
dependent
The 'weaker' the type checking, the larger the
checklist
Examples
Initialisation
Constant naming
Loop termination
Array bounds

Static analysis checks
Data faults: variables used before initialisation; variables declared but
never used; variables assigned twice with no intervening use; possible
array bound violations; undeclared variables
Control faults: unreachable code; unconditional branches into loops
Input/output faults: variables output twice with no intervening assignment
Interface faults: parameter type mismatches; parameter number
mismatches; non-usage of the results of functions; uncalled functions
and procedures
Storage management faults: unassigned pointers; pointer arithmetic
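
To make these fault classes concrete, here is a small fragment with the kind of anomalies a static analyser would flag (illustrative Python, not from the slides):

```python
def report(values):
    unused = 42        # data fault: assigned but never used
    total = 0
    for v in values:
        total += v
    return total
    print("done")      # control fault: unreachable code

def helper():
    return 1           # interface fault: function never called

result = report([1, 2, 3])
report([4, 5])         # interface fault: result of function not used
print(result)
```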
Stages of static analysis
Control flow analysis
Checks for loops with multiple exit or entry points, finds unreachable
code, etc.
Data use analysis
Detects uninitialized variables, variables written twice without an
intervening assignment, variables which are declared but never used,
etc.
Interface analysis
Checks the consistency of routine and procedure declarations and
their use

Stages of static analysis (2)
Information flow analysis
Identifies the dependencies of output variables
Does not detect anomalies, but highlights information for code
inspection or review
Path analysis
Identifies paths through the program and sets out the statements
executed in that path
Also potentially useful in the review process
Both these stages generate vast amounts of information
Handle with caution!

Cleanroom software development
The name is derived from the 'Cleanroom'
process in semiconductor fabrication. The
philosophy is defect avoidance rather than
defect removal
Software development process based on:
Incremental development
Formal specification
Static verification using correctness arguments
Statistical testing to determine program reliability

The Cleanroom process
(Figure: the Cleanroom process. The system is formally specified and divided into software increments; each increment is constructed as a structured program, formally verified, with error rework fed back, and then integrated. In parallel, an operational profile is developed and used to design the statistical tests run against the integrated system.)
Cleanroom process teams
Specification team
Responsible for developing and maintaining the system specification
Development team
Responsible for developing and verifying the software
The software is not executed or even compiled during this process
Certification team
Responsible for developing a set of statistical tests to exercise the
software after development
Reliability models are used to determine when reliability is acceptable

Cleanroom process evaluation
Results in IBM have been very impressive with
few discovered faults in delivered systems
Independent assessment shows that the
process is no more expensive than other
approaches
Fewer errors than in a 'traditional' development
process
Not clear how this approach can be transferred
to an environment with less skilled or less
highly motivated engineers

Thank You

Queries