
Outline
— Testing
— System Integration Testing
— Integration testing
Big bang
Bottom up
Top down
Sandwich
— System testing
Functional
Performance
— Acceptance testing
— Case study

Testing
What is the overall goal of testing? What claims can we make
when testing "passes" or "fails"? Can we prove that our code
has no bugs?
— Testing: a systematic approach to find the errors in a system (to
  "falsify" the system)
— Accomplished by exercising defects in the system and revealing
  problems
— Failed test: an error was demonstrated
— Passed test: no error was found, so far
— Not used to show the absence of errors in software
— Does not directly reveal the actual bugs in the code
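These claims can be illustrated with a small sketch. The `clamp` function below is hypothetical and carries a deliberately seeded defect; the point is that a passing test only means no error was found for that input, while a failing test actually demonstrates an error.

```python
def clamp(value, low, high):
    """Intended behavior: restrict value to the range [low, high]."""
    if value < low:
        return low
    return value  # seeded bug: the upper bound is never enforced

# Passed test: no error was found, so far.
print(clamp(5, 0, 10) == 5)    # True

# Failed test: an error was demonstrated.
print(clamp(15, 0, 10) == 10)  # False -- the defect is revealed
```

Note that the first test passing says nothing about the second input: a test suite can only falsify, never prove the absence of bugs.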



System Integration Testing
— System Integration Testing (S.I.T.) is the testing of the
  sub-systems, as a whole, to ensure that they work as a system.
— It verifies the proper execution of software components and
  proper interfacing between components within the solution.
— For example, in the case of a lab thermostat, changes in the
  environment can affect the performance of the system.

Integration Testing

— The entire system is viewed as a collection of subsystems (sets
  of classes) determined during the system and object design.
— Goal: Test all interfaces between subsystems and the
  interaction of subsystems.
— The integration testing strategy determines the order in which
  the subsystems are selected for testing and integration.

Why Integration Testing?
— Unit tests only test the unit in isolation.
— Many failures result from faults in the interaction of subsystems.
— Often many off-the-shelf components are used that cannot be
  unit tested.
— Without integration testing the system test will be very time
  consuming.
— Failures that are not discovered in integration testing will be
  discovered after the system is deployed and can be very
  expensive.

Drivers and Stubs

— Driver:
  A component that calls the TestedUnit
  Controls the test cases
— Stub:
  A component the TestedUnit depends on
  Partial implementation
  Returns fake values
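A minimal sketch of both roles, assuming a hypothetical `BillingService` as the TestedUnit and a hypothetical `TaxService` dependency (none of these names come from the slides):

```python
class TaxServiceStub:
    """Stub: a partial implementation of the dependency the
    TestedUnit relies on; it returns fake values."""
    def rate_for(self, region):
        return 0.10  # canned rate instead of a real lookup

class BillingService:
    """The TestedUnit: computes a gross total via the tax service."""
    def __init__(self, tax_service):
        self.tax_service = tax_service

    def total(self, net, region):
        return net * (1 + self.tax_service.rate_for(region))

def driver():
    """Driver: calls the TestedUnit and controls the test cases."""
    billing = BillingService(TaxServiceStub())
    assert round(billing.total(100.0, "EU"), 2) == 110.0
    assert round(billing.total(0.0, "EU"), 2) == 0.0
    return "all test cases passed"
```

The driver sits above the TestedUnit and feeds it test cases; the stub sits below it and answers its calls with fake data.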

[Diagram: example layered call hierarchy of a spreadsheet
application, used to illustrate the integration strategies below]


Integration Testing Strategies

Common approaches to perform integration testing:
— Top-down: start with the top layer and work downward, using
  stubs in place of missing lower-layer subsystems
— Bottom-up: start with the lowest layer and work upward, using
  drivers in place of missing upper-layer subsystems
— Sandwich: combine top-down and bottom-up, converging on a
  target layer in the middle
— Big-bang: integrate and test all components at once

Module availability
A module must be available to be integrated.
A module is said to be available for combining with other
modules when it is ready.

Top-down Testing Strategy
— Test the top layer or the controlling subsystem first.
— Then combine all the subsystems that are called by the tested
  subsystems and test the resulting collection of subsystems.
— Do this until all subsystems are incorporated into the test.
— A special program is needed to do the testing, a test stub:
  A program or a method that simulates the activity of a
  missing subsystem by answering the calling sequence of
  the calling subsystem and returning back fake data.
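A test stub as described above can be sketched as follows; the `Controller` (top layer under test) and `Storage` (missing lower subsystem) names are hypothetical, chosen only for illustration:

```python
class StorageStub:
    """Test stub: simulates the missing Storage subsystem by
    answering the calling sequence and returning fake data."""
    def __init__(self):
        self.calls = []          # records the calling sequence

    def save(self, key, value):
        self.calls.append(("save", key, value))
        return True              # fake success

    def load(self, key):
        self.calls.append(("load", key))
        return "fake-value"      # fake data

class Controller:
    """Top-layer subsystem under test; would normally call the
    real Storage subsystem, which is not yet integrated."""
    def __init__(self, storage):
        self.storage = storage

    def update(self, key, value):
        self.storage.save(key, value)
        return self.storage.load(key)

ctrl = Controller(StorageStub())
result = ctrl.update("temperature", 21)
# the controller's interaction with the (stubbed) lower layer is
# observable via the recorded calling sequence in ctrl.storage.calls
```

Because the stub records every call, the test can check not just return values but the calling sequence the top layer produces.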


[Diagram: top-down integration order — the top layer is tested
first, then lower layers are added step by step until all
subsystems are included]

Pros and Cons of Top-down Integration Testing
 
— Test cases can be defined in terms of the functionality of the
  system (functional requirements).
— Writing stubs can be difficult: stubs must allow all possible
  conditions to be tested.
— Possibly a very large number of stubs may be required,
  especially if the lowest level of the system contains many
  methods.
— One solution to avoid too many stubs: modified top-down
  testing strategy
  Test each layer of the system decomposition individually
  before merging the layers
  Disadvantage of modified top-down testing: both stubs and
  drivers are needed

Bottom-up Testing Strategy
— The subsystems in the lowest layer of the call hierarchy are
  tested individually.
— Then the next subsystems are tested that call the previously
  tested subsystems.
— This is repeated until all subsystems are included.
— Drivers are needed.
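In bottom-up order the lowest layer exists first, so the driver stands in for the not-yet-integrated layers above it. A minimal sketch, with a hypothetical `Database` as the lowest-layer subsystem:

```python
class Database:
    """Lowest-layer subsystem: tested first in bottom-up integration."""
    def __init__(self):
        self._rows = {}

    def put(self, key, value):
        self._rows[key] = value

    def get(self, key):
        return self._rows.get(key)

def database_driver():
    """Driver: exercises the lowest layer before any higher-layer
    subsystem that would normally call it has been integrated."""
    db = Database()
    db.put("id-1", "alice")
    assert db.get("id-1") == "alice"
    assert db.get("missing") is None
    return "layer 1 ok"
```

Once the layer passes, the real calling subsystem replaces the driver and the next layer up is tested the same way.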


[Diagram: bottom-up integration order — the lowest-layer
subsystems are tested first, then their callers are added until
all subsystems are included]
Pros and Cons of Bottom-up Integration Testing
— Con:
  Tests the most important subsystem (user interface) last.
  Drivers needed.
— Pro:
  No stubs needed.
  Useful for integration testing of the following systems:
  x Object-oriented systems
  x Real-time systems
  x Systems with strict performance requirements

Sandwich Testing Strategy

— Combines the top-down strategy with the bottom-up strategy
— The system is viewed as having three layers:
  A target layer in the middle
  A layer above the target
  A layer below the target
  Testing converges at the target layer
— How do you select the target layer if there are more than 3
  layers?
  Heuristic: try to minimize the number of stubs and drivers


[Diagram: sandwich integration — top-down and bottom-up testing
proceed in parallel and converge at the target layer]

Pros and Cons of Sandwich Testing

— Top and bottom layer tests can be done in parallel.
— Does not test the individual subsystems thoroughly before
  integration.
— Solution: modified sandwich testing strategy

Big Bang Testing

— Bring all components together all at once; all
  interfaces are tested in one go.
— Entry criterion:
  All components have passed unit testing
— Exit criterion:
  Test suite passes
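The big-bang approach can be sketched as follows. The three components (`parse`, `total`, `report`) are hypothetical stand-ins for components that have each passed unit testing; the integration test then exercises all of their interfaces in one go:

```python
def parse(text):
    """Component A (unit tested in isolation): parse CSV of ints."""
    return [int(token) for token in text.split(",")]

def total(numbers):
    """Component B (unit tested in isolation): sum the numbers."""
    return sum(numbers)

def report(text):
    """Component C: wires A and B together into one feature."""
    return f"total={total(parse(text))}"

# Big-bang integration: one test suite covers every interface at once.
assert report("1,2,3") == "total=6"
```

The drawback is visible even in this toy: if the assertion fails, the fault could be in any of the three components or any interface between them, which is why the other strategies integrate incrementally.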


[Diagram: big-bang integration — all subsystems are combined and
tested at once]

Steps in Integration Testing
1. Based on the integration strategy, select a component to be
   tested. Unit test all the classes in the component.
2. Put the selected component together; do any preliminary fix-up
   necessary to make the integration test operational
   (drivers, stubs).
3. Do functional testing: define test cases that exercise all use
   cases with the selected component.
4. Do structural testing: define test cases that exercise the
   selected component.
5. Execute performance tests.
6. Keep records of the test cases and testing activities.
7. Repeat steps 1 to 7 until the full system is tested.

The primary goal of integration testing is to identify errors in
the (current) component configuration.

System Testing

— Functional testing
  Validates functional requirements
— Performance testing
  Validates non-functional requirements
— Acceptance testing
  Validates the client's expectations

 
Functional Testing
Goal: Test the functionality of the system
— Test cases are designed from the requirements analysis
  document (better: the user manual) and centered around
  requirements and key functions (use cases).
— The system is treated as a black box.
— Unit test cases can be reused, but new test cases have to
  be developed as well.
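A black-box functional test derives its cases from a requirement, not from the code's internals. In this sketch, `withdraw` and the banking requirement are hypothetical examples, not from the slides:

```python
def withdraw(balance, amount):
    """Hypothetical function under test, treated as a black box."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal")
    return balance - amount

# Test cases derived from the assumed functional requirement:
# "a customer may withdraw any positive amount up to the balance".
assert withdraw(100, 40) == 60        # normal use case
assert withdraw(100, 100) == 0        # boundary case
try:
    withdraw(100, 150)                # requirement-violating input
    failed = True
except ValueError:
    failed = False
assert not failed
```

Note that none of the cases reference how `withdraw` is implemented; they only check observable behavior against the stated requirement.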

Performance Testing

Goal: Try to violate non-functional requirements
— Test how the system behaves when overloaded.
— Try unusual orders of execution.
— Check the system's response to large volumes of data.
— What is the amount of time spent in different use
  cases?
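A minimal volume-and-timing sketch of this idea. The `process` function and the one-second budget are assumptions for illustration; a real non-functional requirement would come from the requirements document:

```python
import time

def process(records):
    """Hypothetical unit whose response time we measure."""
    return sorted(records)

records = list(range(200_000, 0, -1))   # large volume of data
start = time.perf_counter()
process(records)
elapsed = time.perf_counter() - start

# Assumed non-functional requirement: finish within 1 second.
assert elapsed < 1.0
```

Unlike a functional test, this one can pass or fail depending on load and hardware, which is exactly what performance testing is meant to probe.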

Types of Performance Testing
— Stress testing
  Stress the limits of the system
— Volume testing
  Test what happens if large amounts of data are handled
— Configuration testing
  Test the various software and hardware configurations
— Compatibility testing
  Test backward compatibility with existing systems
— Timing testing
  Evaluate response times and the time to perform a function
— Security testing
  Try to violate security requirements
— Environmental testing
  Test tolerances for heat, humidity, motion
— Quality testing
  Test reliability, maintainability & availability
— Recovery testing
  Test the system's response to the presence of errors or loss
  of data
— Human factors testing
  Test with end users


Acceptance Testing
— Goal: Demonstrate that the system is ready for operational use
  The choice of tests is made by the client
  Many tests can be taken from integration testing
  The acceptance test is performed by the client, not by the
  developer
— Alpha test:
  The client uses the software in the developer's environment.
  The software is used in a controlled setting, with the
  developer always ready to fix bugs.
— Beta test:
  Conducted in the client's environment (the developer is not
  present)
  The software gets a realistic workout in the target environment

The Four Testing Steps
1. Select what has to be tested
   Analysis: completeness of requirements
   Design: cohesion
   Implementation: source code
2. Decide how the testing is done
   Review or code inspection
   Proofs (Design by Contract)
   Black-box, white-box
   Select an integration testing strategy (big bang, bottom up,
   top down, sandwich)
3. Develop test cases
   A test case is a set of test data or situations that will be
   used to exercise the unit (class, subsystem, system) being
   tested or about the attribute being measured
4. Create the test oracle
   An oracle contains the predicted results for a set of test
   cases
   The test oracle has to be written down before the actual
   testing takes place.
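The oracle idea can be sketched as a table of predicted results, written down up front and then checked against the unit's actual output. The `add` function is a deliberately trivial, hypothetical unit:

```python
def add(a, b):
    """Unit under test (a trivial example)."""
    return a + b

# The test oracle: predicted results, written down before testing.
oracle = {
    (2, 3): 5,
    (-1, 1): 0,
    (0, 0): 0,
}

# Testing = comparing actual results against the oracle's predictions.
mismatches = [inputs for inputs, expected in oracle.items()
              if add(*inputs) != expected]
assert mismatches == []   # every test case matched the oracle
```

Keeping the oracle separate from the code prevents the common mistake of "predicting" results by running the implementation itself.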


Case Study: Journyx Timesheet

Problems

When Journyx originally released their software solution, Journyx Timesheet,
features were added week in and week out without any serious testing
considerations. The problems were:

1. A flow of bad features was added to their products without understanding
   the impact of those features.

2. Regression testing slowed down the release of new versions, and
   customers sometimes found severe defects.

3. They hired an expert to document their software QA testing procedures
   and to develop manual test scripts, but still had difficulty getting the
   desired results.

Solution

— The company then hired an expert team from TESTCo, which gave them
  suggestions to improve their software QA testing.
— TESTCo expanded and standardized Journyx's software QA testing
  coverage, which also helped speed up their development process;
  anyone with minimal training can run a test.
— TESTCo also automated a large portion of the test coverage, because of
  the number of platforms used by their customers.
— Thanks to the automated test coverage, daily reports of continuous
  regression testing and defects can be generated, which makes it easier
  to obtain the desired results and also eases the development process.

Results

— Testing cycles were cut from 96 business days to 32 business
  days.
— New platforms can easily be added and tested within 12 hours,
  which helps future development.
— With regression testing, they find zero critical defects
  when adding new features.
— They have avoided a lot of hidden costs.
— Any person can easily be trained by TESTCo's team within three
  months for the future improvement of the company.

References

i.   http://www.exforsys.com/tutorials/testing/integration-testing-whywhathow.html
ii.  http://www.exforsys.com/tutorials/testing/system-testing-whywhathow.html
iii. http://softwaretestingguide.blogspot.com
iv.  http://www.softwaretestingstuff.com/2008/12/integration-testing-four-step-procedure.html
v.   http://forum.onestoptesting.com/forum_posts.asp?TID=1494
vi.  http://www.testco.com/case-studies/journyx2-case-study_en.html