
Data Warehouse Testing

Agenda

‡ What is a Data Warehouse?

‡ Challenges in Data Warehouse Testing

‡ Testing Goals

‡ Best Practices

‡ Types of Testing

‡ Exit Criteria

What is a Data Warehouse?
"A data warehouse is a subject oriented, integrated, time variant, non volatile
collection of data in support of management's decision making process".
   
Challenges in Data Warehouse Testing

‡ Data selection from multiple source systems, and the analysis that follows, pose a
great challenge.

‡ The volume and complexity of the data.

‡ Inconsistent and redundant data in a data warehouse.

‡ Loss of data during the ETL process.

‡ Non-availability of a comprehensive test bed.

‡ The data is critical to the business.

Testing Goals

Our main aim is to check the quality of the data.

‡ Data completeness. Ensures that all expected data is loaded (a minimal completeness
check is sketched after this list).

‡ Data transformation. Ensures that all data is transformed correctly according to
business rules and/or design specifications.

‡ Data quality. Ensures that the ETL application correctly rejects, substitutes default
values, corrects or ignores, and reports invalid data.
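As an illustration of the completeness goal, the following is a minimal sketch in Python. It uses an in-memory SQLite database as a stand-in for the real source and warehouse connections, and the table and column names (src_orders, tgt_orders, amount) are hypothetical; in practice the counts would be compared across two different systems.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 100.0), (2, 250.5);
    INSERT INTO tgt_orders VALUES (1, 100.0), (2, 250.5);
""")

def completeness_check(table_src, table_tgt):
    # Row counts must match: all expected data was loaded.
    src_count = conn.execute(f"SELECT COUNT(*) FROM {table_src}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {table_tgt}").fetchone()[0]
    # A simple aggregate acts as a cheap checksum on a numeric column.
    src_sum = conn.execute(f"SELECT SUM(amount) FROM {table_src}").fetchone()[0]
    tgt_sum = conn.execute(f"SELECT SUM(amount) FROM {table_tgt}").fetchone()[0]
    assert src_count == tgt_count, f"row counts differ: {src_count} vs {tgt_count}"
    assert src_sum == tgt_sum, f"checksums differ: {src_sum} vs {tgt_sum}"
    return "PASS"

print(completeness_check("src_orders", "tgt_orders"))  # PASS

The same pattern extends to per-column aggregates (SUM, MIN, MAX) on every numeric column, which catches truncated loads that a bare row count would miss.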
   

Best Practices

To perform testing better:

‡ Use of a Traceability Matrix to enable full test coverage of Business Requirements (a
minimal matrix is sketched after this list).

‡ In-depth review of Test Cases.

‡ Manipulation of Test Data to ensure full test coverage.

‡ Provision of appropriate tools to speed the process of Test Execution & Evaluation.

‡ Regression Testing
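As referenced above, a traceability matrix can be as simple as a mapping from requirement IDs to the test cases that cover them. The sketch below is a minimal illustration with hypothetical IDs; a real matrix would be maintained in the test management tool.

# Requirement -> covering test cases; IDs are hypothetical.
traceability = {
    "BR-001": ["TC-1", "TC-1.1"],
    "BR-002": ["TC-2"],
    "BR-003": [],                  # not yet covered
}

# Flag any business requirement that no test case covers.
uncovered = [req for req, cases in traceability.items() if not cases]
print("Requirements without test coverage:", uncovered)  # ['BR-003']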
   

Types of Testing

The following are the types of testing performed for data warehousing projects:

‡ Unit Testing

‡ Integration Testing

‡ Technical Shakedown Testing

‡ System Testing

‡ Operation Readiness Testing

‡ User Acceptance Testing

‡ Regression Testing
      

Integration Testing

Its major objective is to verify the data produced and to validate the design.
Entry Criteria

‡ Implementation Checklist for the move from development to test.

‡ All unit testing completed and summarized.

‡ Migration to the test environment from the development environment.

‡ Data available in the test environment.

Objectives

‡ Validate the business requirements and functional requirements.

‡ Validate the data against the business rules, confirm that the correct number of rows
is transferred, and verify load volumes.

‡ Ensure the mapping order is correct and dependencies among workflows are in place.

‡ Validate that target tables are populated with the correct number of records.

‡ Check for error log messages in the appropriate file.

‡ Check for restarting of jobs in case of failures.

‡ Validate the execution of workflows and the data at the following stages (see the
sketch after this list):

Source to Staging
Staging to ODS
ODS to Data Mart

‡ Verify integration of new mappings with existing mappings.

‡ Validate proper functionality of mapping variables and parameter files.

‡ Test the individual mappings to verify the transformations, and also test at the
workflow level.
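For the stage-wise validation referenced in the list above, the sketch below illustrates comparing load volumes across the Source-to-Staging, Staging-to-ODS and ODS-to-Data-Mart hand-offs. The table names are hypothetical and an in-memory SQLite database stands in for the real environments; where business rules legitimately reject rows, the assertion would be loosened to allow the expected difference.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_customer (id INTEGER);
    CREATE TABLE ods_customer (id INTEGER);
    CREATE TABLE dm_customer  (id INTEGER);
    INSERT INTO stg_customer VALUES (1), (2), (3);
    INSERT INTO ods_customer VALUES (1), (2), (3);
    INSERT INTO dm_customer  VALUES (1), (2), (3);
""")

# Count rows at each stage of the flow: Staging -> ODS -> Data Mart.
stages = ["stg_customer", "ods_customer", "dm_customer"]
counts = {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0] for t in stages}
print(counts)

# With no expected rejects, every stage should carry the same row count.
assert len(set(counts.values())) == 1, f"row counts diverge across stages: {counts}"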
  

Inputs

‡ Project Plan, Business Requirements document

‡ Test cases and steps

‡ Access to personal files on the network

‡ Executed and approved unit test cases or peer review reports

‡ Source to Target Matrices (STM)

‡ Extract and Load Order document

Note: The project manager is responsible for ensuring that all the input criteria are
completed by the appropriate project team member, as defined in the project Deliverables
Matrix, prior to each phase of testing.
Environment

‡ Integration testing is performed in the test environment.

 
Tools

‡ Data access tools (e.g., TOAD, PL/SQL) are used to analyze the content of tables and
the results of loads.

‡ ETL tools (e.g., Informatica, DataStage).

‡ Test management tools (e.g., Test Director, QC) that maintain and track the
requirements, test cases, defects and traceability matrix.
Completion Criteria

‡ Executed Integration Test Case documents, i.e., documented actual results against
each test, signed and dated by the tester(s).

‡ Signed and approved Test Case Index & Results document which contains results of
executed Integration test scripts.

‡ Updated Requirements Traceability Matrix


      
Reporting Test Results

The DW&BI team should use the Test Case Index and Results document to report the results
of testing. The document tracks the following fields (a minimal representation is
sketched after this list):

‡ Test Case Number: Enter a test case number in sequential outline format (e.g., 1, 1.1, 2, 2.1, 3).

‡ Description: Provide a brief description that covers each test case instance as fully as
possible.

‡ Requirement Number: List each requirement number that corresponds to the listed test
case number and briefly describe it.

‡ Criticality: Provide a relative criticality ranking for each test case instance (Low,
Medium, High).

‡ Result: Indicate each test case result (Pass [test case meets the acceptance criteria],
Fail [test case does not meet the acceptance criteria], Hold [test case requires
additional data for the result to be determined]).

‡ Software Process Report (SPR) #: For each failed test case, list the assigned SPR
number and briefly describe what caused the failure.

‡ SQL Script Name: Indicate the assigned SQL script name, as applicable.
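A minimal representation of one row of this document, assuming the fields above, might look like the following sketch; the example values are hypothetical.

from dataclasses import dataclass

@dataclass
class TestCaseResult:
    number: str          # sequential outline format, e.g. "1.1"
    description: str
    requirement: str     # corresponding requirement number
    criticality: str     # Low / Medium / High
    result: str          # Pass / Fail / Hold
    spr: str = ""        # SPR number, filled in for failed cases
    sql_script: str = "" # assigned SQL script name, if applicable

row = TestCaseResult(
    number="1.1",
    description="Verify target row count matches source after daily load",
    requirement="BR-002",
    criticality="High",
    result="Pass",
    sql_script="chk_row_counts.sql",
)
print(row)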
     

Technical Shakedown Testing

A Technical Shakedown Test will be conducted prior to System Testing.

Objectives
 

‡ Software has been configured correctly (including Informatica architecture, source
system connectivity and Business Objects).

‡ All the code has been migrated to the QA environment correctly.

‡ All required connectivity between systems is in place (a minimal connectivity check is
sketched below).
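The connectivity check referenced above could be scripted along these lines. The host names and ports are hypothetical placeholders for the real source systems, Informatica services and Business Objects server.

import socket

required_endpoints = {
    "source_db":        ("src-db.example.com", 1521),
    "informatica_node": ("infa-node.example.com", 6005),
    "bo_server":        ("bo.example.com", 6400),
}

# Confirm each required host/port is reachable before System Testing starts.
for name, (host, port) in required_endpoints.items():
    try:
        with socket.create_connection((host, port), timeout=5):
            print(f"{name}: OK")
    except OSError as exc:
        print(f"{name}: FAILED ({exc})")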


   

System Testing

System Testing is performed to prove that the system meets the Functional Specifications
from an end-to-end perspective.

The testing team will verify that the data in the source system databases and the data
in the target is consistent throughout the process.
Entry Criteria
‡ Finalized Implementation Checklist

‡ All integration testing should be completed

‡ Migration from the Test environment to the QA environment, as applicable

‡ Production configuration and data available

 
Inputs

‡ Project Plan, Business Requirements document

‡ System Test Cases and steps

‡ Updated Operations Manual

‡ Signed and approved integration Test Case Index, Test Case documents, and scripts
Objectives

‡ Verify the QA environment is an exact replica of Production prior to running the
system test.

‡ Run the end-to-end system test starting from the source databases to the target and
verify the data output.

‡ Record initialization and incremental load statistics

‡ Verify that the functionality of the system meets the business specifications.

‡ Verify error handling and reconciliation processes are functioning properly (a minimal
reconciliation sketch follows this list).
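The reconciliation step referenced above can be illustrated with a per-key aggregate comparison between source and target. This is a minimal sketch with hypothetical table names, again using in-memory SQLite as a stand-in for the two systems; the deliberate mismatch shows what a reconciliation report would flag.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_sales (region TEXT, amount REAL);
    CREATE TABLE dw_sales  (region TEXT, amount REAL);
    INSERT INTO src_sales VALUES ('EU', 100), ('EU', 50), ('US', 200);
    INSERT INTO dw_sales  VALUES ('EU', 150), ('US', 180);  -- US is off by 20
""")

# Compare per-region totals between the source and the warehouse table.
query = "SELECT region, SUM(amount) FROM {table} GROUP BY region"
src = dict(conn.execute(query.format(table="src_sales")))
tgt = dict(conn.execute(query.format(table="dw_sales")))

for region in sorted(set(src) | set(tgt)):
    s, t = src.get(region, 0), tgt.get(region, 0)
    status = "OK" if s == t else f"MISMATCH (source {s}, target {t})"
    print(region, status)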


Environment

‡ System testing is performed in the QA environment.

 
Tools

‡ Data access tools (e.g., TOAD, PL/SQL) are used to analyze the content of tables and
the results of loads.

‡ ETL tools (e.g., Informatica, DataStage).

‡ Test management tools (e.g., Test Director, QC) that maintain and track the
requirements, test cases, defects and traceability matrix.

Test Data
‡ Production replicated data
Completion Criteria

‡ Executed System Test Cases, i.e., documented actual results against each test,
signed and dated by the tester(s)

‡ Signed and approved Test Case Index & Results document which contains results of
executed system test scripts

‡ Requirements Traceability Matrix

‡ A summary report
User Acceptance Testing

‡ The objective of this testing is to ensure that the system meets the expectations of
the business users.

‡ It aims to prove that the entire system operates effectively in a production
environment and that the system successfully supports the business processes from a
user's perspective.

‡ The tests will also include functions that involve source system connectivity, job
scheduling and business report functionality.
Operation Readiness Testing (ORT)

This is the final phase of testing, which focuses on verifying the deployment of the
software and the operational readiness of the application.

Coverage

‡ Tests the deployment of the solution.

‡ Tests the overall technical deployment "checklist" and timeframes.

‡ Tests the security aspects of the system, including user authentication and
authorization, and user-access levels (a minimal check is sketched after this list).

‡ Tests the operability of the system, including job control and scheduling.
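The user-access check referenced above might be sketched as a comparison of deployed grants against an approved role matrix. The roles and permissions here are hypothetical.

# Approved role matrix vs. what is actually deployed; names are hypothetical.
approved = {
    "etl_operator": {"run_jobs", "view_logs"},
    "report_user":  {"view_reports"},
}
deployed = {
    "etl_operator": {"run_jobs", "view_logs"},
    "report_user":  {"view_reports", "run_jobs"},  # extra grant to flag
}

# Each role should hold exactly its approved permissions: no more, no less.
for role, granted in deployed.items():
    extra = granted - approved.get(role, set())
    missing = approved.get(role, set()) - granted
    if extra or missing:
        print(f"{role}: extra={sorted(extra)} missing={sorted(missing)}")
    else:
        print(f"{role}: OK")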
    

Regression Testing

‡ Performed after a reported defect is fixed by the developer.

‡ Performed when a Change Request is implemented on an existing production system (a
baseline-comparison sketch follows this list).

  

Inputs

‡ Impact analysis workbook prepared by the developer

‡ STM developed by the Business Analyst for designing the test cases for the CR.

‡ Test Result Report of SIT, if the Change Request is implemented on an existing
production system.
    

Exit Criteria

‡ Deadlines are reached, e.g. release deadlines or testing deadlines.

‡ Test cases are completed with a certain percentage passed.

‡ Test budget has been depleted

‡ Coverage of code or requirements reaches a specified point

‡ Defect rate falls below a certain level.

‡ Testing stops when it becomes unproductive, i.e. the number of errors found per person
per day keeps falling (a sketch of this criterion follows this list).
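The last criterion can be made concrete by tracking defects found per tester per day and stopping once the rate falls below an agreed threshold. The figures and the threshold in this sketch are hypothetical.

# Defects found each day by the whole team (hypothetical figures).
defects_per_day = [14, 11, 7, 4, 2, 1]
testers = 4
threshold = 0.5  # agreed cut-off in defects per tester per day

for day, found in enumerate(defects_per_day, start=1):
    rate = found / testers
    print(f"day {day}: {rate:.2f} defects per tester")
    if rate < threshold:
        print("Exit criterion met: testing can stop.")
        break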

 

