
AUTOMATION ARCHITECTURE

TABLE OF CONTENTS

1 Introduction
2 Test Automation Architecture
  2.1 QTP Test Scripts
  2.2 Access Database
  2.3 Object Repository
  2.4 Library
  2.5 Test Director
  2.6 Reports
3 Quality Factors
  3.1 Maintainability
    3.1.1 Data Sensitive
    3.1.2 Behavior Sensitive
    3.1.3 User Interface Sensitive
  3.2 Reusability
  3.3 Simplicity
  3.4 Portability
  3.5 Reliability
  3.6 Usability
  3.7 Performance
4 Design Time Environment
  4.1 Folder Structure
5 Runtime Environment
6 Test Script Logical Flow
7 Coding Conventions
  7.1 Variable Naming Conventions
  7.2 Constant Naming Conventions
  7.3 Avoid usage of excessive wait statements
  7.4 Use With and End With statement
  7.5 Use Option Explicit at the beginning of the test script
  7.6 Use regular expressions whenever appropriate
  7.7 Exception and error handling
  7.8 Indent blocks of code for readability
  7.9 Avoid hard coding testing environment dependencies
  7.10 Avoid MsgBox while printing the results
  7.11 Use Close All Browsers function
  7.12 Script header
  7.13 Function or reusable script header
  7.14 Divide the script into blocks
  7.15 Comment your code to make it more readable
  7.16 Verify each page in the flow
8 Scripting Conventions
  8.1 Test Scripts
  8.2 Functions
  8.3 Reusable Scripts
  8.4 Repository Files
9 Best Practices

Document Revisions
Date Version Description Author

1 Introduction
The intent of this document is to capture information about the technical intricacies of building the test automation framework. It can be used as a reference guide by those involved in test suite maintenance and by engineers who require technical know-how about the test automation framework. It also provides coding conventions and scripting conventions to standardize coding style across all scripts. This document discusses the following aspects:

- Test Automation Architecture
- Quality Factors
- Design Time Environment
- Runtime Environment
- Test Script Logical Flow
- Coding Conventions
- Scripting Conventions
- Best Practices
- Sample Templates

2 Test Automation Architecture


The block diagram below shows the architecture at a high level. The architecture framework consists of the following key items:

1) QTP Test Scripts
2) Access Database
3) Object Repository
4) Library
   a. Business
   b. Generic
5) Test Director
6) Reports

2.1 QTP Test Scripts
The test scripts replace the manual test cases already written in Test Director. The steps in the manual test cases are eliminated, added or modified to suit automation. These scripts use reusable components from the library (Generic, Business) and read current object information from the Object Repository. They take their test data from the Access database tables.

2.2 Access Database
For all test scripts, the test data is maintained in a centralized area in the form of an MS Access database. The architecture relies on a data-driven approach to get the data into the scripts, which requires test data to be maintained outside the test logic. Based on the query given in the test script, the data is retrieved from the database. The test data required for each functionality is identified and placed in separate tables of the database. Each script retrieves its data by firing the appropriate SQL using the ADODB infrastructure.

2.3 Object Repository
The Object Repository acts as an interface between the QTP scripts and the application objects. All scripts in the framework interact with the Object Repository, which is shared across the test suite. A shared Object Repository is used in the current architecture and is maintained in a centralized area, so all add, delete, append and modify operations are done in a single place, and any update to the repository takes effect in all scripts in the test suite.

2.4 Library
The library helps avoid code duplication across the test suite and allows functions and tasks to be reused. It is divided into Generic and Business libraries.
a) Generic: The generic library incorporates canonical functions and procedures of the framework that are reused by all test scripts in the architecture. These generic functions can also access the Object Repository and the database tables.
b) Business: The business library contains functions and procedures related to business tasks and functions of the application. These business functions can also access the Object Repository and the database tables.
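The data-driven retrieval described for the Access database can be sketched as follows. This is a sketch only: the database path, table name and column names are illustrative assumptions, not part of the framework definition.

```vbscript
' Sketch only: the connection string, table and column names are assumed
' for illustration; the actual framework may use different ones.
Dim objConn, objRS, strSQL, strUserName, strPassword
Set objConn = CreateObject("ADODB.Connection")
objConn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
             "Data Source=C:\AutomationSuite\Data\TestData.mdb"

' Fetch only the rows needed by the current script in a single query.
strSQL = "SELECT UserName, Password FROM tblLoginData WHERE TestCaseID = 'TC_001'"
Set objRS = objConn.Execute(strSQL)

If Not objRS.EOF Then
    ' Assign record set data to script variables, then clean up.
    strUserName = objRS.Fields("UserName").Value
    strPassword = objRS.Fields("Password").Value
End If

objRS.Close
objConn.Close
Set objRS = Nothing
Set objConn = Nothing
```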

2.5 Test Director
Test Director is the central repository used to maintain and execute all test scripts, object repository files, libraries (generic, business), database tables and reports. It is used to drive the automation scripts and generate reports, and it is the test management tool that keeps all scripts organized in a structured way.

2.6 Reports
The architecture utilizes the API provided by the test execution tool to report test status information. The framework also enables configurable test logging to the database, utilizing the environment management abstraction.

3 Quality Factors
The overall design is done to meet the following quality factors:

1) Maintainability
2) Reusability
3) Simplicity
4) Portability
5) Reliability
6) Usability
7) Performance

3.1 Maintainability
Maintainability is the ease with which a script can be corrected if an error is encountered, adapted if its environment changes, or enhanced if there is a requirement change. The design is done in such a way that it meets these criteria. Maintainability can be discussed with respect to the following sensitivities experienced by the architecture.

3.1.1 Data Sensitive
Sensitivity with respect to test data arises from the following aspects.

Database Design
A two-layer indirection is maintained in the framework in order to insulate the test script logic from changes in the database design. Database access from the test script is wrapped by a database query. The architecture utilizes the Microsoft ADO infrastructure to interact with the relational database, which enables the framework to interact with various databases independent of vendor.

Test Data
Test data maintenance is managed using the export and import utilities provided by the database.

Prerequisite Data
The current architecture handles prerequisite data by assuming that the data is present before running the test suite. This is managed through the test documentation provided in the test management tool. Only in a few instances is prerequisite data calculated or computed, either from the application under test or using system APIs.

3.1.2 Behavior Sensitive
Any change in common functionality of the system affects the test suite. The architecture's common library concept limits the impact of such changes to rework in a single place.

3.1.3 User Interface Sensitive
User interface sensitivity is handled by the test execution tool's features. All user interface object information is wrapped in the object repository, which shields the test script logic from changes in physical descriptions, minimizing the impact of change. To handle client-side error and warning messages, a separate message table is constructed to minimize the impact of changes in error message descriptions. This shields the test scripts from minor changes in user error or warning descriptions.

3.2 Reusability
The scripts are designed so that existing code segments can be reused in other sections of the test suite without having to rewrite the same code again. Constant effort is spent while creating the test suite to identify reusable chunks of code. The reusable code is classified into two categories: canonical components are placed in the generic library, and business-oriented components are placed in the business library.

3.3 Simplicity
Script execution is not very complex, so a moderately experienced user can use the scripts to execute the automated tests. Complex loops and logic are always prefixed or suffixed with the comments necessary to make the code segment easier to follow. To keep scripts simple, follow the coding conventions.

3.4 Portability
The current architecture integrates easily with the test management and test harness tool (Test Director). Users can execute automation test scripts easily from Test Director. The architecture is designed so that all test suite contents are packaged in a single folder and can be deployed anywhere in the Test Director folders. The standard folder structure is defined in section 4.1.

3.5 Reliability
Reliability can be discussed under the following sub-topics.

Recovery from failures
The framework currently handles predictable failures; the corrective action undertaken is to stop test execution.

Exception Handling
The architecture utilizes the exception-handling mechanism provided by the test execution tool to handle exceptions (e.g. security window, security alert, incorrect data entered).

Application Changes
For changes in the application, e.g. if an expected web table or edit field is unavailable, the framework utilizes the runtime exception thrown by the test tool to exit from the test.

3.6 Usability
Usability can be discussed under the following sub-topics.

Test Reporting
Crisp test status reporting is implemented throughout the framework for ease of test analysis, utilizing the infrastructure built into the framework. Query-based test reporting is also feasible if the database is queried for test status.

Documentation
Documentation is provided both for end users who want to execute the test suite (user's guide) and for users who want to maintain the test suite for future enhancements and bug fixes.

3.7 Performance
The scripts should be written so that they take a minimum amount of time and system resources for test execution. Some factors to be cautiously borne in mind are:

- Usage of synchronization points
- Using wait statements
- Loading a large amount of data into memory
- Proper checklists: checking only the most necessary properties

The following measures are implemented in the current architecture to improve the performance of the test suite:

- Avoid too many wait statements in test scripts.
- Set a global default wait time for all test scripts.
- Identify reusable components and use them efficiently.
- Get the required test data for the current script in a single query.
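As a sketch of the "global default wait time" measure above: QTP exposes its run settings through the Setting object. The 10-second value below is an illustrative assumption, not a framework-mandated figure.

```vbscript
' Set a global object-synchronization timeout (in milliseconds) once,
' instead of scattering hard-coded Wait statements through the scripts.
' The 10000 ms value is an example only.
Setting("DefaultTimeout") = 10000

' Prefer targeted synchronization over fixed waits, e.g.:
' Browser("Bank One").Page("wires").Sync
```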

4 Design Time Environment


Block diagram summarizing the design time environment

The test script is uploaded into the Test Plan tab of Test Director. Test data is maintained in the MS Access database. The required test data for the current script is obtained using a database query.

Test configuration settings are maintained in the test settings options provided in the tool. Library and Object Repository files are under source control.

4.1 Folder Structure
The current architecture uses the following folder structure.

Folder        Description/Comments
Data          Contains all test data in the form of Access database tables, accessible to all scripts and libraries in the suite.
Lib           Contains the generic and business library functions used across the test scripts. These library functions can access the Repository and the data tables.
Log           Used to store the log file information of all test scripts.
Repository    Contains complete application object (buttons, windows, list boxes) information with logical names. All test scripts and library files access this repository.
Test Scripts  All automated test scripts are stored here. These scripts can access Repository, Lib and Data.

5 Runtime Environment

- The test script is executed from the Test Lab of Test Director.
- Test data is retrieved from the MS Access database.
- For each row in the table, a test script instance is executed.
- Test status is recorded in the Test Director repository and in the database, based on the option selected in the configuration settings.
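The per-row execution described above can be sketched as a loop over an ADODB record set. The record set variable, column names and the RunOneIteration helper are hypothetical names used only for illustration.

```vbscript
' Assumes objRS is an open ADODB.Recordset holding the script's test data.
' RunOneIteration is a hypothetical helper standing in for the test logic.
Do Until objRS.EOF
    ' Each row of test data drives one execution of the test logic.
    RunOneIteration objRS.Fields("UserName").Value, _
                    objRS.Fields("Password").Value
    objRS.MoveNext
Loop
```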

6 Test Script Logical Flow


A typical test script flow is illustrated below:

1) Script header with all required data.
2) Initialize variables.
3) Get the current location of the script. Depending on the testing environment, use the script's current location to locate the database, Object Repository and library files.
4) Invoke the browser if needed, maximize the browser and log in to the application; initialize the connection with Test Director and get test run information from it.
5) Fetch test data. Retrieve the test data related to the script using a query and cache the information in a record set object. Assign the record set data to script variables and clean up the record set.
6) Execute test logic and test conditions. For each row in the table of test cases, the test script is executed.
7) Report observations. At each functional checkpoint, the current information from the application under test is retrieved and recorded in the log.
8) Clean up. Log out and minimize the browser.
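The flow above can be sketched as a skeleton script. Every identifier in it (paths, table names, step labels) is a placeholder assumption, not the framework's actual code.

```vbscript
'**********************************************************************
'Script Name: Project_Module_Login   (placeholder name)
'Description: Skeleton illustrating the logical flow; all identifiers
'             below are illustrative assumptions.
'**********************************************************************
Option Explicit
Dim oWshell, strRoot, objConn, objRS

' Locate the script and its dependencies relative to the current folder.
Set oWshell = CreateObject("WScript.Shell")
strRoot = oWshell.CurrentDirectory

' Fetch test data into a record set (connection details as in section 2.2).
Set objConn = CreateObject("ADODB.Connection")
objConn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
             strRoot & "\Data\TestData.mdb"
Set objRS = objConn.Execute("SELECT * FROM tblLoginData")

' Execute the test logic once per data row and report at each checkpoint.
Do Until objRS.EOF
    ' ... drive the application under test here ...
    Reporter.ReportEvent micPass, "Login step", "Executed for one data row"
    objRS.MoveNext
Loop

' Clean up: close the record set and connection, then log out.
objRS.Close
objConn.Close
```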

7 Coding Conventions
Coding conventions help to design code in a structured way. The main reason for using coding conventions is to standardize coding style across all scripts. Coding conventions can include the following:

- Naming conventions for objects, variables and procedures
- Commenting conventions
- Text formatting and indenting guidelines

7.1 Variable Naming Conventions
To enhance readability and consistency, use the following prefixes with descriptive names for variables in code.
Data Type     Prefix   Example
Boolean       bln      blnFound
Byte          byt      bytRasterData
Date (Time)   dtm      dtmStart
Double        dbl      dblTolerance
Error         err      errOrderNum
Integer       int      intQuantity
Long          lng      lngDistance
Object        obj      objCurrent
Single        sng      sngAverage
String        str      strFirstName

7.2 Constant Naming Conventions
Words in constant names are separated with the underscore (_) character. For example: CON_USER_LIST_MAX, CON_NEW_LINE

7.3 Avoid usage of excessive wait statements
Try to use the Sync, WaitProperty or Exist functions when waiting is required in a script. For example:

Browser("Bank One").Page("wires").Sync
Browser("Bank One").Page("wires").WebObject("OK").Exist

7.4 Use With and End With statement
The With statement allows you to perform a series of statements on a specified object without re-qualifying the name of the object. For example, to change a number of different properties on a single object, place the property assignment statements within the With control structure, referring to the object once instead of with each property assignment. The following example illustrates use of the With statement to assign values to several properties of the same object.

With Browser("Bank One").Page("wires")
    .WebList("From Account").Select "AS2000002"
    .WebButton("OK").Click
End With

7.5 Use Option Explicit at the beginning of the test script
When you use the Option Explicit statement, you must explicitly declare all variables using the Dim, Private, Public or ReDim statements. If you attempt to use an undeclared variable name, an error occurs.

Tip: use Option Explicit to avoid incorrectly typing the name of an existing variable, or to avoid confusion in code where the scope of the variable is not clear. The following example illustrates use of the Option Explicit statement.

Option Explicit   ' Force explicit variable declaration.
Dim MyVar         ' Declare variable.
MyInt = 10        ' Undeclared variable generates error.
MyVar = 10        ' Declared variable does not generate error.

7.6 Use regular expressions whenever appropriate
Use regular expressions to avoid multiple object instances in the object repository. For example:

Actual date format: 05/19/2004 (mm/dd/yyyy)
After regular expression: [0-1][0-9]/[0-3][0-9]/200[0-9]

7.7 Exception and error handling
Using exception handling, exceptions can be defined for situations where script execution may be halted. The user can create a function to accept the error code, perform an action and, if possible, restart the test execution (e.g. pressing the Enter key to close a popped-up window) and proceed with the script if possible or necessary.

7.8 Indent blocks of code for readability
Indentation makes the code easy to understand at a single glance. For example:
Do While intCount <= intItemsCount - 1
    With Browser("Bank").Page("One Net")
        .WebList("Bank").Select 3
        .WebButton("OK").Click
        .WebLink("Name").Click
    End With
    If strItem > 0 Then
        FindItems = strItem
        If strPageRT = "BUSINESS SUMMARY" Then
            Reporter.ReportEvent micPass, "Step1", "Pass"
        Else
            Reporter.ReportEvent micFail, "Step1", "Fail"
        End If
    End If
    intCount = intCount + 1
Loop

7.9 Avoid hard coding testing environment dependencies
Do not hard-code information which may change depending on the testing environment. This includes installation directories, DSN names, names of database servers, database usernames, database passwords and database tables. It is better to define these in variables at the beginning of the test, so you do not have to make multiple changes throughout the script to implement an environmental change. For example:
Set oWshell = CreateObject("WScript.Shell")
strCurrentTestPath = oWshell.CurrentDirectory
strRoot = Mid(strCurrentTestPath, 1, 8)
strLibName = strRoot & "Lib\CommonLib.vbs"
strDataSheetName = strRoot & "Data\Login.xls"

7.10 Avoid MsgBox while printing the results
Write results to the test results or to an output file in the script. Message boxes stop the script while it is running.

7.11 Use Close All Browsers function
Use this function at the end of each script: it helps the next test script start cleanly, without confusion.
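A minimal sketch of such a function, using QTP descriptive programming with the CreationTime ordinal identifier. The function name is an assumption; the framework's actual implementation may differ.

```vbscript
' Close every open browser window, oldest first, until none remain.
' Relies on QTP descriptive programming; CreationTime:=0 always matches
' the earliest remaining browser instance.
Public Function CloseAllBrowsers()
    While Browser("CreationTime:=0").Exist(0)
        Browser("CreationTime:=0").Close
    Wend
End Function
```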

7.12 Script header
The script header includes the following attributes: name, author, date, description, modified date, information about the data tables, parameters, and reusable functions.

'********************************************************************** 'Script Name: 'Author: 'Date of Design: 'Date of last Revision: 'Data Sheets: 'Functions Called: 'Description: '**********************************************************************

7.13 Function or reusable script header
The function header consists of the following attributes.

'********************************************************************* 'Function or Reusable script: 'Author: 'Date of Design: 'Date of last Revision: 'Purpose: 'Input Parameters: 'Output Parameters: '*********************************************************************

7.14 Divide the script into blocks
Divide the script function-wise into various blocks and give proper comments. This makes the script more reusable and easier to enhance in the future.

7.15 Comment your code to make it more readable
Please use comments.

7.16 Verify each page in the flow
Verify each page by using the Reporter object and display the result message in the Test Results.
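A sketch of such a page verification; the page names, timeout and step labels are illustrative assumptions following the examples earlier in this document.

```vbscript
' Verify that the expected page is displayed and report the outcome
' through the Reporter object. "Bank One" / "wires" follow the earlier
' examples; the step name and messages are placeholders.
If Browser("Bank One").Page("wires").Exist(10) Then
    Reporter.ReportEvent micPass, "Verify Wires page", "Page displayed as expected"
Else
    Reporter.ReportEvent micFail, "Verify Wires page", "Page not displayed"
End If
```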

8 Scripting Conventions

8.1 Test Scripts
The test script name:
- Must consist of letters, numbers and underscores (_).
- Must start with an upper-case letter. If more than one word is used, every word must start with an upper-case letter.
- Must be in the format Project_Module_Function_SubFunction or Project_Actor_Function_SubFunction.
- A scenario name must be in the format Project_SCN_Module_Function or Project_SCN_Actor_Function.

8.2 Functions
Function names:
- Must consist of letters only; numerals are not allowed.
- Must start with an upper-case letter. If more than one word is used, every word should start with a capital.
- Must stand for the functionality of the function.

8.3 Reusable Scripts
A reusable script name must consist of letters and start with an upper-case letter. The name should preferably be in the format Project_Module_Function or Project_Actor_Function.

8.4 Repository Files
The repository file name:
- Must consist of letters.
- Must be in the format Project_Module_Function.tsr or Project_Actor_Function.tsr.
- If the name of an object or window in the Object Repository is ambiguous, it must be renamed in the format ObjectType_Label.

9 Best Practices
- Indent the code to make it readable and understandable for others.
- Avoid hard-coding system-specific values such as folder locations; instead, set them relative to the scripts.
- Use synchronization functions instead of the Wait() function.
- Use the manual test case steps to drive scripting, and report all pass and failure results with the test case numbers.
- Use standard checkpoints for all static links, buttons and edit fields on every page.
- Use parameterization instead of hard-coded data to populate the AUT.
