
Performance Testing

1. Requirements gathering:

There will be a requirements review meeting in which we will discuss the scope of the
application (navigations, web services), the changes made since the previous release, the
target volumes, the test data, and the types of tests.
Minutes of the meeting (MOM) covering the above points are mailed to everyone involved.
We will get the navigation flows and the test data from the app dev team.

2. Shakedown:

Once we get the snapshots for the navigations, we will start navigating the flows with the
test data. We will raise defects if the application is not working as per the requirement.
For web services we will mine the data from an internal tool. If we are unable to find the data
there, we will get it from the app dev team.

3. Scripting:

UI:

We will start recording the navigations once the application is working as per the requirement.
Enhance the recorded script by
a) Adding transactions for every activity done by the user
b) Parameterizing the hard-coded values
c) Correlating the dynamic values returned by the server
d) Adding check points (text checks)
e) Adding headers.
Debug the script and validate the test data. The enhancements are sketched below.
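
A minimal VuGen (web/HTTP protocol) sketch of such an enhanced script is below. The URL,
parameter names, and correlation boundaries are illustrative placeholders, not values from
the actual application; real boundaries come from the recorded server responses.

Action()
{
    // e) Add any header the server expects (hypothetical header/value)
    web_add_header("X-App-Version", "1.0");

    // c) Register the correlation BEFORE the request that returns the dynamic value
    web_reg_save_param("sessionId",
                       "LB=sessionId=",   // left boundary (assumed)
                       "RB=&",            // right boundary (assumed)
                       LAST);

    // d) Check point: the step fails if the expected text is missing
    web_reg_find("Text=Welcome", LAST);

    // a) Wrap each user activity in a transaction
    lr_start_transaction("01_Login");

    // b) Hard-coded credentials replaced with parameters
    web_submit_data("login",
                    "Action=http://example.com/login",   // placeholder URL
                    "Method=POST",
                    "Mode=HTML",
                    ITEMDATA,
                    "Name=user", "Value={username}", ENDITEM,
                    "Name=pass", "Value={password}", ENDITEM,
                    LAST);

    lr_end_transaction("01_Login", LR_AUTO);

    return 0;
}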

Web services:

We will take the XML request from the internal tool and build a script from it.
We then parameterize the script, add the check points, and validate the test data
(see the sketch below).
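
A hedged sketch of a web-service script built with web_custom_request is below. The
endpoint, the request body, and the {accountId} parameter are placeholders; the real XML
comes from the internal tool and the test data is fed in through parameters.

Action()
{
    // Check point: verify the response carries a success marker (assumed tag)
    web_reg_find("Text=<status>OK</status>", LAST);

    lr_start_transaction("WS_GetAccount");

    web_custom_request("GetAccount",
                       "URL=http://example.com/service",   // placeholder URL
                       "Method=POST",
                       "EncType=text/xml",
                       "Body=<request><accountId>{accountId}</accountId></request>",
                       LAST);

    lr_end_transaction("WS_GetAccount", LR_AUTO);

    return 0;
}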

4. Execution:

Upload all the scripts to Performance Center (PC).

Script validation: create a scenario with 1 Vuser for each script and run it for 10 minutes.
Calculate the pacing using the average response time from the validation run, as in the
worked example below.
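As an illustrative calculation (the numbers here are made up, not from the requirement):
pacing is (test duration in seconds x number of Vusers) / target iterations. To hit 3,600
iterations in a 1-hour test with 10 Vusers, each Vuser must complete 360 iterations, so the
pacing is 3600 / 360 = 10 seconds per iteration; the average response time from the
validation run must fit inside that 10-second window for the target rate to be achievable.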
Smoke test: create a scenario with the calculated pacing and the Vusers and run it for 30
minutes; this is called a dry run.
We will monitor response times in PC, CPU utilization and GC heap in the Wily monitoring
tool, and also check whether any backend calls are failing.
If everything is OK, we will recycle the servers and run the load test for 2 or 4 hours, as per
the requirements.
5. Report publish:

Publish an initial report with the achieved target volumes, the failed transaction rate (%),
the transaction summary, a comparison with the previous release results, and the
observations monitored during the test.
Publish a detailed report which includes information about backend errors, Wily reports, JFR
recordings, and application logs.
We will get the detailed analysis report from the performance engineering team.
We will re-run the load test if there is a load-balancing issue, high response times, or the
target was not met.

6. Results review meeting & Signoff:

In the results review meeting we will discuss the results, comparing them with the previous
release results, and obtain the approvals.

1. Requirements gathering
Requirements review meeting
A. Scope of testing
B. Target volumes
C. Environment
D. POC for test data and navigations
E. Type of test
F. Test window
G. Misc
Gathering flows and test data.

2. Shakedown
Navigating the flows
Defect raising and follow-up

3. Scripting

1. Recording
2. Enhancements
3. Replay and Debug
4. Validate Script with Different sets of Test data
5. Upload Scripts into Performance Center (PC)

1. Recording:
After checking the application flows manually, we will start recording the flow.
Pre-tasks before recording:

Know which protocol to use for recording.

Set the recording settings in VuGen.

Recording the flow produces a raw script like the illustrative fragment below.
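
The recorder (web/HTTP protocol here) emits plain steps with everything hard-coded and
no transactions or check points yet; the URL is a placeholder, not from the real application.

Action()
{
    // Raw recorded step: hard-coded URL, no transaction, no check point
    web_url("home",
            "URL=http://example.com/home",   // placeholder URL
            "Resource=0",
            "Mode=HTML",
            LAST);

    return 0;
}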

2. Enhancements:
A. Correlations
B. Using functions
C. Parameterization
D. Adding check points (text checks/image checks)
E. Think times, rendezvous points
F. Transactions
Items B, E, and F are sketched below; correlation, parameterization, and check points are
shown in the UI script sketch earlier.
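
A minimal sketch of a rendezvous point, a transaction, a think time, and an lr_ utility
function; the step name, URL, and {username} parameter are illustrative placeholders.

Action()
{
    // E. Rendezvous point: hold Vusers here so they hit the next step together
    lr_rendezvous("checkout_peak");

    // F. Transaction around the measured step
    lr_start_transaction("02_Checkout");

    web_url("checkout",
            "URL=http://example.com/checkout",   // placeholder URL
            "Mode=HTML",
            LAST);

    lr_end_transaction("02_Checkout", LR_AUTO);

    // E. Think time: simulate the user pausing (honored per the run-time settings)
    lr_think_time(5);

    // B. Using functions: log which data row this Vuser consumed
    lr_output_message("User row: %s", lr_eval_string("{username}"));

    return 0;
}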

3. Replay and debug:

Set up the run-time settings.

Run the script.

4. Validate the script with different sets of test data:

Place the data that needs to be validated.

Set up the parameter settings (see the sketch below).
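
As an illustration, assume a parameter {username} backed by a username.dat file with
"Select next row = Sequential" and "Update value on = Each iteration" in the parameter
settings; logging the row on each iteration makes a failing data set easy to trace.

Action()
{
    // Log which data row this iteration consumed (parameter name is assumed)
    lr_output_message("Iteration data: username=%s",
                      lr_eval_string("{username}"));

    // ... replay the business flow with this row's data ...

    return 0;
}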

5. Upload Scripts into Performance Center (PC)

Connect to Performance Center.

Save the script.

4. Execution
Calculate the pacing
Create scenario
Dry Run
Execution
Monitor the test

5. Report publish
Analyze the results
Find the bottlenecks
Publish the results.
