
Contents

1.0 Using JMeter for Load Testing
2.0 JMeter Graphs
3.0 JMeter Report Dashboard
4.0 JMeter Distributed Testing
Revision No.  Description                      Prepared By  Reviewed By  Approved By
1.0           JMeter for load testing basics   Wilfred
1.1           JMeter Graphs                    Venkatesh
1.2           Report Dashboard                 Wilfred
1.3           Distributed testing              Baskar
1.0 Using JMeter for Load Testing

Step 1: Start JMeter by running the jmeter.bat file (on Windows).

Step 2: Create a Thread Group by right-clicking the Test Plan element as shown in the picture below.
Right-click Test Plan->Add->Threads (Users)->Thread Group
Step 3: Create an HTTP Request element by right-clicking the Thread Group element as shown in the
picture below. Right-click Thread Group->Add->Sampler->HTTP Request

Step 4: Record the NXT forum load test scenario. Add an HTTP Proxy Server element to the WorkBench
in the JMeter Test Plan as shown in the picture below.
Step 5: Set the proxy settings in the Firefox browser before recording the script, and set a blank
page to load on browser start-up, as shown in the pictures below.

Firefox browser->Tools->Options->Advanced->Network->Settings. Set the proxy IP as localhost and the
port as 8080 as shown in the picture below.
Step 6: Exclude .jpg, .JPEG, .css, .png, .gif and .js resources during recording as shown in the
picture below.

Select the HTTP Proxy Server element->URL Patterns to Exclude and add the following patterns to
exclude during script recording as shown in the picture below.

Click on Add and enter the patterns below, one per text box (each pattern has the form .*\.ext:
any characters followed by the literal extension):

.*\.jpg

.*\.gif

.*\.JPEG

.*\.png

.*\.css

.*\.js
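These exclude patterns are ordinary regular expressions matched against the full request URL. As a quick sanity check, a rough equivalent of the filtering can be exercised with `grep -E`; the URLs below are made-up examples, not from the recorded scenario:

```shell
# Combined form of the exclude patterns: any prefix, then a literal static-file extension.
pattern='.*\.(jpg|JPEG|gif|png|css|js)$'
# Hypothetical URLs used only to illustrate which requests would be filtered out:
for url in http://example.com/logo.png \
           http://example.com/app.js \
           http://example.com/forum/index.php; do
  if echo "$url" | grep -Eq "$pattern"; then
    echo "excluded: $url"
  else
    echo "recorded: $url"
  fi
done
```

Only the last URL survives the filter, so the recorded script stays focused on the actual page requests.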
Step 7: Click on the Start button in the HTTP Proxy Server element screen as shown below, then enter
the URL in the browser and start recording the scenario.
Step 8: After recording the scenario, stop the HTTP Proxy Server by clicking on the Stop button in
the proxy element as shown in the picture below.

Step 9: The recorded URLs are captured under the Thread Group, below the HTTP Request sampler
created in Step 3, as shown in the picture below.
Step 10: Add an HTTP Cookie Manager to the Thread Group as shown in the steps below.

Thread Group->Add->Config Element->HTTP Cookie Manager

Step 11: Add a View Results Tree element to the Thread Group as shown in the picture below.

Thread Group->Add->Listener->View Results Tree


Step 12: Steps to parameterize the login username and password. Add a CSV Data Set Config element
to the Thread Group. Thread Group->Add->Config Element->CSV Data Set Config
Step 13: Click on the CSV Data Set Config element and enter the variable names and the filename of
the login data.

Filename -> mem_no.txt is a text file that contains the usernames and is placed in the bin folder.

Variable Names -> username is the variable name that will replace the recorded login username.
Step 14: Replace the hardcoded username values with the variable name as shown in the picture below.
Recorded username -> Variable username

Instead of Distributor 251, use ${username}.
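To make the mechanism concrete: the data file holds one value per line, and JMeter reads the next line on each iteration and exposes it as ${username}. A rough sketch of that behaviour, with made-up sample values (the real file contents depend on your test data):

```shell
# Create a hypothetical mem_no.txt with one username per line
printf 'Distributor251\nDistributor252\nDistributor253\n' > mem_no.txt
# Simulate what each thread iteration would see as ${username}
while read -r username; do
  echo "iteration uses username=${username}"
done < mem_no.txt
```

With three lines in the file and Recycle on EOF enabled (the default), a fourth iteration would wrap around to the first value again.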


Step 15: Use a Regular Expression Extractor for the dynamic record id value on the View Draft page.
Add the Regular Expression Extractor to the URL as shown below.
Step 16: Steps to handle the dynamic record id.

Reference Name is the variable name that will replace the recorded value.

E.g. recorded record id 15167 -> ${temp1}: wherever 15167 appears in a request, we replace it with
${temp1}.

1. In this screen we have to set Response Field to Check to Headers, since we get the record id
from the headers and not from the response body.

2. The Apply To option should be Main sample and sub-samples.

3. The regular expression pattern defines the left and right boundaries. E.g. the record id is
captured in the line record=15167&parenttab, so we write:

record=(.+?)&parenttab
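The capture group in that pattern pulls out whatever sits between the left boundary record= and the right boundary &parenttab. The boundaries can be checked quickly outside JMeter; the header line below is a made-up example, and sed's regex syntax differs slightly from JMeter's (it has no lazy (.+?), so a negated class is used instead):

```shell
# Hypothetical response header fragment containing the dynamic record id
header='Location: /index.php?record=15167&parenttab=Drafts'
# Extract the value between "record=" and "&parenttab"
recordid=$(echo "$header" | sed -E 's/.*record=([^&]+)&parenttab.*/\1/')
echo "$recordid"
```

The extracted value is what JMeter would store in ${temp1} for use in later requests.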
Step 17: Replace the recorded record id values with the variable ${temp1} in all the URLs and
parameters as shown below.
Step 18: Run the JMeter script and verify the results in the View Results Tree listener.

Response Assertion:

To verify text in the response and save the status of the assertion in the Assertion Results
listener.

Step 1: Right-click on the Invoice grid sampler >> Add >> Assertions >> Response Assertion

Refer to the snapshot below.

Response Assertion: Add the response assertion text.

Step 2: Click on the Add button and add the text Filters in Patterns to Test.

Step 3: Select the option Text Response in Response Field to Test.

Step 4: Select the option Contains in Pattern Matching Rules.

Refer to the snapshot below.

Assertion Results: Pass result for the Invoice grid page in Assertion Results.
Assertion Results: Fail result for the Invoice grid page in Assertion Results.
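The Contains rule behaves like a substring/regex search over the response body: the assertion passes if the pattern appears anywhere in the selected response field. A rough stand-in for that check, using a made-up response body:

```shell
# Hypothetical response body from the Invoice grid page
response='<html><body><div class="grid">Filters: status, date</div></body></html>'
# "Contains" pattern matching: pass if the text appears anywhere in the body
if printf '%s' "$response" | grep -q 'Filters'; then
  echo 'assertion: PASS'
else
  echo 'assertion: FAIL'
fi
```

If the page failed to load (or returned an error page without the word Filters), the same check would report FAIL, which is what shows up in the Assertion Results listener.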

Save Responses to a File:

Right-click on the Thread Group >> Add >> Listener >> Save Responses to a File to add the listener.

Step 1: Select the option Save Failed Responses only.

Step 2: Add timestamp.

Step 3: Enter the filename prefix, e.g. c:\RESPONSES\ (create the folder in the C drive first, or
give any other path).

Step 4: After the test completes, the responses for the failed users are saved to the specified
path; refer to the snapshot below.
2.0 JMeter Graphs
Step 1: Start JMeter by running the jmeter.bat file (on Windows).
Step 2: Create a Thread Group by right-clicking the Test Plan element as shown in the picture below.
Right-click Test Plan->Add->Threads (Users)->Thread Group
Step 3: Create an HTTP Request element by right-clicking the Thread Group element as shown in the
picture below. Right-click Thread Group->Add->Sampler->HTTP Request
Step 4: Create a Graphs Generator listener after recording the test scripts for the application by
right-clicking the Thread Group element as shown in the picture below. Right-click Thread
Group->Add->Listener->Graphs Generator

Step 5: The paths must be given for the output folder and the JMeter result file.
Step 6: At least one graph must be present alongside the Graphs Generator to get all the graphs
automatically after completion of the load test. The graphs can be added manually through Right-click
Thread Group->Add->Listener->Hits per Second.

Step 7: After the execution of the load test, if the graphs are not generated automatically, go to
the command prompt and type the command below:

jmeter -t graph_jmeter.jmx -n -l C:\responses-results.csv -JTEST_RESULTS_FILE=C:\responses-results.csv

-t means use the following .jmx file.

-n means non-GUI mode.

-l means the location of the result file.


Step 8: The Graphs Generator listener generates the following graphs at the end of the test:

Active Threads Over Time

Response Times Over Time

Transactions per Second

Server Hits per Second

Response Codes per Second

Response Latencies Over Time

Bytes Throughput Over Time

Response Times vs Threads

Transaction Throughput vs Threads

Response Times Distribution

Response Times Percentiles


3.0 JMeter Report Dashboard:

Step 1: Open the JMeter GUI using the jmeter.bat file from the Apache JMeter 3.1 /bin folder.

Step 2: Create a test plan for the web application.

Step 3: Add Summary Report and Simple Data Writer from the Listeners.

Step 4: In the Simple Data Writer listener, specify the test result CSV file so that the results are
saved automatically after the test is run.
Step 5: In the Simple Data Writer, click on the Configure button and ensure the fields below are selected:

Save elapsed Time


Save Response message
Save success
Save sent byte count
Save Idle time
Save assertion results
Save Field names
Save Label
Save Thread Name
Save Assertion failure message
Save Active thread counts
Save Latency
Save Timestamp
Save Response code
Save Data type
Save Received byte count
Save Sub Results

Step 6: JMeter report generation configuration: copy the reportgenerator properties into user.properties.

Apache JMeter 3.1 ships a reportgenerator.properties file; if you open that file it displays all the
report generation configuration details. Simply copy the jmeter.reportgenerator configuration under
"Reporting configuration" into the user.properties file, as below, and save.
#---------------------------------------------------------------------------
# Reporting configuration
#---------------------------------------------------------------------------

# Sets the satisfaction threshold for the APDEX calculation (in milliseconds).
jmeter.reportgenerator.apdex_satisfied_threshold=500

# Sets the tolerance threshold for the APDEX calculation (in milliseconds).
jmeter.reportgenerator.apdex_tolerated_threshold=1500

# Sets the size of the sliding window used by percentile evaluation.
# Caution : higher value provides a better accuracy but needs more memory.
#jmeter.reportgenerator.statistic_window = 200000

# Configure this property to change the report title
jmeter.reportgenerator.report_title=Apache JMeter Dashboard

# Defines the overall granularity for over time graphs
jmeter.reportgenerator.overall_granularity=60000

# Response Time Percentiles graph definition
jmeter.reportgenerator.graph.responseTimePercentiles.classname=org.apache.jmeter.report.processor.graph.impl.ResponseTimePercentilesGraphConsumer
jmeter.reportgenerator.graph.responseTimePercentiles.title=Response Time Percentiles

# Response Time Distribution graph definition
jmeter.reportgenerator.graph.responseTimeDistribution.classname=org.apache.jmeter.report.processor.graph.impl.ResponseTimeDistributionGraphConsumer
jmeter.reportgenerator.graph.responseTimeDistribution.title=Response Time Distribution
jmeter.reportgenerator.graph.responseTimeDistribution.property.set_granularity=500

# Active Threads Over Time graph definition
jmeter.reportgenerator.graph.activeThreadsOverTime.classname=org.apache.jmeter.report.processor.graph.impl.ActiveThreadsGraphConsumer
jmeter.reportgenerator.graph.activeThreadsOverTime.title=Active Threads Over Time
jmeter.reportgenerator.graph.activeThreadsOverTime.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Time VS Threads graph definition
jmeter.reportgenerator.graph.timeVsThreads.classname=org.apache.jmeter.report.processor.graph.impl.TimeVSThreadGraphConsumer
jmeter.reportgenerator.graph.timeVsThreads.title=Time VS Threads

# Bytes Throughput Over Time graph definition
jmeter.reportgenerator.graph.bytesThroughputOverTime.classname=org.apache.jmeter.report.processor.graph.impl.BytesThroughputGraphConsumer
jmeter.reportgenerator.graph.bytesThroughputOverTime.title=Bytes Throughput Over Time
jmeter.reportgenerator.graph.bytesThroughputOverTime.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Response Time Over Time graph definition
jmeter.reportgenerator.graph.responseTimesOverTime.classname=org.apache.jmeter.report.processor.graph.impl.ResponseTimeOverTimeGraphConsumer
jmeter.reportgenerator.graph.responseTimesOverTime.title=Response Time Over Time
jmeter.reportgenerator.graph.responseTimesOverTime.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Latencies Over Time graph definition
jmeter.reportgenerator.graph.latenciesOverTime.classname=org.apache.jmeter.report.processor.graph.impl.LatencyOverTimeGraphConsumer
jmeter.reportgenerator.graph.latenciesOverTime.title=Latencies Over Time
jmeter.reportgenerator.graph.latenciesOverTime.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Response Time Vs Request graph definition
jmeter.reportgenerator.graph.responseTimeVsRequest.classname=org.apache.jmeter.report.processor.graph.impl.ResponseTimeVSRequestGraphConsumer
jmeter.reportgenerator.graph.responseTimeVsRequest.title=Response Time Vs Request
jmeter.reportgenerator.graph.responseTimeVsRequest.exclude_controllers=true
jmeter.reportgenerator.graph.responseTimeVsRequest.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Latencies Vs Request graph definition
jmeter.reportgenerator.graph.latencyVsRequest.classname=org.apache.jmeter.report.processor.graph.impl.LatencyVSRequestGraphConsumer
jmeter.reportgenerator.graph.latencyVsRequest.title=Latencies Vs Request
jmeter.reportgenerator.graph.latencyVsRequest.exclude_controllers=true
jmeter.reportgenerator.graph.latencyVsRequest.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Hits Per Second graph definition
jmeter.reportgenerator.graph.hitsPerSecond.classname=org.apache.jmeter.report.processor.graph.impl.HitsPerSecondGraphConsumer
jmeter.reportgenerator.graph.hitsPerSecond.title=Hits Per Second
jmeter.reportgenerator.graph.hitsPerSecond.exclude_controllers=true
jmeter.reportgenerator.graph.hitsPerSecond.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Codes Per Second graph definition
jmeter.reportgenerator.graph.codesPerSecond.classname=org.apache.jmeter.report.processor.graph.impl.CodesPerSecondGraphConsumer
jmeter.reportgenerator.graph.codesPerSecond.title=Codes Per Second
jmeter.reportgenerator.graph.codesPerSecond.exclude_controllers=true
jmeter.reportgenerator.graph.codesPerSecond.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# Transactions Per Second graph definition
jmeter.reportgenerator.graph.transactionsPerSecond.classname=org.apache.jmeter.report.processor.graph.impl.TransactionsPerSecondGraphConsumer
jmeter.reportgenerator.graph.transactionsPerSecond.title=Transactions Per Second
jmeter.reportgenerator.graph.transactionsPerSecond.property.set_granularity=${jmeter.reportgenerator.overall_granularity}

# HTML Export
jmeter.reportgenerator.exporter.html.classname=org.apache.jmeter.report.dashboard.HtmlTemplateExporter

# Sets the destination directory for generated html pages.
# This will be overridden by the command line option -o
jmeter.reportgenerator.exporter.html.property.output_dir=/tmp/test-report

Step 7: JMeter report generation configuration: copy the SAVE SERVICE configurations into the
user.properties file as well, and save.

#---------------------------------------------------------
# SAVE SERVICE Configurations
#---------------------------------------------------------
jmeter.save.saveservice.bytes = true
jmeter.save.saveservice.label = true
jmeter.save.saveservice.latency = true
jmeter.save.saveservice.response_code = true
jmeter.save.saveservice.response_message = true
jmeter.save.saveservice.successful = true
jmeter.save.saveservice.thread_counts = true
jmeter.save.saveservice.thread_name = true
jmeter.save.saveservice.time = true
jmeter.save.saveservice.print_field_names=true
jmeter.save.saveservice.data_type=true
jmeter.save.saveservice.subresults=true
jmeter.save.saveservice.assertions=true
jmeter.save.saveservice.idle_time=true

# The timestamp format must include the time and should include the date.
# For example the default, which is milliseconds since the epoch:
#jmeter.save.saveservice.timestamp_format = ms
# Or the following would also be suitable
jmeter.save.saveservice.timestamp_format = dd/MM/yyyy HH:mm
# save service assertion
jmeter.save.saveservice.assertion_results_failure_message = true

Step 8: Execute the test plan; once the test is completed, the Simple Data Writer generates the test
results file at the specified location as below.

Step 9: Generate the dashboard from the results file on the command line (run from the JMeter bin
folder):

jmeter -g <path to results csv file> -o <output folder>

Step 10: After executing the above command, the message "Writing log file to ..." is displayed in
the cmd prompt on successful generation. The dashboard result graphs are generated in the HTML
output folder as displayed below. Click on index.html to view all graphs in a single dashboard.

Dashboard graphs
Refer: https://learn-jmeter.blogspot.in/2016/10/how-to-generate-jmeter-report-dashboard.html

4.0 JMeter Distributed Testing

First Step: System configuration

Go to each slave system's jmeter/bin directory and execute the jmeter-server.bat file.

On Windows, a slave machine with the IP address 192.168.0.10 looks like the figure given below.

On the master system, visit the /bin directory, edit the jmeter.properties file, and add the slave
machine IPs as shown below:
Second Step: Run the test

Open the test plan in the JMeter GUI on the master machine.

Click on Run in the menu bar, then choose Remote Start -> select the IP address of the slave machine,

or click on Remote Start All to run the test on all slave IPs.
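The master-side change boils down to one property. A sketch of the edit, using the example IP above (the file is created locally here just to illustrate; on a real master you would edit the existing jmeter/bin/jmeter.properties, and multiple slaves are comma-separated):

```shell
# Register the slave IP in the master's jmeter.properties
echo 'remote_hosts=192.168.0.10' >> jmeter.properties
grep '^remote_hosts' jmeter.properties

# The same run can also be driven without the GUI, e.g.:
#   jmeter -n -t testplan.jmx -R 192.168.0.10 -l results.csv   (specific slaves)
#   jmeter -n -t testplan.jmx -r -l results.csv                (all remote_hosts)
```

The -r flag corresponds to Remote Start All in the GUI, while -R targets an explicit list of slaves.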

[Sample graphs from the distributed run: Bytes Throughput Over Time, Hits Per Second, Response Codes
Per Second, Throughput Vs Threads, Transactions Per Second, Response Time Graph]
