Executing Load Test Programs

Introduction

Execution of a Load Test is started from the Project Navigator. The icon to the right of the *.class files will show a red arrow.


The icon to the right of the Job Template (*.xml) files will show a green arrow and will directly open a Start Load Test Job, which is covered towards the end of this page.


Load Test (*.class) files.

Execute Load Test Steps in the Project Navigator

After the load test program has been invoked from the Project Navigator, you must enter the test input parameters for the test run (a single execution of the load test program is also called a “test run”).

The most important parameters are the Number of Concurrent Users and Load Test Duration. For short-duration Load Tests, Apica recommends 100% URL sampling. We also recommend entering a small comment about the test run into the Annotation input field.

If evaluating for browser performance, please select any Browser Emulation and Caching Options needed.

Execute LT Input Fields

Field

Description

Save as template

Additionally stores all load test input parameters in an XML template. This template can later be used to rerun (repeat) the same load test.

Execute Test From

Selects the Exec Agent or the Exec Agent Cluster from which the load test will be executed.

Apply Execution Plan

Optionally, an Execution Plan can be used to control the number of users during the load test. The dropdown list shows the Execution Plans' Titles, extracted from all formally valid Execution Plan files (*.exepl files) located in the current Project Navigator directory. Note that the titles of invalid Execution Plan files are not shown. If an Execution Plan is selected, the following input parameters are disabled: Number of Concurrent Users, Load Test Duration, Max. Loops per User, and Startup Delay per User.

Number of Concurrent Users

The number of simulated, concurrent users.

Load Test Duration

The planned duration of the load test job. If this time has elapsed, all simulated users will complete their current loop (repetition of web surfing session) before the load test ends. Thus the load test often runs a little bit longer than the specified test duration.

Max. Loops per User

This limits the number of web surfing session repetitions (loops) per simulated user. The load test stops if the limit has been reached for each simulated user.

Note: this parameter can be combined with the parameter "Load Test Duration"; whichever limit is reached first stops the load test.
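For example, with a Load Test Duration of 600 seconds and Max. Loops per User set to 10, a user that completes its 10th loop after 400 seconds stops at that point, while a user that is still inside a loop when the 600 seconds elapse completes that loop and then stops.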

Pacing

Minimum loop duration per user. Enabling this option sets a minimum time that each iteration (all page breaks and URL calls executed within the iteration) must take before the next iteration starts.
If the iteration completes earlier than the pacing time, the user remains inactive until the pacing time has been reached.
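For example, with a pacing time of 60 seconds, a user whose iteration completes after 45 seconds remains inactive for another 15 seconds before starting the next iteration, whereas an iteration that takes 70 seconds is followed immediately by the next one.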

Startup Delay per User

The delay time to start an additional concurrent user (startup ramp of load). Used only at the start of the load test during the creation of all concurrent users.
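For example, 200 concurrent users with a startup delay of 500 milliseconds per user result in a ramp-up phase of roughly 200 × 0.5 s = 100 seconds until all users are running.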

Max. Network Bandwidth per User

The network bandwidth limitation per simulated user for the downlink (speed of the network connection from the webserver to the web browser) and the uplink (speed of the network connection from the web browser to the webserver). By choosing a lower value than "unlimited," this option allows simulating web users with a slow network connection.

Request Timeout per URL

The timeout (in seconds) per single URL call. If this timeout expires, the URL call will be reported as failed (no response from the webserver). Depending on the corresponding URL's configured failure action, the simulated user will continue with the next URL in the same loop, or it will abort the current loop and then continue with the next loop.

Max. Error-Snapshots

Limits the number of error snapshots taken during load test execution. The limit can be configured either as the maximum memory used to store error snapshots (recommended; for cluster jobs, the value applies across all cluster members) or as the maximum number of error snapshots per URL (not recommended for cluster jobs; the value applies per Exec Agent).

Statistic Sampling Interval

Statistic sampling interval during the load test in seconds (interval-based sampling). Used for time-based overall diagrams such as the measured network throughput.

If you run a load test over several hours, you must increase the statistic sampling interval to 10 minutes (600 seconds) to save memory. If the load test runs for only a few minutes, you may decrease the statistic sampling interval.
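For example, a 4-hour load test sampled every 600 seconds produces 4 × 3600 / 600 = 24 data points per time-based diagram, whereas the same test sampled every 15 seconds would produce 960 data points.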

Additional Sampling Rate per Page Call

Captures the measured response time of a web page each time a simulated user calls the page (event-based sampling). Used to display the response time diagrams in real time and in the Analyse Load Test Details menu.

For endurance tests over several hours, Apica strongly recommends setting the sampling rate for web pages between 1% and 5%. We recommend a 100% sampling rate for shorter tests.

Additional Sampling Rate per URL Call

Captures the measured response time of a URL each time a simulated user calls the URL (event-based sampling). Used to display the response time diagrams in real time and in the Analyse Load Test Details menu.

In addition to capturing the URL calls' response time, further data can be captured using one of the Add options.

--- (recommended): no additional data are captured.

Performance Details per Call: additionally, the TCP/IP socket open time (network establish time), the request transmit time, the response header wait time, the response header receive time, and the response content receive time of the URL calls are captured.

TCP/IP Client Data: additionally the (load generator) TCP/IP client address, the TCP/IP client port, the network client-socket create date, the reuse count of the client-socket (keep-alive), and the SSL session ID (for encrypted connections only) are captured. This option also includes the option "Performance Details per Call."

Resp. Throughput Chart per Call: additionally, in-depth throughput data of the received HTTP response content are captured and displayed as a chart (stream diagram of response). This option also includes the option "Performance Details per Call."

Request Headers: additionally, the request headers of all URL calls are captured. This option also includes the option "Performance Details per Call."

Request Content: additionally, the request content data of all URL calls are captured. This option also includes the option "Performance Details per Call."

Request Headers & Content: additionally, the request headers and request content data (form data) of all URL calls are captured. This option also includes the option "Performance Details per Call."

Response Headers: additionally, the response headers of all URL calls are captured. This option also includes the option "Performance Details per Call."

Response Headers & Content: additionally, the response headers and the response content data of all URL calls are captured. This option also includes the options "Performance Details per Call" and "Resp. Throughput Chart per Call".

All - But w/o Response Content: additionally, the request headers, the request content data, and the response headers of all URL calls are captured. This option also includes the options "Performance Details per Call" and "Resp. Throughput Chart per Call".

All - Full URL Snapshots: additionally, all data of the URL calls are captured. This option also includes the options "Performance Details per Call" and "Resp. Throughput Chart per Call".

Debug Options

Choosing any debug option (other than "none") causes additional information to be written to the *.out file of the load test job. The following debug options can be configured:

none - recommended
Recommended default value. Note that error snapshots are still taken, and therefore special debug options are normally not necessary to analyze a measured error.

debug failed loops
Writes the executed steps of all failed loops to the *.out file of the load test job.

debug loops
Writes the executed steps of all loops to the *.out file of the load test job.

debug headers & loops
Additionally writes debug information about all transmitted and received HTTP headers to the *.out file of the load test job.

debug content & loops
Additionally writes debug information about all transmitted and received content data to the *.out file of the load test job (excluding binary content data such as images).

debug cookies & loops
Additionally writes debug information about all transmitted and received cookies to the *.out file of the load test job.

debug keep-alive & loops
Additionally writes debug information about the behavior of re-used network connections to the *.out file of the load test job.

debug SSL handshake & loops
Additionally writes debug information about the SSL protocol and the SSL handshake to the *.out file of the load test job.

Additional Options

Several additional options for executing the load test can be combined by separating them with a blank character. The following additional options can be configured.
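As an illustration only, a hypothetical combination of several of the options described below (all values are illustrative) could look like this:

-multihomed -ipperloop -tconnect 10 -dnssrv 192.0.2.53,192.0.2.54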

-multihomed
Instructs all Exec Agents to use multiple local IP addresses when executing a load test. This option allows simulating traffic from more than one IP address per Exec Agent. It is only considered if the Exec Agent supports a multihomed network configuration (several IP addresses assigned to the same host). The first step to use this option is to configure multiple IP addresses for the same host in the Windows or Unix operating system. The second step is to assign these IP addresses to the Exec Agent configuration. For the localhost, where the Web Admin GUI is running, the second step can be done by calling the Setup menu inside the Project Navigator (gear-wheel icon in the top navigation). For remote Exec Agents, you have to edit the file javaSetup.dat, located inside the ZebraTester installation directory, by modifying the entry value javaVirtualIpAddresses: enter all IP addresses of the host on the same line, separated by commas.
The effect of this option is that each concurrent user uses its own client IP address during the load test. If fewer IP addresses are available than concurrent users are running, the IP addresses are distributed across the users (several users then share the same IP address).

-ipperloop
Using this option in combination with the option -multihomed causes a separate local IP address to be used for each executed loop rather than for each simulated user.
This option is considered only if the option -multihomed is also used.

-tconnect <seconds>
Set a timeout in seconds to open a TCP/IP socket connection to the Web server. If the time is exceeded, the URL call is aborted and marked as failed. Note that the value must be greater than zero but should be less than "Request Timeout per URL."

-dnshosts <file-name>
Causes the load test job to use its own DNS hosts file to resolve hostnames rather than the underlying operating system's hosts file.
Note that you have to ZIP the hosts file together with the compiled class of the script. To automate the ZIP, it is recommended to declare the hosts file as an external resource (without adding it to the CLASSPATH).
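A minimal sketch of such a hosts file, assuming the standard hosts file format (IP address followed by hostname, '#' for comments) and illustrative values:

# hosts file packed together with the compiled script
192.0.2.10   www.example.com
192.0.2.11   api.example.com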

-dnstranslation <file-name>
Causes the load test job to use a DNS translation file: a text file containing a translation between two DNS names per line. If the first DNS name in a line matches the DNS name passed to the resolver, then the second DNS name is used to resolve the IP address.
The first DNS name can also contain one or more wildcard characters ('*' = wildcard for multiple characters, '?' = wildcard for a single character). Lines, or the remainder of a line, can be commented out using the hash character '#'.
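A short sketch of such a translation file with illustrative hostnames, following the format described above (the two DNS names per line are assumed to be separated by whitespace):

# redirect production hostnames to the test environment
www.example.com       www.test.example.com
*.images.example.com  static.test.example.com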

-dnssrv <IP-name-server-1>[,<IP-name-server-N>]
Causes the load test job to use specific (own) DNS server(s) to resolve hostnames rather than the DNS library of the underlying operating system.
When using this option, at least one IP address of a DNS server must be specified. Multiple DNS servers can be configured, separated by commas. If a resolved DNS hostname contains multiple IP addresses, the stressed Web servers are called in round-robin order (user 1 uses resolved IP address no. 1, user 2 uses resolved IP address no. 2, etc.).
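For example, if a hostname resolves to three IP addresses, user 1 connects to address 1, user 2 to address 2, user 3 to address 3, and - following the round-robin order described above - user 4 would again use address 1.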

-dnsenattl
Enables consideration of DNS TTL by using the TTL values received from the DNS server(s).
This option cannot be used in combination with the option -dnsperloop.
Note: when using this option, the resolved IP addresses (and therefore the stressed Web servers) may change within a simulated user's executed loop at any time, even from one URL call to the next.

-dnsfixttl <seconds>
Enables DNS TTL by using a fixed TTL value of <seconds> for all DNS resolves. This option cannot be used in combination with the option -dnsperloop.

-dnsperloop
Perform new DNS resolves for each executed loop. All resolves are stable within the same loop (no consideration of DNS TTL within a loop).
This option cannot be used in combination with the options -dnsenattl or -dnsfixttl.
Note: consider when using this option that the default or the configured DNS servers are stressed more than usual because each simulated user's executed loop will trigger one or more DNS queries.

-dnsstatistic
Causes statistical data about DNS resolutions to be measured and displayed in the load test result, using the load generators' own DNS stack.
Note: there is no need to use this option if any other, more specific DNS option is enabled, because all other DNS options also implicitly cause statistical data about DNS resolutions to be measured. If you use this option without any other DNS option, the own DNS stack on the load generators will communicate with the operating system's default configured DNS servers - but without considering the "hosts" file.

-dnsdebug
Causes debug information about the DNS cache and DNS resolves to be written to the stdout file (*.out) of the load test job.

-enableIPv6 [<network-interface-name>]
Enables IPv6 support only for load test execution (IPv4 disabled). Optionally, you can also provide the IPv6 network interface name of the load generator(s), for example "eno".

-enableIPv6v4 [<network-interface-name>]
Enables IPv6 and IPv4 support for load test execution (IPv6 is tried first; if it fails, IPv4 is used). Optionally, you can also provide the IPv6 network interface name of the load generator(s), for example "eno".

-mtpu <number>
Allows configuring how many threads per simulated user are used to process URLs in parallel (simultaneously). Note: This value applies only to URLs that have been configured to be executed in parallel.

-nosdelayCluster
For Cluster Jobs, applies the Startup Delay per User per Exec Agent Job instead of across all simulated users of the Cluster Job. This allows a faster ramp-up of load.

-setuseragent "<text>"
Replaces the recorded value of the HTTP request header field User-Agent with a new value. The new value is applied for all executed URL calls.

-noECC
Disable elliptic curves (ECC).

-sslcache <seconds>
Alters the timeout of the user-related SSL cache. The default value is 300 seconds. A value of 0 (zero) disables the SSL cache.

-sslrandom <type>
Set the type of random generator used for SSL handshakes. Possible options are "fast", "iaik" (default) or "java".

-sslcmode
Apply SSL/HTTPS compatibility workarounds for deficient SSL servers. You may try this option if you constantly get the error type "Network Connection aborted by Server" for all URL calls.

-nosni
Disable support for TLS server name indication (SNI).

-snicritical
Set the TLS SNI extension as critical (default: non-critical).

-tlssessiontickets
Set the TLS to use Session Tickets for session resuming (non-critical).

-iaikLast
Adds the IAIK security provider at the last position (instead of the default: IAIK at first position).
Note: Adding the IAIK security provider at the last position may have the side effect that weak or short cipher keys are used.

-tz <timezone>
Sets an alternative time zone to be used by the script. The default time zone is the one selected when installing ZebraTester or, if modified subsequently, the one set in the Personal Settings menu. Possible time zone values are described in chapter 6 of the Application Reference Manual.

-Xbootclasspath/a:<path>
Specify for the load test job a path of JAR archives and ZIP archives to append to the default bootstrap classpath.

-Xbootclasspath/p:<path>
Specify for the load test job a path of JAR archives and ZIP archives to prepend in front of the default bootstrap classpath.

-Xmx<megabytes>
Specify for the load test job the size of the Java memory in megabytes. Do not enter a space or a colon between the "-Xmx" and the value.
Note: this option can only be used if the corresponding Exec Agent(s) supports this option, meaning that the Exec Agent(s) is started with the option -enableJobOverrideJavaMemory.
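For example, -Xmx2048 assigns 2048 megabytes of Java memory to the load test job; note that the value is written directly after -Xmx, with no space or colon in between.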

-nostdoutlog
Disables writing any data to the stdout file of the load test job.

SSL: specifies which HTTPS/SSL protocol version should be used:

All: Automatic detection of the SSL protocol version. ZebraTester prefers the TLS 1.3 or TLS 1.2 protocol, but if the Web server does not support this, TLS 1.1, TLS 1.0, or SSL v3 is used. This is the normal behavior that is implemented in many Web browser products.

v3: Fixes the SSL protocol version to SSL v3.
TLS: Fixes the SSL protocol version to TLS 1.0.
TLS11: Fixes the SSL protocol version to TLS 1.1.
TLS12: Fixes the SSL protocol version to TLS 1.2.
TLS13: Fixes the SSL protocol version to TLS 1.3.

Browser Emulation

User-Agents and Caching

  • User-Agent Selection: This option is used to create a custom user agent string or select a user agent from the available list.

  • Browser Cache: This option emulates the cache setting of a real browser.

    • Check for newer versions of stored pages every time: when enabled, ZebraTester will check for later versions of the specified URL than those stored in the cache.

    • Cache URLs with HTML Content: when enabled, ZebraTester will cache the HTML resources as well. You can decrease the memory footprint of each VU by unchecking this option.

    • Simulate a new user each loop: when enabled, ZebraTester will create a new cache per loop.

Annotation

Enter a short comment about the test run, such as purpose, current web server configuration, and so on. This annotation will be displayed on the result diagrams.


Starting Exec Agent Jobs

If you have specified that a single Exec Agent (rather than an Exec Agent Cluster) executes the load test program, the load test program is transmitted to the local or remote Exec Agent, and a corresponding load test job - with a job number - is created locally within the Exec Agent. The job is now in the state “configured”; that is, ready to run, but not yet started.

Hint: each Exec Agent always executes load test jobs as separate background processes and can execute more than one job at the same time. The option Display Real-Time Statistic only means that the GUI opens an additional network connection to the Exec Agent, which reads the real-time data directly from the corresponding executed load test program's memory space.

  • Click the Start Load Test Job button to start the job.

If you have de-selected the checkbox Display Real-Time Statistic, the window will close after a few seconds; however, you can - at any time - access the real-time statistic data, or the result data, of the job by using the Jobs menu, which can be called from the Main Menu and also from the Project Navigator.

Alternatively, the load test program can also be scheduled to be executed at a predefined time. However, the corresponding Exec Agent process must be available (running) at the predefined time because the scheduling entry is stored locally inside the Exec Agent jobs working directory, which the Exec Agent itself monitors. Especially if you have started the local Exec Agent implicitly by using the ZebraTester Console - AND if the scheduled job should run on that local Exec Agent, you must keep the ZebraTester Console Window open so that the job will be started ¹.

¹ This restriction can be avoided by installing the Exec Agent as a Windows Service or as a Unix Daemon (see Application Reference Manual).


Real-Time Job Statistics (Exec Agent Jobs)

Real-time statistics shown in this window are updated every 5 seconds for as long as the load test job is running.

You may abort a running load test job by clicking on the Job Actions button, which includes the following actions:

  • Abort Job: Aborting a job will take a few seconds because the job writes out the statistic result file (*.prxres) before it terminates.

  • Suspend Job

  • Increase Users

  • Decrease Users

  • Abort Increase/Decrease Users

  • Extend Test Duration

  • Reduce Test Duration

Item

Description

<Exec Agent Name> or <Cluster Name>

The name of the Exec Agent - or the Exec Agent Cluster - that executes the load test job.

Job <number>

Unique job ID (unique per Exec Agent, or unique cluster job ID).

Real-Time Comment

If real-time comments are entered during test execution, these comments are later displayed inside all time-based diagrams of the load test result detail menu.

Job Parameter

The name of the load test program and the program arguments (test input parameters).

Diagram

Description

Web Transaction Rate

 

The Web Transaction Rate Diagram shows the current number of (successfully) completed URL calls per second, counted over all simulated users. By clicking on this diagram, the Response Time Overview Diagrams are shown.

  • Total Passed URL Calls - The total number of passed URL calls since the load test job was started.

  • Total Failed URL Calls - The total number of failed URL calls since the load test job was started.

  • HTTP Keep-Alive Efficiency (%) - The efficiency, in percent, of how often a network connection to the webserver was successfully re-used instead of creating a new network connection. This (floating) average value is calculated since the load test job was started.

  • AV Web Trans. Rate (URL calls/sec) - The (floating) average number of (successfully) completed URL calls per second, calculated since the load test job was started.

Session Failures / Ignored Errors

The Session Failures / Ignored Errors Diagram shows the current number of non-fatal errors (yellow bars) and the number of fatal errors (red bars = failed sessions), counted over all simulated users.

  • Total Passed Loops - The total number of passed loops (repetitions of web surfing sessions) since the load test was started.

  • Total Failed Loops - The total number of failed loops (repetitions of web surfing sessions) since the load test was started.

  • Σ User's Think Time per Loop (sec): The total user's think time in seconds for one loop per simulated user.

  • Session Time per Loop (sec): The average session time for one loop per simulated user. This value is the sum of "the average response time of all URLs and all user's think times" per completed loop.

Number of Users / Waiting Users

The Number of Users / Waiting Users Diagram shows the total number of currently simulated users (red bars) and the current number of users who are waiting for a response from the webserver (purple bars). The users waiting for a response are a subset of the currently simulated users.

By clicking on this diagram, the Statistical Overview Diagrams are shown.

  • Users Waiting For Response - the current number of users waiting for a response from the web server, shown relative to ("of") the total number of currently simulated users.

  • TCP Socket Connect Time (ms) - The time in milliseconds (per URL call) to open a new network connection to the webserver.

  • AV Network Throughput (Mbit/s) - The total network traffic generated by this load test job, measured in megabits per second. This (floating) average value is calculated since the load test job was started.

  • Total Transmitted Bytes - The total number of transmitted bytes, measured since the load test job was started.

More detailed current measurements are available by clicking on the Detailed Statistic button. In particular, an overview of the current execution steps of the simulated users is shown:

The most relevant measured values of the URLs are shown for the selected page by clicking on the page's magnifier icon.

  • Using this menu, you can also display and analyze error snapshots by clicking on the magnifier icon next to the failure counter. In this way, you can begin analyzing errors immediately as they occur - during the running load test.

  • By clicking on a URL, the corresponding URL Response Time Diagram is shown.

All of these detailed data, including all error data, are also stored inside the final result file (.prxres), which can be accessed when the load test job has been completed.

Response Time Overview Diagrams (Real-Time)

Description: displays, during the load test (in real time), a diagram per web page of the measured response times.

Note that possibly only a fraction of the response times are shown, depending on the Additional Sampling Rate per Page Call selected when the load test was started. For example, only every fifth response time is shown if the Additional Sampling Rate per Page Call was set to 20%.

Input Fields

Response Time (drop-down list)

Selects the period, from the current time back into the past, for which the response times are shown in the diagrams.

Time Bars (drop-down list)

Selects whether the bars inside the diagrams are shown as average values or as maximum values. Note that there is only a difference between the maximum and the average values if multiple measured samples of the response time fall inside the same pixel (inside the same displayed bar).

Diagram

Description

 

The tables at the right side of the diagrams contain the response times for all URLs of the web page. These response times are also either average values or maximum values, depending on the selection in the Time Bars drop-down list. However, these values are calculated since the load test was started and are always "accurately" measured, which means that they do not depend on the value chosen for the "Additional Sampling Rate per Page Call."

 

You can click on a URL response time to show the corresponding URL Response Time Diagram.

 

On the left side inside the diagram, the web page's average response time is shown as red-colored text, calculated since the load test was started. But depending on the selected period, this value may not be displayed in every case. On the right side inside the diagram, the last measured value is shown.

 

URL Response Time Diagram (Real-Time)

Description: displays, during the load test (in real time), the response times of a URL and also a summary diagram of the measured errors of the URL.

Note that possibly only a fraction of the response times are shown, depending on the Additional Sampling Rate per URL Call selected when the load test was started. For example, only every fifth response time is shown if the "Additional Sampling Rate per URL Call" was set to 20%.

Input Fields

Response Time (drop-down list)

Selects the period, from the current time back into the past, for which the response times are shown inside the diagram.

Time Bars (drop-down list)

Selects whether the bars inside the diagram are shown as average values or as maximum values. Note that there is only a difference between the maximum and the average values if multiple measured samples of the response time fall inside the same pixel (inside the same displayed bar).

Info Box / Measured Values

All values in this infobox are calculated over all completed calls of the URL, measured since the load test was started. These values are always "accurately" measured, which means that they do not depend on the value chosen for the "Additional Sampling Rate per URL Call."

Total Passed URL Calls

the total number of passed calls for this URL.

Total Failed URL Calls

the total number of failed calls for this URL.

Average Size (Req. + Resp.)

the average size of the transmitted + received data per URL call.

Max. Response Time

the maximum response time ever measured.

Min. Response Time

the minimum response time ever measured.

Av. TCP Socket Connect Time

the average time to open a new network connection to the webserver, measured for this URL. A blank value [---] means that a new network connection was never opened for this URL because HTTP Keep-Alive (re-use of cached network connections) was always successful. The additional percentage value shown in brackets at the left displays how often a new network connection was opened to the web server, in comparison to how often this was not necessary. This percentage value is also called the reverse keep-alive efficiency.

Av. Request Transmit Time

the average time to transmit the HTTP request header + (optionally) the HTTP request content data (form data or file upload data) to the webserver, measured after the network connection was already established.

Av. Response Header Wait Time

the average time spent waiting for the first byte of the web server response (header), measured from the time the request has been (completely) transmitted to the webserver.

Av. Response Header Receive Time

the average time for receiving the HTTP response header's remaining data, measured since the first byte of the response header was received.

Av. Response Content Receive Time

the average time for receiving the response content data, for example, HTML data or the data of a GIF image.

Average Response Time

the average response time for this URL. This value is calculated as the sum of the average time components listed above (TCP socket connect time, request transmit time, response header wait time, response header receive time, and response content receive time).

 

URL Errors / Real-Time Profile of Error Types: This diagram shows an overview of which kinds of errors occurred for the URL, and at which time, measured since the load test was started. This "basic error information" is always "accurately" measured, independently of the value chosen for the "Additional Sampling Rate per URL Call", and is captured in every case, even if no more memory is left to store full error snapshots.

Error Overview Diagrams (Real-Time)

Description: displays, during the load test (in real time), an overview of all errors that have occurred.

Failure Diagrams: The first diagram shows an overview of which kinds of errors occurred, counted over all URLs and measured since the load test was started. This "basic error information" is captured in every case, even if no more memory is left to store full error snapshots.

The succeeding diagrams, shown per web page, indicate only at which time errors occurred. The tables on the right side of the diagrams show the number of errors that occurred for the URLs of the web page. You can click on an error counter to show the error detail information (error snapshots) for the corresponding URL.

First Error Snapshots: Displays a list of the first errors that occurred (at the start of the load test). By clicking on a magnifier icon, the corresponding error detail information (error snapshot) is shown.

Latest Error Snapshots: Displays a list of the latest (newest) errors. By clicking on a magnifier icon, the corresponding error detail information (error snapshot) is shown.

Input Fields

All failed URL Calls

Shows all errors for failed URL calls (non-fatal and fatal errors).

Session Failures only

Shows only fatal errors for failed URL calls (session failures).

 

Error Snapshot Memory: % used +: By clicking on the + (plus sign), you can increase the amount of memory available to store error snapshots. Please note: when 50% or more of the memory is already used, no additional error snapshots for non-fatal errors are captured. This means that increasing the memory may also re-enable the capturing of non-fatal errors.

Statistical Overview Diagrams (Real-Time)

Description: displays statistical overview diagrams (in real-time) about a load test job.

Concurrent Users

The total number of simulated users.

Users Waiting For Response

The number of users who are waiting for a response from the webserver.

Session Failures

The number of failed sessions - which is the same as the number of fatal errors.

Session Time per User - per Loop

The session time for one loop per simulated user. This value is the sum of "the response time of all URLs and all user's think times" per successfully completed loop.

Web Transaction Rate

The number of (successfully) completed URL calls per second, measured over all simulated users.

Completed Loops per Minute

The number of (successfully) completed loops (sessions) per minute, measured over all simulated users.

TCP Socket Connect Time

The time in milliseconds (per URL call) to open a new network connection to the webserver.

Network Throughput

The total network traffic which is generated by this load test job, measured in megabits per second.

Real-Time Comments

Description: supports entering comments during the load test execution.

Real-time comments are notes or tips that you can enter during the load test execution:

 

These comments are later displayed inside all time-based diagrams of the load test result detail menu:

 

You can also modify, delete, or add real-time comments before you generate the PDF report. However, retroactively entered real-time comments are not permanently stored inside the result data.

Loading the Statistics File

After the load test job has been completed, the statistic results file (*.prxres) is stored in the local or remote Exec Agent's job directory. To access this results file, you must transfer it back to the (local) Project Navigator directory from which the load test program was started.

This menu shows all files of the load test job; however, usually only the statistics results file is needed, and it is already selected. The "*.out" file contains debug information, and the "*.err" file is either empty or contains internal error messages from the load test program itself.

By clicking on the Acquire Selected Files button, all selected files are transferred (back) to the (local) Project Navigator directory.

If the checkbox Load *.prxres File on Analyze Load Test Menu is selected, the statistics results file is also loaded into the memory area of the Analyze Load Tests menu, where the statistics and diagrams of the measured data can be shown, analyzed, and compared with results of previous test runs.


Starting Cluster Jobs

If you have specified that an Exec Agent Cluster executes the load test program, the load test program is transmitted to the local cluster job controller, which coordinates all cluster members (Exec Agents). The cluster job controller creates a cluster job and allocates a cluster job number. The cluster job is now in the state “configured” (ready to run, but not yet started).

The number of concurrent users will be automatically distributed across the cluster members, depending on the individual computer systems' capability - called "load factor."
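As an illustration only (assuming the distribution is proportional to the load factors), 300 concurrent users spread across two cluster members with load factors of 2 and 1 would be assigned roughly 200 and 100 users, respectively.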

If the load test program uses Input Files, you are asked, for each Input File, whether you wish to split the Input File content. This can be useful, for example, if the Input File contains user accounts (usernames/passwords) but the web application does not allow duplicate logins. In this case, each cluster member must use different user accounts. By clicking on the corresponding magnifier icon, you can view how the Input File data would be distributed across the cluster members. If you do not use the split functionality, each cluster member receives a full copy of the Input File.
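For example, an Input File containing 1,000 user accounts that is split across 4 cluster members gives each member 250 unique accounts; without splitting, each member receives a copy of all 1,000 accounts.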

The distribution of users across the cluster members can also be modified manually; however, this is useful only if a cluster member is unavailable (marked with a light red background), in which case the cluster job cannot be started. You can then assign the unavailable cluster member's users to other cluster members and try to start the cluster job again. This redistribution may take a few seconds to complete.

Alternatively, the load test program can also be scheduled to be executed at a predefined time. However, the local Job Controller process must be available (running) at the predefined time because the scheduling entry for the cluster job is stored inside the Job Controller working directory, which the Job Controller itself monitors. If you have started the Job Controller implicitly by using the ZebraTester Console, you must keep the ZebraTester Console Window open so that the cluster job will be started ¹. ¹ This restriction can be avoided by installing the local Job Controller as a Windows Service or as a Unix Daemon.

After the cluster job has been scheduled, you can leave this menu by closing the window, and you can use later the Jobs menu to cancel or modify the schedule of this job.

Real-Time Cluster Job Statistics

The real-time statistics of a cluster job show the most important measured values, similar to the values shown in the Real-Time Statistic of Exec Agent Jobs. The cluster job itself contains Exec Agent jobs that the local cluster job controller has created. By clicking on a cluster member's magnifier icon, the corresponding Exec Agent job's real-time statistics can be displayed in its own window.

If you want to abort the cluster job, you must do it at this level, as this will also abort all Exec Agent jobs. Aborting a single Exec Agent job will not interrupt the cluster job.

The same applies to the statistics result file (*.prxres), which must be accessed at this level.

Loading the Statistics File of Cluster Jobs

The statistics results file of a cluster job contains the consolidated (merged) measurements for all cluster members. The calculations for merging the results are extensive; therefore, it may take up to 60 seconds to show the result file. The individual measurements of the Exec Agents are embedded separately inside the same consolidated result file.

The consolidated statistics results file is marked with a grey/blue background (depending on your ZebraTester version) and is already selected for you.

Click on the Acquire Selected Files button to transfer the selected files of the Load Test job back to the Project Navigator directory.

By clicking on the magnifier icon, you can access the "*.out" and "*.err" files of the corresponding Exec Agent jobs.

Usually, you would work inside the Analyze Load Tests menu with the consolidated measurement results only. However, it is also possible to expand the measurement results to access the results of each Exec Agent job:

This feature can be used to check if all cluster members have measured approximately the same response times; however, variations in a range of ± 20% or more may be normal:


Load Test Jobs Menu

All load test programs started from the Project Navigator are always executed as "batch jobs" by an (external) Exec Agent process or by an Exec Agent Cluster. This means that it is not required to wait for the completion of a load test program in the “Execute Load Test” window: you can close the "Execute Load Test" window at any time and check the result, or the current progress, of all load test jobs later by using this menu.

If a load test job has been completed, you can acquire the corresponding statistic result file (*.prxres). If a load test job is still running, you are taken to the job's temporary live-statistic window.

Input Fields

Display Cluster Jobs

Shows all Exec Agent Cluster jobs.

Display Exec Agent Jobs of

Allows selecting the Exec Agent for which a list of all load test jobs is displayed.

Clean-Up: Delete All Non-Running Jobs

Deletes all jobs except running and scheduled jobs.

Clean-Up: Delete Old Completed Jobs

Deletes all completed jobs except the newest one. This button is only shown if at least two jobs have been completed.

Columns of the job list:

Item

Description

Job

Each job has a unique ID, which was automatically assigned when the job was defined. However, the ID is unique only per Exec Agent. Cluster jobs have their own, separate IDs (own enumeration counter).

[Search Icon]

Allows acquiring the statistic result file (*.prxres) of an already completed load test job, reconnecting to the temporary statistic of the load test job if the job is still running, or canceling the schedule of the job.

[Delete Icon]

Deletes all data files of a completed load test job. Note that you must first acquire the statistic result file (*.prxres) of a job before you delete its files - otherwise, the job results are lost.

Date

Displays the date and time when the job has been defined or when the job has been completed, or - for scheduled jobs - the planned time when the job will be started.

State

Displays the current job state: configured (ready to run), scheduled, running, or completed. The state "???" means that the job data are corrupted - you should delete all jobs which have the state "???" because they delay the display of all jobs in this list.

Load Test Program & Arguments

Displays the name of the load test program and the arguments of the load test program.

Released from GUI(IP)

Displays the TCP/IP address (remote computer) from which the job has been initiated.

 

Load Test Program Arguments

Argument / Parameter

Meaning

u <number>

Number of concurrent users

d <seconds>

Planned test duration in seconds. 0 = unlimited

t <seconds>

Request timeout per URL call in seconds

sdelay <milliseconds>

Startup delay between creating concurrent users in milliseconds

maxloops <number>

Max. number of loops (repetitions of web surfing session) per user. 0 = unlimited

downlink <Kbps>

Network bandwidth limitation per concurrent user in kilobits per second for the downlink (web server to web browser)

uplink <Kbps>

Network bandwidth limitation per concurrent user in kilobits per second for the uplink (web browser to the webserver)

sampling <seconds>

Statistical sampling interval in seconds (interval-based sampling). Used for time-based overall diagrams such as the measured network throughput

percpage <percent>

Additional sampling rate in percent for response times of web pages (event-based sampling, each time when a web page is called)

percurl <percent>

Additional sampling rate in percent for response times of URL calls (event-based sampling, each time when a URL is called)

maxerrsnap <number>

Max. number of error snapshots per URL (per Exec Agent), 0 = unlimited

maxerrmem <megabytes>

Max. memory in megabytes which can be used to store error snapshots, -1 = unlimited

setuseragent "<text>"

Replaces the recorded value of the HTTP request header field User-Agent with a new value. The new value is applied for all executed URL calls.

nostdoutlog

Disables writing any data to the *.out file of the load test job. Note that the *.out file is nevertheless created but contains zero bytes.

dfl

Debug failed loops

dl

Debug loops

dh

Debug headers & loops

dc

Debug content & loops

dC

Debug cookies & loops

dK

Debug keep-alive for re-used network connections & loops

dssl

Debug information about the SSL protocol and the SSL handshake & loops

multihomed

Forces the Exec Agent(s) to use multiple client IP addresses

ipperloop

Using this option in combination with the option -multihomed causes a separate local IP address to be used for each executed loop rather than for each simulated user. This option is considered only if the option -multihomed is also used.

ssl <version>

Use fixed SSL protocol version: v3, TLS, TLS11 or TLS12

sslcache <seconds>

The timeout of SSL cache in seconds. 0 = cache disabled

nosni

Disable support for TLS server name indication (SNI)

dnshosts <file-name>

Causes the load test job to use its own DNS hosts file to resolve hostnames rather than the underlying operating system's hosts file. Note that you have to ZIP the hosts file together with the load test program's compiled class. To automate the ZIP, it is recommended to declare the hosts file as an external resource (without adding it to the CLASSPATH).

dnssrv <IP-name-server-1>[,<IP-name-server-N>]

Causes the load test job to use specific (own) DNS server(s) to resolve hostnames rather than the DNS library of the underlying operating system.

dnsenattl

Enable consideration of DNS TTL by using the received TTL-values from the DNS server(s). This option cannot be used in combination with the option -dnsperloop.

dnsfixttl <seconds>

Enable DNS TTL by using a fixed TTL-value of seconds for all DNS resolves. This option cannot be used in combination with the option -dnsperloop.

dnsperloop

Perform new DNS resolves for each executed loop. All resolves are stable within the same loop (no consideration of DNS TTL within a loop). This option cannot be used in combination with the options -dnsenattl or -dnsfixttl.

dnsstatistic

Causes statistical data about DNS resolutions to be measured and displayed in the load test result, using the load generators' own DNS stack. Note: there is no need to use this option if any other, more specific DNS option is enabled, because all other DNS options also implicitly cause statistical data about DNS resolutions to be measured. If you use this option without any other DNS option, the own DNS stack on the load generators will communicate with the operating system's default configured DNS servers - but without considering the "hosts" file.

tz <value>

Time zone (see Application Reference Manual)

annotation <text>

Comment about the test-run


Scripting Load Test Jobs

Several load test jobs can be started from the GUI at the same time. However, the GUI does not have the ability to automatically run sequences of load test jobs, synchronize load test jobs, or automatically start several jobs with a single mouse click.

To perform these kinds of activities, you must program load test job scripts written in the “natural” scripting language of your operating system (Windows: *.bat files, Unix: *.sh, *.ksh, *.csh … files). Inside these scripts, the PrxJob utility is used as the interface to the ZebraTester system. When the Windows version of ZebraTester is installed, the installation kit creates the directory ScriptExamples within the Project Navigator, and this directory contains some example scripts.

The PrxJob utility allows you to start load test jobs on the local as well as on a remote system. It also provides the capability to create cluster jobs, synchronize jobs, obtain the current state of jobs, and acquire the statistics result files of jobs. More information about the PrxJob utility can be found in the Application Reference Manual, Chapter 4.


Rerunning Load Tests Jobs (Job Templates)

Whenever a load test is started, an additional job definition template file is stored in the current Project Navigator directory (in XML format). Such a job definition template file contains all configuration data needed to rerun the same load test job.

If you click the corresponding button of a job definition template (XML) file in the Project Navigator, the load test job, including all of its input parameters, is automatically transferred to the Exec Agent or the Exec Agent Cluster and is immediately ready to run.

In the screenshot below, the job was preconfigured to run from a cluster of defined servers with a predefined set of Load Test Program Arguments.

Additionally, if you wish to make several load test jobs ready to run simultaneously (with only one mouse click), you can zip several templates into one zip archive. After this, click the corresponding button of the zip archive:
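For example, on a system where the zip command-line utility is available (file names are illustrative), several templates can be combined into one archive like this:

zip job_templates.zip jobA.xml jobB.xml jobC.xml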

Example XML LoadTest Template

<?xml version="1.0" encoding="UTF-8"?>
<loadTestTemplate>
  <proxySnifferVersion>V5.5-F</proxySnifferVersion>
  <loadTestProgramPath>/Applications/ZebraTester/MyTests/CL_Demo_FF_Demo.class</loadTestProgramPath>
  <startFromExecAgentName></startFromExecAgentName>
  <startFromClusterName>Cluster 1</startFromClusterName>
  <isPureJUnitLoadTest>false</isPureJUnitLoadTest>
  <executionPlanFilePath></executionPlanFilePath>
  <concurrentUsers>1</concurrentUsers>
  <testDuration>60</testDuration>
  <loopsPerUser>0</loopsPerUser>
  <pacingPerLoop>0</pacingPerLoop>
  <startupDelayPerUser>200</startupDelayPerUser>
  <downlinkBandwidth>0</downlinkBandwidth>
  <uplinkBandwidth>0</uplinkBandwidth>
  <requestTimeout>60</requestTimeout>
  <maxErrorSnapshots>-20</maxErrorSnapshots>
  <statisticSamplingInterval>15</statisticSamplingInterval>
  <percentilePageSamplingPercent>100</percentilePageSamplingPercent>
  <percentileUrlSamplingPercent>20</percentileUrlSamplingPercent>
  <percentileUrlSamplingPercentAddOption>0</percentileUrlSamplingPercentAddOption>
  <debugOptions></debugOptions>
  <additionalOptions></additionalOptions>
  <sslOptions>all</sslOptions>
  <pmaTemplateFileName></pmaTemplateFileName>
  <testRunAnnotation>test</testRunAnnotation>
  <userAgent>Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko</userAgent>
  <browserLan>en</browserLan>
  <enableBrowserCache>true</enableBrowserCache>
  <checkNewVersion>true</checkNewVersion>
  <browserCacheOptions></browserCacheOptions>
  <userInputFields/>
</loadTestTemplate>

XML Load Test Template Attributes

Attribute Name

Description

loadTestProgramPath

Absolute file path to compiled load test program (*.class) or load test program ZIP archive

startFromExecAgentName

Name of the Exec Agent on which the load test is started (empty value if cluster job)

startFromClusterName

Name of the Exec Agent Cluster on which the load test is started (empty value if no cluster job)

concurrentUsers

Number of concurrent users

testDuration

Planned test duration in seconds (0 = unlimited)

loopsPerUser

Number of planned loops per user (0 = unlimited)

startupDelayPerUser

Startup delay per user in milliseconds

downlinkBandwidth

Downlink bandwidth per user in kilobits per second (0 = unlimited)

uplinkBandwidth

Uplink bandwidth per user in kilobits per second (0 = unlimited)

requestTimeout

Request timeout per URL call in seconds

maxErrorSnapshots

Limits the number of error snapshots taken during load test execution (0 = unlimited). Negative value: maximum memory in megabytes used to store all error snapshots, counted over all Exec Agents (recommended). Positive value: maximum number of error snapshots per URL, per Exec Agent (not recommended).
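For example, the value -20 used in the template above limits the memory for storing error snapshots to 20 megabytes, counted over all Exec Agents.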

statisticSamplingInterval

Statistic sampling interval in seconds

percentilePageSamplingPercent

Additional sampling rate per Web page in percent (0..100)

percentileUrlSamplingPercent

Additional sampling rate per URL call in percent (0..100)

percentileUrlSamplingPercentAddOption

Additional URL sampling options per executed URL call (numeric value):
0: no options
1: all URL performance details (network connect time, request transmit time, …)
2: request header
3: request content (form data)
4: request header & request content
5: response header
6: response header & response content
7: all - but without response content
8: all - full URL snapshot

debugOptions

Debug options (string value):
"-dl": debug loops (including var handler)
"-dh": debug headers & loops
"-dc": debug content & loops
"-dC": debug cookies & loops
"-dK": debug keep-alive & loops
"-dssl": debug SSL handshake & loops

additionalOptions

Additional options (string)

sslOptions

SSL/HTTPS options (string value):
"all": automatic SSL protocol detection (TLS preferred)
"tls": SSL protocol fixed to TLS
"v3": SSL protocol fixed to v3
"v2": SSL protocol fixed to V2

testRunAnnotation

Annotation for this test-run (string)

userInputFields

Label, variable name, and the default value of User Input Fields

 

Can't find what you're looking for? Send an E-mail to support@apica.io