Jobs
The Jobs view provides an overview of and access to all running, scheduled and completed load test jobs.
Filter
The filter section allows you to configure which jobs to display in the table, based on a set of criteria.
Item | Description |
---|---|
Project | Project the jobs belong to. |
Instance | Load test Instance. |
Subscription | Subscription plan type, as configured under Plans & Subscriptions. |
From | Start date for a time period. |
To | End date for a time period. |
Search | Free text search field. |
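As an illustration of how these criteria combine (a sketch only; the field names and job records below are made up for illustration and are not the product's data model), the date range and free-text search might be applied like this:

```python
from datetime import date

# Made-up job records for illustration.
jobs = [
    {"id": 1, "scenario": "checkout", "started": date(2023, 5, 1)},
    {"id": 2, "scenario": "login",    "started": date(2023, 6, 10)},
]

def filter_jobs(jobs, frm=None, to=None, search=""):
    """Apply the From/To date range and free-text search from the Filter section."""
    out = []
    for j in jobs:
        if frm and j["started"] < frm:
            continue
        if to and j["started"] > to:
            continue
        if search and search.lower() not in j["scenario"].lower():
            continue
        out.append(j)
    return out

print([j["id"] for j in filter_jobs(jobs, frm=date(2023, 6, 1))])  # [2]
print([j["id"] for j in filter_jobs(jobs, search="check")])        # [1]
```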
Running Jobs
The Running Jobs table presents aggregated statistics for currently running job(s).
View
Column | Description |
---|---|
| Expand list of connected performance monitoring agents. |
Job Id | Identifying job number. |
Scenario | Scenario used in the job. |
Clusters | Location cluster(s) used to generate load. |
User/Duration | Number of users and duration. |
Total URL Calls | The total number of URL calls since the start of the job. |
Total Page Views | The total number of page views since the start of the job. |
Network Throughput | The total network traffic which is generated by this load test job, measured in megabits per second. |
Web Transaction Rate | Number of web page transactions performed per second. |
Started Date | Timestamp for job start. |
Time Elapsed | Time since start of job. |
Progress/Status | Percentage progress bar and current status. |
Actions | Possible actions to perform on the job. |
Abort Jobs | Cancels running jobs. |
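To illustrate how the rate columns relate to the totals (a sketch with hypothetical counter names, not the product's API), the aggregates in the table can be derived from raw counters and elapsed time:

```python
# Illustrative only: derive the Running Jobs aggregates from raw counters.
# The counter names (url_calls, page_views, bytes_sent) are hypothetical.

def running_job_stats(url_calls, page_views, bytes_sent, elapsed_seconds):
    """Return the aggregates shown in the Running Jobs table."""
    total_mbit_sent = bytes_sent * 8 / 1_000_000  # bytes -> megabits
    return {
        "total_url_calls": url_calls,
        "total_page_views": page_views,
        "network_throughput_mbit_s": total_mbit_sent / elapsed_seconds,
        "web_transaction_rate_per_s": page_views / elapsed_seconds,
    }

# 750 MB sent over a 10-minute run:
stats = running_job_stats(url_calls=12_000, page_views=3_000,
                          bytes_sent=750_000_000, elapsed_seconds=600)
print(stats["network_throughput_mbit_s"])   # 10.0
print(stats["web_transaction_rate_per_s"])  # 5.0
```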
Running Agents
Agents
For each running job, you can view the connected performance monitoring agents by expanding the list.
View Agents
To show the configured agents for the job:
Click the expand button.
The list is shown:
Column | Description |
---|---|
Monitoring Agent | Agent name/identifier and network address. |
CPU Usage % | Percentage of CPU used. |
Physical Memory Available | Available memory. |
Memory Page/s | Memory pages per second swapped to or from disk. |
Mbit/s Transmitted | Outgoing traffic volume. |
Mbit/s Received | Incoming traffic volume. |
Physical Disk Time | Percentage of time spent reading/writing disks. |
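The two traffic columns are rates, so cumulative byte counters have to be converted into Mbit/s over a sampling interval. A minimal sketch (the counter-sampling scheme is an assumption for illustration, not how the agent actually reports):

```python
def mbit_per_s(bytes_prev, bytes_now, interval_seconds):
    """Convert a cumulative byte-counter delta into a Mbit/s rate."""
    return (bytes_now - bytes_prev) * 8 / 1_000_000 / interval_seconds

# Agent transmitted 25 MB over a 10-second sampling interval:
print(mbit_per_s(0, 25_000_000, 10))  # 20.0
```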
Scheduled Jobs
The Scheduled Jobs table shows jobs that have been scheduled to run, either as soon as possible or at some later time.
Column | Description |
---|---|
Job Id | Identifying job number. |
Scenario | Scenario used in the job. |
Start Date | Timestamp for job start. |
Clusters | Location cluster(s) used to generate load. |
User | Number of users. |
Duration | Duration of test. |
Status | Current status of job. |
Actions | Possible actions to perform on the job. |
| Select/deselect all checks in the list. |
Delete Jobs | Remove jobs from scheduling. |
Scheduled Actions
The actions column contains buttons for any available actions you can perform on the job. Unavailable actions are not shown.
Icon | Action | Description |
---|---|---|
| Delete | Remove the job from scheduling. |
| Run | Run the job now, regardless of scheduling. |
| Runtime Options | Show the Runtime Options. |
Completed Jobs
The Completed Jobs table displays a list of completed jobs with aggregated statistics for each job.
Completed Jobs Table
The Completed Jobs table contains jobs that have been completed or aborted in the past.
Column | Description |
---|---|
Scenario Run ID/Job ID | Identifying Scenario Run or Job number. |
Scenario | The scenario used in the job. |
Date Started | The timestamp for job start. |
Clusters | Location cluster(s) used to generate load. |
Users | The number of users. |
Duration | Duration of the test. |
Status | Current status of the job. |
Actions | Possible actions to perform on the job. |
| Select/deselect all checks in the list. |
Delete Jobs | Remove jobs from the list. |
Available Actions for Completed Jobs
The actions column contains buttons for any available actions you can perform on the job. Unavailable actions are not shown, so not all of the action icons below will appear for every job.
Icon | Action | Description |
---|---|---|
| Delete | Remove the job. |
| Download | Save the .prxres test result file to the local computer. |
| Logs | View the logs. |
| Rerun Test | Schedule the test to rerun. |
| Runtime Options | Show the Runtime Options. |
Scenario Results
The Scenario Test Result Page provides a fast overview of the test results.
How To Get There: Navigate to this page via Loadtest → Jobs → Completed Jobs table.
For every completed job that is a Scenario, there is a green Result link in that scenario's row.
Following this link leads to a Scenario Results page that displays the Scenario Results Overview, starting with a Time Diagram.
The Time Diagram is shown when Scenario tests have run or are scheduled to run.
The graphical diagram above shows a timeline view of the Scenario Job: when it ran or is scheduled, and whether it was successfully executed.
You can toggle (Hide/Show) the diagram.
Note the quick links to jump to the years, months, days, and hours in which this scenario may have been run.
The color of the bars corresponds to the status of each Scenario run.
Successful (green): All tests in the scenario were successfully started and finished.
Partly Successful (orange): Some, but not all, tests in the scenario were started and finished.
Scheduled run (grey): The Scenario Test is scheduled to run in the future.
Failed run (red): All tests in the scenario failed to start.
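The color rules above can be summarized as a small classification function (a sketch; the portal's actual edge-case handling may differ):

```python
def run_status(started, finished, total, scheduled=False):
    """Map a scenario run's test counts to the bar color used in the Time Diagram."""
    if scheduled:
        return "grey"      # Scheduled run: will execute in the future
    if finished == total and total > 0:
        return "green"     # Successful: all tests started and finished
    if started == 0:
        return "red"       # Failed: no test in the scenario started
    return "orange"        # Partly successful: some but not all finished

print(run_status(started=5, finished=5, total=5))  # green
print(run_status(started=3, finished=2, total=5))  # orange
print(run_status(started=0, finished=0, total=5))  # red
```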
Scripts included in the Scenario
This list gives an overview of all scripts/tests that are part of the selected Scenario.
For each Script Name, it shows the Cluster used for the test (location), the Test Interval (timeframe) during which the test was running, the test Duration, and the number of Total, Passed, and Failed Transactions.
Page Response Time Graphs
There is a graph section for each test that shows the response time for all pages in the test. The graphs give an overview of whether there were any higher-than-expected response times or response time spikes (as can be seen in the example below). A table under each graph presents Min, Max, Avg, Variance, and Standard Deviation for each page.
The horizontal bars to the right of each page row give a quick view of the Passed/Failed Transaction rate (%) and the Response Time Variation.
High variance values in the column indicate that the response time was fluctuating, which may warrant further investigation.
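The per-page figures in the table can be reproduced from raw response-time samples using Python's standard library; the sample values below are made up for illustration:

```python
import statistics

# Made-up response-time samples for one page, in milliseconds.
response_times_ms = [120, 130, 125, 500, 118, 122]

summary = {
    "min": min(response_times_ms),
    "max": max(response_times_ms),
    "avg": statistics.mean(response_times_ms),
    "variance": statistics.pvariance(response_times_ms),  # population variance
    "stddev": statistics.pstdev(response_times_ms),       # population std deviation
}
print(summary["min"], summary["max"])  # 118 500
# A stddev that is large relative to the average flags a fluctuating page,
# such as the 500 ms spike in this sample.
```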
Links to two additional tabs provide further actions:
Results, https://apica-kb.atlassian.net/wiki/spaces/ALTDOCS/pages/20709431, where you can do a graphical analysis of the data:
Page Response Time
Script Iteration Time
Network Throughput
You can also add other charts to compare prior runs, results, load levels, page combinations, etc.
Report, https://apica-kb.atlassian.net/wiki/spaces/ALTDOCS/pages/19169382, where you can create a report for this particular Scenario Result
Add
Company Logo
Rich-text
Tables
Images
Hyperlinks
Inline Code
Codeblocks
Analyze Scenario Results
Scenario Results now has two added tabs that allow you to do additional analysis of the load test results. This page describes the Analyze Tab.
The other Report Tab is covered in https://apica-kb.atlassian.net/wiki/spaces/ALTDOCS/pages/19169382.
Analyze Tab
This tab comes with three standard charts, which can be added to, replaced, and modified:
Page Response Time
Script Iteration Time
Network Throughput
Other charts may be added for comparison and analysis, as in the example below, where:
Page Response Times were added for different pages from the same scenario run, instead of comparing the same page's responses across different loads.
Transaction Rate was compared to Network Throughput over time.
Each chart may be added to the Report tab and further formatted there.
Each of these charts may be edited using the icons found above each chart.
Icon | Meaning |
---|---|
| Synchronize X-axes for all charts. |
| Switch Y-scale format between logarithmic and linear. |
| Access Chart Options for changing data/metrics/pages/charts. |
| Toggle between standard compact chart size and full window width. |
Create a Report from a Scenario Result
Report Tab
For a Scenario Result, there is a report tab that allows reports to be created, saved, loaded, and maintained for distribution as well as exported to a PDF.
The Analyze Tab is covered in https://apica-kb.atlassian.net/wiki/spaces/ALTDOCS/pages/19169347.
RICH TEXT SUPPORT: This report will be a rich text document, with interactive charts within ALT, and can be augmented with images, company logos, hyperlinks, inline code, and even code blocks as indicated by the menu of icons above each text section.
The report can be saved under and loaded from various projects for better organization.
When the Report tab is first opened, there is an empty, templated section below the diagram where you can add a new report.
LOAD EXISTING REPORT: If you have a report that is already saved, either the one you want to review or one to use as a template for a new report, you may load it.
SAMPLE REPORT: In this sample report mock-up, you can see an example of a simple report (the edits to the chart image heights are intentional). Note that the position buttons at the bottom right of each section allow you to change where on the report that section appears:
EXPAND/DOWNLOAD/PRINT CHARTS: The report's charts are interactive within the ALT portal as well, allowing you to expand, download, or print any chart in the report.
ADDING NEW CHARTS: New charts can be added by selecting the Analyze Tab, selecting a chart to be added, and clicking the Add to Report button found at the bottom right of each chart.
SAVE/EXPORT: Options to save your report or export it are available.
Logs
For jobs that have saved logs, you can view and download the logs. To view the logs, click the View Logs action button.
Rerun
Completed tests can be scheduled to run again. It is possible to change the project and instance before running. To schedule the test to run again, click the Rerun Test action button.
Item | Description |
---|---|
Scenario | Name of scenario. |
Number of Concurrent Users | Number of Virtual Users. |
Loadtest Duration | How long to run the test. |
Ramp-up Time | Time to ramp up. |
Locations | Locations as configured for the job. |
Project | Project for the job. |
Attach to | Instance to attach to. |
Test Instance | Name of instance to attach to. |
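The Ramp-up Time and Number of Concurrent Users interact as a linear ramp at the start of the test; a sketch of that usual model (actual ramp behavior in the product may differ):

```python
def active_users(t_seconds, total_users, rampup_seconds):
    """Users active t seconds into the test, under a linear ramp-up."""
    if rampup_seconds == 0 or t_seconds >= rampup_seconds:
        return total_users            # ramp complete: full load
    return int(total_users * t_seconds / rampup_seconds)

# 200 concurrent users with a 100-second ramp-up:
print(active_users(50, 200, 100))   # 100 (halfway through the ramp)
print(active_users(150, 200, 100))  # 200 (ramp finished, steady load)
```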
Runtime Options
For all jobs, you can review the runtime options. These provide an overview of all settings for the job. To view the runtime options, click the Runtime Options action button.
Item | Description |
---|---|
Test Instance | Connected Instance. |
Scenario | |
Scenario | Scenario used in the job. |
Loadtest Options | |
Number of Concurrent Users | Number of Virtual Users in the test. |
Loadtest Duration | Total duration of the test. |
Rampup Time | Time to ramp up. |
Run Multiple Sequential Tests | Execution mode used. |
Location | |
Cluster | Cluster used to generate load. |
Scenario Options | |
User Input Text Files | Connected input files. |
User Defined Vars | Variables defined in the scenario. |
Advanced Options | |
Max Loops Per User | Maximum number of load testing loops per Virtual User. |
Request Timeout (sec) | Time to wait for responses. |
Additional Options | Any additional configured options. |
Distribute Load on All Available Datacenters | Whether load distribution is enabled. |
Client Options | |
User Agent | Client User-Agent. |
Network Bandwidth Downlink (Mbit/s) | |
Desktop | Limit on downloaded desktop traffic. |
Mobile | Limit on downloaded mobile traffic. |
Network Bandwidth Uplink (Mbit/s) | |
Desktop | Limit on uploaded desktop traffic. |
Mobile | Limit on uploaded mobile traffic. |
Client Side Monitoring | Whether Client Side Monitoring is enabled. |
DNS | |
DNS Hosts File | Custom HOSTS file. |
DNS Server | Custom DNS server used for lookups. |
Resolve DNS for Each Executed Loop | Whether a DNS lookup is performed for each load testing loop. |
DNS Translation File | Custom DNS translation file used. |
Reporting | |
Email Preliminary Report To | Address for test reports. |
Reporting History | Limit on the number of tests to include in the test history. |
Execution | |
Schedule | Type of scheduling for the job. |
Schedule Date | Date for the schedule. |
Test Information | |
Attached to Project | Project the test belongs to. |
Job Comment | Comments added to the job settings. |
Tags | Scenario tags applied to the scenario. |
Cannot find what you're looking for? Send an E-mail to support@apica.io.