Browser Testing

Browser Synthetic Monitoring allows you to simulate real-user interactions with your web applications across various browsers, devices, and locations. These tests ensure your applications are accessible, responsive, and performing as expected under different scenarios. This guide outlines how to configure and use Middleware Browser Synthetic Monitoring effectively.


Overview

Browser tests are scenarios executed by Middleware on your web applications. They run at configurable intervals from multiple locations worldwide, across multiple browsers and devices. These tests verify:

  • Your application's uptime and response behavior.
  • That predefined conditions or assertions are met.

Middleware also supports testing applications behind authentication. If your application requires a login, follow the dedicated guide, and share feedback with the Middleware Synthetic Monitoring team to help improve support for critical use cases.


Test Configuration

You can create a test using one of the following methods:

Build a Test from Scratch

If you prefer complete customization, you can start with a blank template:

  1. Starting URL: Enter the URL where your test scenario begins.
    Example: https://docs.middleware.io

  2. Test Name: Provide a descriptive name for your test.

  3. Environment and Tags:

    • Specify the environment (e.g., PROD, STAGING).
    • Add additional tags in <KEY>:<VALUE> format for filtering and organization.
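As a sketch of the <KEY>:<VALUE> tag format, tags can be validated before a test is saved; `parse_tags` below is a hypothetical helper for illustration, not part of Middleware:

```python
# Hypothetical helper: validate tags supplied in <KEY>:<VALUE> form.
def parse_tags(raw_tags):
    """Split each "KEY:VALUE" string into a dict entry, rejecting malformed tags."""
    tags = {}
    for raw in raw_tags:
        key, sep, value = raw.partition(":")
        if not (sep and key.strip() and value.strip()):
            raise ValueError(f"tag must be in <KEY>:<VALUE> format: {raw!r}")
        tags[key.strip()] = value.strip()
    return tags

print(parse_tags(["env:PROD", "team:frontend"]))  # {'env': 'PROD', 'team': 'frontend'}
```

Validating tags up front keeps filtering and organization consistent across tests.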

Test Details

  1. Browsers and Devices:
    Select the browsers (e.g., Chrome, Firefox, Edge) and devices (e.g., Laptop Large, Tablet, Mobile Small) to test your application on.
    • Laptop Large: 1440 x 1100 pixels
    • Tablet: 768 x 1020 pixels
    • Mobile Small: 320 x 550 pixels
  2. Test Locations:
    • Choose from Middleware’s global locations (e.g., Americas, EMEA, APAC) for public-facing tests.
  3. Test Frequency:
    Configure test intervals (e.g., every 5 minutes, daily, or weekly).

  4. Save and Record:
    Click Save & Edit Recording to save your test and proceed to record steps.
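The details above can be sketched as a single configuration object. The field names here are illustrative, not Middleware's actual schema; the viewport sizes are taken from the device list above:

```python
# Hypothetical test configuration; field names are illustrative only.
test_config = {
    "starting_url": "https://docs.middleware.io",
    "name": "Docs homepage smoke test",
    "environment": "PROD",
    "browsers": ["Chrome", "Firefox", "Edge"],
    "devices": {                       # viewport sizes from the device list above
        "Laptop Large": (1440, 1100),
        "Tablet": (768, 1020),
        "Mobile Small": (320, 550),
    },
    "locations": ["Americas", "EMEA", "APAC"],
    "frequency_minutes": 5,            # run every 5 minutes
}

# Each (browser, device, location) combination yields one run per interval.
runs_per_interval = (
    len(test_config["browsers"])
    * len(test_config["devices"])
    * len(test_config["locations"])
)
print(runs_per_interval)  # 27
```

Because the run count multiplies across browsers, devices, and locations, trimming any one dimension is the quickest way to reduce test volume.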


Advanced Options

Request Options

  • Disable CORS/CSP:
    Prevent browser test failures caused by CORS (Cross-Origin Resource Sharing) or CSP (Content Security Policy) restrictions. Enable Disable CORS or Disable CSP as needed.
  • Request Headers:
    Add or override default headers. Example: Specify a custom User-Agent for your browser tests.
  • Cookies:
    Add custom cookies using the Set-Cookie header format. Define one cookie per line.
  • HTTP Authentication:
    Use Basic, Digest, or NTLM authentication by providing a username and password. Credentials are used in all steps of the test.

Alert Conditions

Define the conditions under which a test sends notifications.
For example:

  • An alert triggers if a specific assertion fails for X minutes in n out of N locations.
  • Retry a failing test X times before marking it as failed for a location.
    Use the Middleware API to customize retry intervals or conditions further.
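The alerting rule above can be sketched as follows. The function names and the exact retry semantics shown are assumptions for illustration, not Middleware's evaluation logic:

```python
# Hypothetical sketch of the alert rule: a location only counts as failed after
# the test has failed the initial run plus all X retries, and an alert fires
# when at least n of the N monitored locations are failing.
def location_failed(results, retries=2):
    """results is the ordered run outcomes for one location (False = failed)."""
    return all(r is False for r in results[: retries + 1])

def should_alert(location_results, n=2, retries=2):
    failed = sum(location_failed(r, retries) for r in location_results.values())
    return failed >= n

runs = {
    "virginia": [False, False, False],   # failed initial run and both retries
    "frankfurt": [False, True],          # recovered on the first retry
    "tokyo": [False, False, False],
}
print(should_alert(runs, n=2, retries=2))  # True: 2 of 3 locations failing
```

Requiring failures in n out of N locations filters out single-region network blips before anyone is paged.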

Notifications

  • Customize alert messages using Markdown.
  • Specify renotification intervals or disable renotifications.
  • Notify specific team members or services using Slack, Gmail, Opsgenie, etc.

Recording Test Steps

Middleware provides tools to record browser interactions as test steps.

  1. Download the Test Recorder Extension:
    Install the Middleware Test Recorder (Middleware Synthetic Recorder) extension for Google Chrome.
  2. Start Recording:
    Use the extension to capture clicks, inputs, and other interactions.
  3. Switch Tabs:
    Record interactions across tabs for multi-tab user journeys. Middleware automatically replays these steps during execution.
  4. Assertions:
    End your browser test by adding an assertion to verify that the desired state is achieved (e.g., checking for specific text or an element).
  5. Special Actions:
    Enhance your test's accuracy by incorporating special actions such as scroll, hover, and wait. Use keyboard events where precise input is required.
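Conceptually, a recording is an ordered list of steps that ends in an assertion. The step schema below is hypothetical, purely to illustrate the shape of a recorded scenario:

```python
# Hypothetical representation of a recorded scenario: clicks, inputs, special
# actions, and a final assertion, captured as plain data by the recorder.
steps = [
    {"action": "goto", "url": "https://docs.middleware.io"},
    {"action": "click", "selector": "#search"},
    {"action": "type", "selector": "#search", "text": "browser tests"},
    {"action": "wait", "seconds": 1},    # special action: wait before asserting
    {"action": "assert_text", "selector": "h1", "expected": "Browser Testing"},
]

# A well-formed scenario always ends with an assertion step.
assert steps[-1]["action"].startswith("assert")
print(len(steps))  # 5
```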

Best Practices

  1. Global Test Locations:
    Use Middleware's managed locations to test public websites from key regions (e.g., Americas, EMEA, APAC).
  2. Private Networks:
    Use private locations to test internal-facing applications securely.
  3. Frequent Testing:
    Configure tests at shorter intervals for mission-critical applications.
  4. Error Debugging:
    Use recorded test results, screenshots, and logs to debug issues effectively.

Locations

Middleware supports the following managed locations for public website testing:

  • Americas:
    Examples: Northern Virginia (AWS), São Paulo (AWS), Northern California (AWS).
  • APAC:
    Examples: Tokyo (AWS), Sydney (AWS), Mumbai (AWS).
  • EMEA:
    Examples: London (AWS), Frankfurt (AWS), Cape Town (AWS).