Reporting

Last updated 1 month ago

Steadybit's integrated reporting feature provides a comprehensive overview of your experiments, experiment runs, and your adoption of Steadybit in terms of users, teams, and environments.

You can click on the labels below any report to filter the data shown in the graphs.

All charts can be downloaded as PNG or PDF files.

Experiment Runs

The Experiment Runs report gives you an overview of all experiment runs that have been executed.

You can filter the reports by the following criteria:

  • Timeframe

  • Teams

  • Environments

Number of Runs

Find out how many experiments your teams have run in total.

Attack Types

Identify which attacks your teams have used most frequently.

Trigger

Check out what typically triggers an experiment run, e.g., API, CLI, UI, or schedule.

Result

Drill down into the experiment runs by the result and compare the numbers of completed, canceled, failed, and errored experiment runs.

Result (Completed vs. Failed)

Compare the proportion of completed experiment runs to failed experiment runs to see how often your experiments uncover issues.

Issues Discovered

Identify how many experiment runs turned from completed to failed. We count experiment failures that were immediately preceded by a completed experiment run.

Issues Fixed

Identify how many experiment runs turned from failed to completed. We count completed experiment runs that were immediately preceded by a failed experiment run.
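The two counting rules above can be sketched as a simple pass over an ordered run history. This is an illustrative example, not Steadybit code; the function name and the result labels are assumptions made for the sketch:

```python
# Illustrative sketch (not Steadybit code): count "issues discovered" and
# "issues fixed" from an ordered list of run results. A failure immediately
# preceded by a completed run counts as an issue discovered; a completed run
# immediately preceded by a failure counts as an issue fixed.

def count_transitions(results: list[str]) -> tuple[int, int]:
    """Return (issues_discovered, issues_fixed) for an ordered run history."""
    discovered = fixed = 0
    # Walk over adjacent pairs of runs, oldest first.
    for prev, curr in zip(results, results[1:]):
        if prev == "COMPLETED" and curr == "FAILED":
            discovered += 1
        elif prev == "FAILED" and curr == "COMPLETED":
            fixed += 1
    return discovered, fixed

runs = ["COMPLETED", "FAILED", "FAILED", "COMPLETED", "COMPLETED"]
print(count_transitions(runs))  # (1, 1)
```

Note that only direct completed-to-failed (and vice versa) transitions are counted, so a run of repeated failures contributes at most one discovered issue.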

Experiments

The Experiments report gives you an overview of experiments that have been created in your environment.

You can filter the report by the following criteria:

  • Timeframe

  • Teams

  • Environments

Number of Experiments

Find out how many experiments your teams have designed in total.

Creation Channel

Identify which channel is used most across your teams to create an experiment: UI, API, or CLI.

Creation Method

Identify which method is used most across your teams to create an experiment: from scratch, from a template, or from advice.

Others

These reports give you an overview of the adoption of Steadybit.

You can filter the reports by timeframe.

Users

Track the progress of your Steadybit rollout across your organization via the number of invited users.

Teams

Easily report on the number of teams with access to safe Chaos Engineering in your organization.

Environments

Find out how many environments you have created to roll out safe Chaos Engineering across your organization.

Example of a report showing experiment run results