Results API
The @cypress/extract-cloud-results module provides the getAccessibilityResults utility, which enables you to programmatically fetch your run's Accessibility results in a CI environment. It determines the Cypress run created for the given CI workflow and returns the Accessibility results associated with that run. The results are returned once the Cypress run has finished and the Accessibility report has been processed.
This allows you to review the results within CI and determine whether they are acceptable or need to be addressed before code changes can merge. It provides overall accessibility scores and violation details for the run, as well as page- or component-level feedback. This supports a wide variety of needs, from alerting on failures in specific focus areas of the application to fine-grained regression monitoring based on the standards each page currently meets.
Examples and use cases
This page focuses on how the Results API works and what kinds of information can be accessed. For examples of how this can be used, see our higher-level guides:
- The pull requests and policies guide shows what it looks like to use the Results API to set a policy and fail a pull request.
- The guide for detecting and managing changes shows some other common use cases.
Supported CI Providers
Accessibility results can be fetched for runs recorded on the following CI providers. See the docs below for information on general setup.
- Azure (requires Cypress v13.13.1 or later)
- CircleCI (requires Cypress v13.13.1 or later)
- GitHub Actions
- GitLab
- Jenkins
- AWS CodeBuild
- Drone
Please reach out to Cypress Support to request support for a different provider.
Installation
Install the @cypress/extract-cloud-results module in your install step in CI.
npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
Do not check this module in as a dependency. You should install it separately outside of your normal module installation. Use --force to get the latest version.
If you check this in as a dependency, your installation will fail when we update the package.
Usage
1. Get the Results
Write a script using the getAccessibilityResults utility to retrieve the results and perform one or more assertions to verify if the changes are acceptable. This script will be executed in CI.
Basic example
This snippet uses the getAccessibilityResults() helper to log the results. It assumes your Project ID and Record Key environment variables are set. The following should work in any of the supported CI providers out of the box:
// Assuming these environment variables are set:
// CYPRESS_PROJECT_ID=your-id
// CYPRESS_RECORD_KEY=your-record-key
const { getAccessibilityResults } = require('@cypress/extract-cloud-results')

getAccessibilityResults().then((results) => {
  // use `console.dir` instead of `console.log` because the data is nested
  console.dir(results, { depth: Infinity })
})
How to assert that only known rules are failing in the run
The Cypress App repository uses the Results API to ensure no new violations have been introduced. You can reference this script as a real example.
const { getAccessibilityResults } = require('@cypress/extract-cloud-results')

/**
 * The list of rules that currently have 1+ elements that have been flagged with
 * violations within the Cypress Accessibility report that need to be addressed.
 *
 * Once the violation is fixed in the Accessibility report,
 * the fixed rule should be removed from this list.
 *
 * View the Accessibility report for the Cypress run in the Cloud
 * for more details on how to address these failures.
 */
const rulesWithExistingViolations = [
  'aria-required-children',
  'empty-heading',
  'aria-dialog-name',
  'link-in-text-block',
  'list',
]

getAccessibilityResults({
  projectId: '...', // optional if set from env
  recordKey: '...', // optional if set from env
  runTags: [process.env.RUN_TAGS], // required if recording multiple runs
}).then((results) => {
  const { runNumber, accessibilityReportUrl, summary, rules } = results
  const { total } = summary.violationCounts

  console.log(
    `Received ${summary.isPartialReport ? 'partial ' : ''}results for run #${runNumber}.`
  )
  console.log(`See the full report at ${accessibilityReportUrl}.`)

  // write your logic to conditionally fail based on the results
  if (total === 0) {
    console.log('No Accessibility violations detected!')
    return
  }

  const { critical, serious, moderate, minor } = summary.violationCounts
  console.log(`${total} Accessibility violations were detected:`)
  console.log(`  - ${critical} critical`)
  console.log(`  - ${serious} serious`)
  console.log(`  - ${moderate} moderate`)
  console.log(`  - ${minor} minor`)

  const newRuleViolations = rules.filter((rule) => {
    return !rulesWithExistingViolations.includes(rule.name)
  })

  if (newRuleViolations.length > 0) {
    console.error(
      'The following rules were violated that were previously passing:'
    )
    console.error(newRuleViolations)
    throw new Error(
      `${newRuleViolations.length} rule regressions were introduced and must be fixed.`
    )
  }

  if (total < rulesWithExistingViolations.length) {
    console.warn(
      `It seems you have resolved ${
        rulesWithExistingViolations.length - total
      } rule(s). Remove them from the list of problematic rules so regressions are not introduced.`
    )
  }

  console.log('No new Accessibility violations detected!')
})
getAccessibilityResults Arguments
getAccessibilityResults uses the following attributes to identify the Cypress run and return the Accessibility results:
getAccessibilityResults({
  // The Cypress project ID.
  // Optional if the CYPRESS_PROJECT_ID env var is set.
  // Can be explicitly passed to override the env var.
  projectId: string,

  // The project's record key.
  // Optional if the CYPRESS_RECORD_KEY env var is set.
  // Can be explicitly passed to override the env var.
  recordKey: string,

  // The run tags associated with the run.
  // Required IF you are recording multiple Cypress runs from a single CI build.
  // Pass the run tags you used when recording each run.
  // See below for more information.
  runTags: string[],
})
Result Types
The Accessibility results for the run are returned as an object containing the following data:
{
  // The run number of the identified build.
  runNumber: number

  // The run url for the identified build.
  runUrl: 'https://cloud.cypress.io/projects/:project_id/runs/:run_number'

  // The status of the identified build.
  runStatus: 'passed' | 'failed' | 'errored' | 'timedOut' | 'cancelled' | 'noTests'

  // The url that deep links into the summarized Accessibility report for the identified build.
  accessibilityReportUrl: 'https://cloud.cypress.io/[...]'

  // The axe-core library version used when generating the Accessibility report.
  // See https://github.com/dequelabs/axe-core. Example: 4.10.0
  axeVersion: string

  summary: {
    // Indicates whether a complete Accessibility report was generated.
    // For example, if a run was cancelled and the report expected to run
    // for 20 specs, but only 10 ran, this would result in a partial report.
    isPartialReport: boolean

    // The total detected violations and the breakdown by rule severity.
    violationCounts: {
      // The count of unique rules that detected a violation.
      total: number,
      // The count of unique critical rules that detected a violation.
      critical: number,
      // The count of unique serious rules that detected a violation.
      serious: number,
      // The count of unique moderate rules that detected a violation.
      moderate: number,
      // The count of unique minor rules that detected a violation.
      minor: number,
    },

    // The accessibility score for the run as a whole, to two decimal places.
    score: number,

    // The count of distinct failed elements detected during the run, across all views.
    failedElements: number
  }

  // The list of violated rules.
  rules: [{
    // The name of the rule. See https://github.com/dequelabs/axe-core/blob/develop/doc/rule-descriptions.md.
    name: string

    // The likely impact the rule has on a user with a disability.
    severity: 'critical' | 'serious' | 'moderate' | 'minor'

    // The status of the rule for the run.
    status: 'violation'

    // The url that deep links into the report for this specific rule violation.
    accessibilityReportUrl: 'https://cloud.cypress.io/[...]'
  }]

  // The list of views with accessibility violations detected,
  // and details of the failed rules on each view.
  views: [
    {
      // The url that deep links into the report for this view, with no rule preselected.
      accessibilityReportUrl: 'https://cloud.cypress.io/[...]'

      // The name of the view as it appears in the accessibility report.
      displayName: '/app/get-started/why-cypress',

      // The accessibility score for this particular view.
      score: number

      // The list of violated rules for this specific view.
      rules: Rule[]
    }
  ]
}
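Because each entry in the views array carries its own score and rules, the results can also drive page-level policies rather than run-wide ones. The sketch below filters out views that fall under a minimum score; the findFailingViews helper name and the threshold of 90 are illustrative, not part of the module:

```javascript
// Hypothetical helper: given the `views` array from getAccessibilityResults()
// and a minimum acceptable score, return the views that fall below it,
// along with the names of the rules that failed on each one.
function findFailingViews(views, minScore) {
  return views
    .filter((view) => view.score < minScore)
    .map((view) => ({
      displayName: view.displayName,
      score: view.score,
      failedRules: view.rules.map((rule) => rule.name),
    }))
}

// Usage sketch (assumes the results shape documented above):
// getAccessibilityResults().then(({ views }) => {
//   const failing = findFailingViews(views, 90)
//   if (failing.length > 0) {
//     console.error('Views below the minimum score:', failing)
//     process.exit(1)
//   }
// })
```

A pure helper like this is easy to unit test against a hand-written results object before wiring it into CI.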
2. Add to CI Workflow
In your CI workflow that runs your Cypress tests:
- Update your install job to install the @cypress/extract-cloud-results module.
- Pass in the necessary arguments to getAccessibilityResults.
- Add a new step to the job that runs your Cypress tests to verify the Accessibility results.
If you record multiple runs in a single CI build, you must record these runs using the --tag parameter and then call getAccessibilityResults with the runTags argument for each run.
This is necessary to identify each unique run and return a corresponding set of results. The tags are how each run is uniquely identified.
Example
- Let's imagine that within a single CI build you call cypress run --record multiple times because you're running one set of tests against a staging environment, followed by a production environment.
- In this scenario, you pass a different --tag to each cypress run:
  - cypress run --record --tag staging
  - cypress run --record --tag production
- When calling getAccessibilityResults, you would then pass these same tags to get the unique set of results for each run:
  - getAccessibilityResults({ runTags: ['staging'] })
  - getAccessibilityResults({ runTags: ['production'] })
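When results are fetched per tag like this, a small helper can aggregate the outcome across environments. This is a sketch, not part of the module; tagsWithViolations is a hypothetical name, and the pure-function shape makes it testable without contacting the Cloud:

```javascript
// Hypothetical helper: given an array of { tag, results } pairs collected from
// getAccessibilityResults({ runTags: [tag] }) calls, return the tags whose
// runs reported at least one rule with violations.
function tagsWithViolations(taggedResults) {
  return taggedResults
    .filter(({ results }) => results.summary.violationCounts.total > 0)
    .map(({ tag }) => tag)
}

// Usage sketch in CI (assumes runs recorded with --tag staging / --tag production):
// const { getAccessibilityResults } = require('@cypress/extract-cloud-results')
// Promise.all(
//   ['staging', 'production'].map((tag) =>
//     getAccessibilityResults({ runTags: [tag] }).then((results) => ({ tag, results }))
//   )
// ).then((runs) => {
//   const failing = tagsWithViolations(runs)
//   if (failing.length > 0) {
//     throw new Error(`Accessibility violations detected in: ${failing.join(', ')}`)
//   }
// })
```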
Example Job Workflow Update:
- GitHub Actions
- GitLab
- Jenkins
- Azure
- CircleCI
- AWS CodeBuild
- Drone
GitHub Actions:
name: My Workflow
on: push
env:
  CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
jobs:
  run-cypress:
    runs-on: ubuntu-24.04
    steps:
      - name: Checkout
        uses: actions/checkout@v4
      - name: install
        run: npm install
      - name: Run
        run: npx cypress run --record
+     - name: Verify Accessibility Results
+       run: |
+         npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+         node ./scripts/verifyAccessibilityResults.js
GitLab:
name: Run Cypress Tests
image: node:latest
stages:
  - test
run-cypress:
  stage: test
  secrets:
    CYPRESS_RECORD_KEY:
      vault: vault/cypressRecordKey
  script:
    - npm install
    - npx cypress run --record
+   - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+   - node ./scripts/verifyAccessibilityResults.js
Jenkins:
pipeline {
  agent {
    docker {
      image 'cypress/base:22.15.0'
    }
  }
  environment {
    CYPRESS_PROJECT_ID = 'xxxx'
    CYPRESS_RECORD_KEY = credentials('cypress-record-key')
  }
  stages {
    stage('build and test') {
      steps {
        sh 'npm ci'
        sh 'npx cypress run --record'
      }
    }
+   stage('Verify Accessibility Results') {
+     steps {
+       sh 'npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz'
+       sh 'node ./scripts/verifyAccessibilityResults.js'
+     }
+   }
  }
}
Azure:
jobs:
  - job: run_tests
    pool:
      vmImage: 'ubuntu-latest'
    steps:
      - task: NodeTool@0
        inputs:
          versionSpec: '20.x'
        displayName: 'Install Node.js'
      - script: npm i
        displayName: 'Install npm dependencies'
      - script: npx cypress run --record
        displayName: 'Run Cypress tests'
        env:
          # avoid warnings about terminal
          TERM: xterm
          CYPRESS_RECORD_KEY: $(CYPRESS_RECORD_KEY)
+     - script: |
+         npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+         node ./scripts/verifyAccessibilityResults.js
+       displayName: 'Verify Accessibility Results'
+       env:
+         CYPRESS_PROJECT_ID: $(CYPRESS_PROJECT_ID)
+         CYPRESS_RECORD_KEY: $(CYPRESS_RECORD_KEY)
CircleCI:
version: 2.1
jobs:
  linux-test:
    docker:
      - image: cypress/base:22.15.0
    working_directory: ~/repo
    steps:
      - checkout
      - run: npm install
      - run: npx cypress run --record
+     - run: npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+     - run: node ./scripts/verifyAccessibilityResults.js
workflows:
  version: 2
  tests:
    jobs:
      - linux-test
AWS CodeBuild:
phases:
  install:
    runtime-versions:
      nodejs: latest
    commands:
      # Set COMMIT_INFO variables to send Git specifics to Cypress Cloud when recording
      # https://docs.cypress.io/app/continuous-integration/overview#Git-information
      - export COMMIT_INFO_BRANCH="$(git rev-parse HEAD | xargs git name-rev | cut -d' ' -f2 | sed 's/remotes\/origin\///g')"
      - export COMMIT_INFO_MESSAGE="$(git log -1 --pretty=%B)"
      - export COMMIT_INFO_EMAIL="$(git log -1 --pretty=%ae)"
      - export COMMIT_INFO_AUTHOR="$(git log -1 --pretty=%an)"
      - export COMMIT_INFO_SHA="$(git log -1 --pretty=%H)"
      - export COMMIT_INFO_REMOTE="$(git config --get remote.origin.url)"
      - npm ci
  pre_build:
    commands:
      - npm run cypress:verify
      - npm run cypress:info
  build:
    commands:
      - CYPRESS_INTERNAL_ENV=staging CYPRESS_PROJECT_ID=[slug] npx cypress run --record --key [KEY]
+ post_build:
+   commands:
+     - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+     - CYPRESS_INTERNAL_ENV=staging CYPRESS_PROJECT_ID=[slug] CYPRESS_RECORD_KEY=[KEY] node ./scripts/verifyAccessibilityResults.js
Drone:
kind: pipeline
name: default
environment:
  CYPRESS_PROJECT_ID: example_project_slug
  CYPRESS_RECORD_KEY:
    from_secret: example_record_key_secret
steps:
  - name: test
    image: node:latest
    commands:
      - npm install
      - npx cypress run --record
+ - name: validate
+   image: node:latest
+   commands:
+     - npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
+     - node ./scripts/verifyAccessibilityResults.js
Required CI environment variables
The @cypress/extract-cloud-results helper cross-references some environment variables from where it is executed with ones that were present when a Cypress Cloud run was recorded. This allows for automatically detecting the correct Cloud run when the Results API is invoked from the same CI context as a given run (as is the case in the above examples).
For more complex setups, or for local iteration on your Results API handler code, it can be useful to know what variables Cypress is looking for so that you can make sure they are passed through where they are needed.
Likewise, if you want to use the Results API locally to pull the data for a specific run (within the last 7 days), you can set these variables locally to match what was present in CI.
Local development example
If you executed a run in GitHub Actions and it was recorded to Cypress Cloud, you would set these five environment variables to replicate the context of that run locally and execute your local handler script. This is a great way to iterate on your script and verify everything is working as expected, without having to integrate anything in CI. It's also useful for debugging.
CYPRESS_PROJECT_ID=AAA \
CYPRESS_RECORD_KEY=BBB \
GITHUB_ACTIONS=true \
GITHUB_RUN_ID=111 \
GITHUB_RUN_ATTEMPT=1 \
node verifyAccessibilityResults.js
The Results API will then look for the Cypress Cloud run that matches this run ID. If there is more than one Cypress Cloud run found for that GitHub Actions Run, you can pass run tags to narrow down to one run's report.
Supported CI Provider Overview
Each CI provider has a unique combination of components, patterns, and environment variables that must be interpreted by this module.
GitHub Actions
Reference: https://docs.github.com/en/actions/learn-github-actions/understanding-github-actions
Essential environment variables
- GITHUB_ACTIONS - Presence identifies the environment as a GitHub Actions environment.
- GITHUB_RUN_ID - Value uniquely identifies a GitHub Actions workflow instance. Value does not change as jobs in the workflow are re-executed.
- GITHUB_RUN_ATTEMPT - Value identifies the workflow instance's attempt index. Value is incremented each time jobs are re-executed.
Full environment variable reference: https://docs.github.com/en/actions/learn-github-actions/variables#default-environment-variables
Prerequisites
- The run to validate and this module's validation script are being executed within the same workflow.
- The module script is always executed after the run to validate has been created. This can be achieved by either:
  - Executing the module script in a separate job that is dependent upon the job that records the run (using the needs: [job-name] option in the config), or
  - Executing the module script in serial with the Cypress recording in the same job.
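For the dependent-job option, the wiring might look like the following sketch. The job names, script path, and secret name are illustrative, not prescribed by the module:

```yaml
jobs:
  run-cypress:
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v4
      - run: npm install
      - run: npx cypress run --record
        env:
          CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
  verify-results:
    # `needs` guarantees this job starts only after the run has been recorded
    needs: [run-cypress]
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v4
      - run: npm install --force https://cdn.cypress.io/extract-cloud-results/v1/extract-cloud-results.tgz
      - run: node ./scripts/verifyAccessibilityResults.js
        env:
          CYPRESS_RECORD_KEY: ${{ secrets.CYPRESS_RECORD_KEY }}
```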
GitLab Pipelines
Reference: https://docs.gitlab.com/ee/ci/pipelines/
Essential environment variables
- GITLAB_CI - Presence identifies the environment as a GitLab CI environment.
- CI_PIPELINE_ID - Value uniquely identifies a GitLab pipeline workflow. This value does not change as jobs in the pipeline are retried.
- CI_JOB_NAME - Value uniquely identifies a single job name within a pipeline. Ex. run-e2e
- CI_JOB_ID - Value uniquely identifies an execution instance of a job. This value will change each time a job is executed/re-executed.
Full environment variable reference: https://docs.gitlab.com/ee/ci/variables/predefined_variables.html
Prerequisites
- The run to validate and this module's validation script are being executed within the same pipeline.
- The module script is always executed after the run to validate has been created. This can be achieved by:
  - Executing the module script in a separate job that is dependent upon the job that records the run (using the needs: [job-name] option in the config), or
  - Executing the module script in a separate job that runs in a later stage than the job that records the Cypress run, or
  - Executing the module script in serial with the Cypress recording in the same job.
Jenkins
Reference: https://www.jenkins.io/doc/
Jenkins is heavily customizable through the use of plugins, which limits the assumptions we can make about available environment variables and overall behavior.
We have implemented Jenkins support within this module using the broadest set of available default values. For the purposes of this documentation, though, we will discuss terms related to Jenkins Pipeline support: https://www.jenkins.io/doc/book/pipeline/getting-started/
Essential environment variables
- JENKINS_HOME - Presence identifies the environment as a Jenkins environment.
- BUILD_URL - Value uniquely identifies a Jenkins job execution, including name and id characteristics.
Full environment variable reference: https://www.jenkins.io/doc/book/pipeline/jenkinsfile/#using-environment-variables
Prerequisites
- The run to validate and this module's validation script are being executed within the same job.
- The module script is always executed after the run to validate has been created. This can be achieved by executing the module script in serial with the Cypress recording in the same job.
Azure
Note: Cypress v13.13.1 is the earliest Cypress release that records the environment variables necessary for this module to identify runs in an Azure environment. Previous Cypress versions are not supported in Azure pipelines.
Essential environment variables
- TF_BUILD and AZURE_HTTP_USER_AGENT - Combined presence identifies the environment as an Azure pipeline environment.
- SYSTEM_PLANID - Value uniquely identifies a pipeline run. Value does not change as jobs within the pipeline are retried from failure.
- SYSTEM_JOBID - Value uniquely identifies a job execution. Value changes each time a job is retried from failure, in conjunction with the SYSTEM_JOBATTEMPT being incremented.
- SYSTEM_JOBATTEMPT - Value identifies the pipeline's shared attempt index. Value is incremented when jobs are retried from failure.
Full environment variable reference: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
Prerequisites
- The run to validate and this module's validation script are being executed within the same pipeline run (i.e. they share a SYSTEM_PLANID value).
- The module script is always executed after the run to validate has been created. This can be achieved by either:
  - Executing the module script in a separate job that is dependent upon the job that records the run (using the dependsOn: [job-name] option in the config), or
  - Executing the module script in serial with the Cypress recording in the same job.
CircleCI
Reference: https://circleci.com/docs/about-circleci/
Note: Cypress v13.13.1 is the earliest Cypress release that records the environment variables necessary for this module to identify runs in a CircleCI environment. Previous Cypress versions are not supported in CircleCI pipelines.
Essential environment variables
- CIRCLECI - Presence identifies the environment as a CircleCI environment.
- CIRCLE_PIPELINE_ID - Value uniquely identifies a CircleCI pipeline, created on push or manually triggered through the UI. This value does not change as workflows within the pipeline are re-executed.
- CIRCLE_WORKFLOW_ID - Value uniquely identifies an instance of a workflow's execution within a pipeline. This value will be updated upon each workflow execution; in other words, retrying a workflow from failure from the Circle UI will create a new workflow with a new CIRCLE_WORKFLOW_ID value available to the jobs executed within it.
- CIRCLE_WORKFLOW_JOB_ID - Value uniquely identifies an execution instance of a named job within a workflow instance.
Full environment variable reference: https://circleci.com/docs/variables/
Prerequisites
- The run to validate and this module's validation script are being executed within the same pipeline and workflow.
- The module script is always executed after the run to validate has been created. This can be achieved by:
  - Executing the module script in a separate job that is dependent upon the job that records the run (using the requires: [job-name] option in the config), or
  - Executing the module script in serial with the Cypress recording in the same job.
AWS CodeBuild
Reference: https://docs.aws.amazon.com/codebuild/
Essential environment variables
- CODEBUILD_BUILD_ID - Presence identifies the environment as an AWS CodeBuild environment. Value uniquely identifies a build.
Full environment variable reference: https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-env-vars.html
Prerequisites
- The run to validate and this module's validation script are being executed within the same build.
- The module script is always executed after the run to validate has been created. This can be achieved by executing the module script in serial with the cypress recording in the same build.
Drone
Reference: https://docs.drone.io/pipeline/overview/
Essential environment variables
- DRONE - Presence identifies the environment as a Drone environment.
- DRONE_BUILD_NUMBER - Value uniquely identifies a Drone build.
Full environment variable reference: https://docs.drone.io/pipeline/environment/reference/
Prerequisites
- The run to validate and this module's validation script are being executed within the same build.
- The module script is always executed after the run to validate has been created. This can be achieved by executing the module script in serial with the cypress recording in the same build.