What you'll learn
- How to run Cypress tests with AWS CodeBuild as part of CI/CD pipeline
- How to parallelize Cypress test runs within AWS CodeBuild
With AWS CodeBuild, Amazon Web Services (AWS) offers developers "...a fully managed build service that compiles your source code, runs unit tests, and produces artifacts that are ready to deploy", allowing teams to "pay only for the build time you use".
Detailed documentation is available in the AWS CodeBuild Documentation.
The example below is a basic CI setup and job using AWS CodeBuild to run Cypress tests within the Electron browser. This AWS CodeBuild configuration is placed within a `buildspec.yml` file.
```yaml
## buildspec.yml
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: latest
    commands:
      - npm ci
  pre_build:
    commands:
      - npm run cy:verify
      - npm run cy:info
  build:
    commands:
      - npm run start:ci &
      - npx cypress run --record
```
Try it out
How this buildspec works:
- On push to this repository, this job will provision and start an AWS-hosted Amazon Linux instance with Node.js to run the outlined `build` for the declared commands within the `commands` section of the configuration.
- AWS CodeBuild will check out our code from our GitHub repository.
- Finally, our `buildspec.yml` will:
  - Install npm dependencies (`npm ci`)
  - Start the project web server (`npm run start:ci`)
  - Run the Cypress tests within our GitHub repository within Electron.
Testing in Chrome and Firefox with Cypress Docker Images
As of version 0.2, CodeBuild does not provide a way to specify a custom image for single build configurations. One way to solve this is to use an AWS CodeBuild build-list strategy.
AWS CodeBuild offers a build-list strategy of different job configurations for a single job definition.
The Cypress team maintains the official Docker Images for running Cypress locally and in CI, which are built with Google Chrome and Firefox. For example, this allows us to run the tests in Firefox by passing the `--browser firefox` option to `cypress run`.
Cypress Amazon Public ECR
The images are available in the following Amazon ECR Public Galleries:

- cypress/base
- cypress/browsers
- cypress/included
Choosing the right Docker Image
For end-to-end tests on a CI provider like AWS CodeBuild, the Cypress 'browsers' Amazon ECR Public Gallery contains the images to use.
What's the difference in the images?
- `base` images provide the base operating system and the set of initial dependencies used by the other images, but do not install Cypress or additional browsers.
- `browsers` images extend a `base` image and install one or more browsers such as Chrome or Firefox.
- `included` images extend a `browsers` image and install a specific version of Cypress, adding a Docker entrypoint for the `cypress run` command. These images are for testing a containerized version of Cypress in a project during local development and are not used in CI environments.
```yaml
## buildspec.yml
version: 0.2

## AWS CodeBuild Batch configuration
## https://docs.aws.amazon.com/codebuild/latest/userguide/batch-build-buildspec.html
## Define build to run using the "cypress/browsers:node12.14.1-chrome85-ff81" image
## from the Cypress Amazon ECR Public Gallery
batch:
  fast-fail: false
  build-list:
    - identifier: cypress-e2e-tests
      env:
        variables:
          IMAGE: public.ecr.aws/cypress-io/cypress/browsers:node12.14.1-chrome85-ff81

phases:
  install:
    runtime-versions:
      nodejs: latest
    commands:
      - npm ci
  pre_build:
    commands:
      - npm run cy:verify
      - npm run cy:info
  build:
    commands:
      - npm run start:ci &
      - npx cypress run --record --browser firefox
```
Caching Dependencies and Build Artifacts
Caching with AWS CodeBuild directly can be challenging.
The Build caching in AWS CodeBuild document offers details on local or Amazon S3 caching.
Per the documentation, "Local caching stores a cache locally on a build host that is available to that build host only". This will not be useful during parallel test runs.
Alternatively, "Amazon S3 caching stores the cache in an Amazon S3 bucket that is available across multiple build hosts". While this may sound useful, in practice the upload of cached dependencies can take some time. Furthermore, each worker will attempt to save its dependency cache to Amazon S3, which increases build time significantly.
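If S3 caching is still desired despite these caveats, a buildspec `cache` section can declare the paths to persist. The sketch below is an assumption for illustration; the Cypress binary cache path assumes the default location on Linux build hosts (`/root/.cache/Cypress`), and S3 caching must also be enabled in the CodeBuild project settings for it to take effect:

```yaml
## Hypothetical cache section for the buildspec above
cache:
  paths:
    ## npm dependencies installed by `npm ci`
    - 'node_modules/**/*'
    ## Cypress binary cache (default Linux location)
    - '/root/.cache/Cypress/**/*'
```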
Although beyond the scope of this guide, AWS CodePipeline may be of use to cache the initial source, dependencies, and build output for use in AWS CodeBuild jobs using AWS CodePipeline Input and Output Artifacts.
Reference the AWS CodePipeline integration with CodeBuild and multiple input sources and output artifacts sample example for details on how to configure a CodePipeline with an output artifact.
AWS CodeBuild offers a build-matrix strategy for declaring different job configurations for a single job definition. The build-matrix strategy provides an option to specify a container image for the job. Jobs declared within a build-matrix strategy can run in parallel, which enables us to run multiple instances of Cypress at the same time, as we will see later in this section.
The Cypress team maintains the official Docker Images for running Cypress locally and in CI, which are built with Google Chrome and Firefox. This allows us to run the tests in Firefox by passing the `--browser firefox` option to `cypress run`.
The following configuration with the `--record` option to Cypress requires a subscription to the Cypress Dashboard.
Parallelizing the build
To set up multiple containers to run in parallel, the `build-matrix` configuration uses a set of variables (`CY_GROUP_SPEC` and `WORKERS`) with a list of items specific to each group for the build.

The fields of each `CY_GROUP_SPEC` item are delimited by a pipe (`|`) character as follows:
```yaml
## Group Name | Browser | Specs | Cypress Configuration options (optional)
'UI - Chrome - Mobile|chrome|cypress/tests/ui/*|viewportWidth=375,viewportHeight=667'
```
`build-matrix` will run all permutations of the delimited items.
```yaml
batch:
  fast-fail: false
  build-matrix:
    # ...
    dynamic:
      env:
        # ...
        variables:
          CY_GROUP_SPEC:
            - 'UI - Chrome|chrome|cypress/tests/ui/*'
            - 'UI - Chrome - Mobile|chrome|cypress/tests/ui/*|viewportWidth=375,viewportHeight=667'
            - 'API|chrome|cypress/tests/api/*'
            - 'UI - Firefox|firefox|cypress/tests/ui/*'
            - 'UI - Firefox - Mobile|firefox|cypress/tests/ui/*|viewportWidth=375,viewportHeight=667'
```
During the `install` phase, we utilize shell scripting with the `cut` command to assign values from the delimited `CY_GROUP_SPEC` passed to the worker into shell variables that will be used in the `build` phase when running `cypress run`.
```yaml
batch:
  # ...

phases:
  install:
    commands:
      - CY_GROUP=$(echo $CY_GROUP_SPEC | cut -d'|' -f1)
      - CY_BROWSER=$(echo $CY_GROUP_SPEC | cut -d'|' -f2)
      - CY_SPEC=$(echo $CY_GROUP_SPEC | cut -d'|' -f3)
      - CY_CONFIG=$(echo $CY_GROUP_SPEC | cut -d'|' -f4)
      - npm ci
      ## ...
```
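The same `cut` parsing can be checked locally in any POSIX shell before committing the buildspec; the group spec value below is taken from the example list above:

```shell
#!/bin/sh
# Split a pipe-delimited group spec into its four fields with `cut`
CY_GROUP_SPEC='UI - Chrome - Mobile|chrome|cypress/tests/ui/*|viewportWidth=375,viewportHeight=667'

CY_GROUP=$(echo "$CY_GROUP_SPEC" | cut -d'|' -f1)
CY_BROWSER=$(echo "$CY_GROUP_SPEC" | cut -d'|' -f2)
CY_SPEC=$(echo "$CY_GROUP_SPEC" | cut -d'|' -f3)
CY_CONFIG=$(echo "$CY_GROUP_SPEC" | cut -d'|' -f4)

echo "group=$CY_GROUP"      # group=UI - Chrome - Mobile
echo "browser=$CY_BROWSER"  # browser=chrome
echo "spec=$CY_SPEC"        # spec=cypress/tests/ui/*
echo "config=$CY_CONFIG"    # config=viewportWidth=375,viewportHeight=667
```

Note that the value is quoted here so the `*` in the spec pattern is not glob-expanded by the shell.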
To parallelize the runs, we need to add an additional variable, `WORKERS`, to the build-matrix strategy:
```yaml
batch:
  fast-fail: false
  build-matrix:
    # ...
    dynamic:
      env:
        # ...
        variables:
          CY_GROUP_SPEC:
            # ...
          WORKERS:
            - 1
            - 2
            - 3
            - 4
            - 5
```
The `WORKERS` array is filled with filler (or dummy) items to provision the desired number of CI machine instances within the build-matrix strategy, and will provide 5 workers to each group defined in the `CY_GROUP_SPEC`.
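Because `build-matrix` runs every permutation of its variables, the total number of parallel builds is the product of the list lengths; a quick sanity check with the counts from the example above:

```shell
#!/bin/sh
# Permutations run by build-matrix = groups x workers
GROUP_COUNT=5    # entries in CY_GROUP_SPEC
WORKER_COUNT=5   # filler entries in WORKERS
echo $((GROUP_COUNT * WORKER_COUNT))  # prints 25
```

Each of those 25 builds picks up one `CY_GROUP_SPEC` / `WORKERS` pairing, and the Cypress Dashboard load-balances the specs across the 5 workers of each group.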
Finally, the script variables are passed to the call to `cypress run`:
```yaml
phases:
  install:
    # ...
  build:
    commands:
      - npm run start:ci &
      - npx cypress run --record --parallel --browser $CY_BROWSER --ci-build-id $CODEBUILD_INITIATOR --group "$CY_GROUP" --spec "$CY_SPEC" --config "$CY_CONFIG"
```
Using the Cypress Dashboard with AWS CodeBuild
In the AWS CodeBuild configuration we have defined in the previous section, we are leveraging several useful features of the Cypress Dashboard:
- In-depth and shareable test reports.
- Visibility into test failures via quick access to error messages, stack traces, screenshots, videos, and contextual details.
- Integrating testing with the pull-request (PR) process via commit status check guards and convenient test report comments.
- Detecting flaky tests and surfacing them via Slack alerts or GitHub PR status checks.
- Organizing and consolidating multiple `cypress run` calls by labeled groups into a single report within the Cypress Dashboard. In the example above we use the `--group "UI - Chrome"` flag (for the first group) to organize all UI tests for the Chrome browser into a group labeled "UI - Chrome" in the Cypress Dashboard report.
Cypress Real World Example with AWS CodeBuild
A complete CI workflow against multiple browsers, viewports and operating systems is available in the Real World App (RWA).