How to Validate a Vaadin 8 Project with GitHub Actions

Tatu Lund edited this page Mar 7, 2026 · 2 revisions

This project uses GitHub Actions to validate every commit of a multi‑module Vaadin 8 application. The workflow compiles all modules, runs unit and TestBench UI tests, caches Maven dependencies and publishes artifacts such as error screenshots and JUnit XML results.

This guide explains how the existing .github/workflows/validation.yaml is structured, why those choices matter for Vaadin 8, and how you can apply the same pattern in your own project.


1. Overview of the validation workflow

The workflow is defined in .github/workflows/validation.yaml and triggers on every push:

name: Java CI

on: [push]

env:
  VAADIN_PRO_KEY: ${{ secrets.PRO_KEY }}

jobs:
  build:
    runs-on: ubuntu-latest

    permissions:
      checks: write
      pull-requests: write

    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 21
        uses: actions/setup-java@v3
        with:
          java-version: '21'
          distribution: 'temurin'
      - name: Install Chrome
        uses: browser-actions/setup-chrome@latest
        with:
          chrome-version: stable
      - name: Cache Maven repository
        uses: actions/cache@v4
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-
      - name: Build with Maven
        run: mvn --batch-mode --update-snapshots install -DskipTests
      - name: Run tests
        run: mvn --batch-mode verify -Pit -DghActions=true
      - name: Upload error screenshots
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: error-screenshots
          path: vaadincreate-ui/error-screenshots
      - name: Publish Test Results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          files: |
            vaadincreate-backend/target/surefire-reports/*.xml
            vaadincreate-components/target/surefire-reports/*.xml
            vaadincreate-components/target/failsafe-reports/TEST-*.xml
            vaadincreate-ui/target/surefire-reports/*.xml
            vaadincreate-ui/target/failsafe-reports/TEST-*.xml

Key ideas:

  • A single job runs on ubuntu-latest with JDK 21, the newest Java version used by this project's modules (JDK 21 can also build the modules that target Java 11).
  • Chrome is installed so Vaadin TestBench UI tests can run headless in CI.
  • Maven dependencies are cached across builds to speed up Vaadin 8 compilation (GWT widgetsets, TestBench, Selenium, etc.).
  • Two Maven invocations: one for a fast install without tests, another for full verify (including the it profile) to run integration and UI tests.
  • Artifacts: error screenshots from UI tests and XML test results are uploaded so failures are easy to diagnose in the GitHub UI.

2. Dependency caching for Maven and Vaadin 8

Vaadin 8 projects, especially those using custom components and widgetsets (as in vaadincreate-components and vaadincreate-ui), tend to have heavy Maven builds that download many artifacts and compile GWT code.

The workflow uses actions/cache@v4 to cache the local Maven repository:

- name: Cache Maven repository
  uses: actions/cache@v4
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
    restore-keys: |
      ${{ runner.os }}-maven-

Why this matters in Vaadin 8:

  • GWT compilation and Vaadin widgetset builds pull in many transitive dependencies; caching prevents re‑downloading them on every run.
  • Vaadin TestBench and Selenium drivers are fairly large dependencies; caching keeps CI times manageable.
  • The cache key includes a hash of all pom.xml files, so dependencies are automatically re‑resolved when you add or update Vaadin/TestBench versions.

When you replicate this pattern in another Vaadin 8 project, keep:

  • The cache path pointing at your local Maven repo (~/.m2/repository).
  • The key including a hash of all POMs so it updates when the dependency tree changes.

3. Compiling all modules with Maven

The root POM (vaadincreate-root) defines three modules:

  • vaadincreate-backend – backend services and data model.
  • vaadincreate-components – reusable Vaadin 8 add‑on components.
  • vaadincreate-ui – the actual WAR with Vaadin 8 UI.

The build step runs a full multi‑module install:

- name: Build with Maven
  run: mvn --batch-mode --update-snapshots install -DskipTests

This ensures:

  • All modules compile with the configured Java versions (11 for components, 21 for UI).
  • The Vaadin Maven plugin (in each module) can run widgetset and theme compilation as part of the build.
  • The -DskipTests flag keeps this first pass fast, focusing on compilation and packaging.

In your own project, you can keep this phase as a no‑tests smoke build and reserve the more expensive UI tests for the next step.


4. Running unit, integration and TestBench tests

The second Maven invocation runs the full test suite, including TestBench UI tests, via the it profile:

- name: Run tests
  run: mvn --batch-mode verify -Pit -DghActions=true

Important details for Vaadin 8 and TestBench:

  • The verify phase runs:
    • Unit tests via Surefire in each module.
    • Integration/UI tests via Failsafe and TestBench where configured.
  • The it profile enables integration tests that need a running server or a browser; in this project it starts Jetty for the UI module and runs the browser‑based TestBench tests.
  • The custom property -DghActions=true can be checked in tests or configuration to tweak behavior for CI (e.g., headless mode, different base URL, stricter timeouts).
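A minimal sketch of consuming the -DghActions=true property from test code, using only the JDK's system property API (the class and method names here are illustrative, not taken from this project). Note that Surefire and Failsafe normally propagate -D properties from the Maven command line to the forked test JVM; if your configuration does not, you can pass the property explicitly via systemPropertyVariables.

```java
// Illustrative helper for reading the CI flag passed as -DghActions=true.
final class CiFlags {

    private CiFlags() {
    }

    // Boolean.getBoolean(name) returns true only when the system property
    // exists and equals "true" (case-insensitive), so the flag is false by
    // default in local runs where -DghActions is not passed at all.
    static boolean isGitHubActions() {
        return Boolean.getBoolean("ghActions");
    }
}
```

Tests can then branch on CiFlags.isGitHubActions() to pick headless mode, a CI base URL, or stricter timeouts.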

The Vaadin 8 UI POM includes TestBench and Selenium dependencies:

<dependency>
    <groupId>com.vaadin</groupId>
    <artifactId>vaadin-testbench</artifactId>
    <version>${testbench.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>${selenium.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>io.github.bonigarcia</groupId>
    <artifactId>webdrivermanager</artifactId>
    <version>${webdrivermanager.version}</version>
    <scope>test</scope>
</dependency>

Combined with the Install Chrome step in the workflow, this allows TestBench tests to run on GitHub‑hosted runners without bundling your own browser/driver binaries.

When adapting this pattern:

  • Make sure your TestBench tests can be configured for headless Chrome (for example via WebDriverManager and ChromeOptions).
  • Use profiles or system properties (like -Pit -DghActions=true) to separate local and CI behavior when needed.
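As one way to implement the headless/local split, the CI-specific Chrome flags can be computed from the flag above and passed to ChromeOptions.addArguments(...) before the driver is created (and wrapped with TestBench as usual). The method name and exact flag set below are assumptions for illustration, not code from this project:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of CI-aware Chrome arguments for TestBench tests. Pass the result
// to ChromeOptions.addArguments(...) in your test base class.
class ChromeCiOptions {

    static List<String> chromeArguments(boolean ciBuild) {
        List<String> args = new ArrayList<>();
        if (ciBuild) {
            // GitHub-hosted runners have no display, so run headless;
            // --no-sandbox avoids Chrome sandbox issues on Linux runners.
            args.add("--headless");
            args.add("--no-sandbox");
            args.add("--disable-gpu");
        }
        return args;
    }
}
```

Locally the list stays empty, so tests open a visible browser; in CI (e.g., when -DghActions=true is set) the headless flags are applied.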

5. Handling TestBench error screenshots and visual acceptance tests

Vaadin 8 TestBench tests in this project are configured to write error screenshots into:

  • vaadincreate-ui/error-screenshots

The workflow always uploads this directory as an artifact, even if tests fail:

- name: Upload error screenshots
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: error-screenshots
    path: vaadincreate-ui/error-screenshots

Why this is important for Vaadin 8 UI testing:

  • Visual regressions and layout issues are often easiest to diagnose from screenshots.
  • TestBench’s screenshot support integrates well with visual acceptance tests that compare against reference screenshots stored in the repo (see vaadincreate-ui/reference-screenshots and vaadincreate-components/reference-screenshots).
  • When a visual test fails, you get the failing screenshot as a downloadable artifact in the GitHub Actions run, which can be compared with the reference images.

Best practices:

  • Keep a clear folder convention, e.g. error-screenshots/ for failures and reference-screenshots/ committed to the repo.
  • Ensure your TestBench tests store screenshots with descriptive filenames so they are easy to match to specific tests.
  • Consider storing diff images (if your test harness generates them) alongside the error screenshots and upload those as artifacts as well.
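The "descriptive filenames" advice above can be as simple as a small naming helper; this hypothetical example (names are not from this project) produces files like BookGridIT_updatesGrid_20250107-153012.png, so an uploaded artifact can be matched back to the failing test at a glance:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical helper for descriptive error-screenshot filenames of the
// form <TestClass>_<testMethod>_<timestamp>.png.
class ScreenshotNames {

    private static final DateTimeFormatter STAMP =
            DateTimeFormatter.ofPattern("yyyyMMdd-HHmmss");

    static String errorScreenshotName(String testClass, String testMethod) {
        return testClass + "_" + testMethod + "_"
                + LocalDateTime.now().format(STAMP) + ".png";
    }
}
```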

6. Publishing JUnit test results to GitHub

The workflow uses EnricoMi/publish-unit-test-result-action to collect XML test results from all modules and surface them in the GitHub UI:

- name: Publish Test Results
  uses: EnricoMi/publish-unit-test-result-action@v2
  if: always()
  with:
    files: |
      vaadincreate-backend/target/surefire-reports/*.xml
      vaadincreate-components/target/surefire-reports/*.xml
      vaadincreate-components/target/failsafe-reports/TEST-*.xml
      vaadincreate-ui/target/surefire-reports/*.xml
      vaadincreate-ui/target/failsafe-reports/TEST-*.xml

This is especially helpful when mixing:

  • Unit tests for backend and components.
  • Integration/TestBench tests for the Vaadin UI (via Failsafe).

GitHub will show a consolidated test summary per workflow run as a check, with failing tests and their error messages surfaced directly in the run output.

When adapting this setup:

  • Point files: to all Surefire and Failsafe XML locations in your modules.
  • Keep if: always() so results and screenshots are uploaded even on failing runs.

7. Adapting this pattern to your own Vaadin 8 project

To use a similar validation pipeline:

  1. Create a GitHub Actions workflow with steps to:
    • Check out the repository.
    • Set up a JDK compatible with your Vaadin 8 modules.
    • Install Chrome (or another browser) for TestBench.
    • Cache ~/.m2/repository with a key based on pom.xml hashes.
  2. Run a Maven build (mvn -B install -DskipTests) to compile all modules and Vaadin 8 widgetsets/themes.
  3. Run the full test suite (mvn -B verify with appropriate profiles) to execute unit, integration and TestBench tests.
  4. Upload error screenshots from your TestBench output directory as an artifact.
  5. Publish test results from all Surefire/Failsafe report directories using a test result action.
  6. Use environment variables and profiles (like VAADIN_PRO_KEY, -Pit, -DghActions=true) to keep CI‑specific configuration out of your production code.

With these pieces in place, every commit to your Vaadin 8 project is validated with fast feedback on compilation, backend logic, and UI behavior, including rich artifacts (screenshots and test reports) to debug issues quickly.

Are you observing flaky tests in your GHA validation build? Then read this article next: How To Reduce Flaky Tests in Vaadin 8 TestBench
