How to Validate a Vaadin 8 Project with GitHub Actions
This project uses GitHub Actions to validate every commit of a multi‑module Vaadin 8 application. The workflow compiles all modules, runs unit and TestBench UI tests, caches Maven dependencies and publishes artifacts such as error screenshots and JUnit XML results.
This guide explains how the existing .github/workflows/validation.yaml is structured, why those choices matter for Vaadin 8, and how you can apply the same pattern in your own project.
The workflow is defined in .github/workflows/validation.yaml and triggers on every push:

```yaml
name: Java CI
on: [push]
env:
  VAADIN_PRO_KEY: ${{ secrets.PRO_KEY }}
jobs:
  build:
    runs-on: ubuntu-latest
    permissions:
      checks: write
      pull-requests: write
    steps:
      - uses: actions/checkout@v3
      - name: Set up JDK 21
        uses: actions/setup-java@v3
        with:
          java-version: '21'
          distribution: 'adopt'
      - name: Install Chrome
        uses: browser-actions/setup-chrome@latest
        with:
          chrome-version: stable
      - name: Cache Maven repository
        uses: actions/cache@v4
        with:
          path: ~/.m2/repository
          key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
          restore-keys: |
            ${{ runner.os }}-maven-
      - name: Build with Maven
        run: mvn --batch-mode --update-snapshots install -DskipTests
      - name: Run tests
        run: mvn --batch-mode verify -Pit -DghActions=true
      - name: Upload error screenshots
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: error-screenshots
          path: vaadincreate-ui/error-screenshots
      - name: Publish Test Results
        uses: EnricoMi/publish-unit-test-result-action@v2
        if: always()
        with:
          files: |
            vaadincreate-backend/target/surefire-reports/*.xml
            vaadincreate-components/target/surefire-reports/*.xml
            vaadincreate-components/target/failsafe-reports/TEST-*.xml
            vaadincreate-ui/target/surefire-reports/*.xml
            vaadincreate-ui/target/failsafe-reports/TEST-*.xml
```

Key ideas:
- A single job runs on `ubuntu-latest` with JDK 21 (matching the Vaadin 8 modules in this project).
- Chrome is installed so Vaadin TestBench UI tests can run headless in CI.
- Maven dependencies are cached across builds to speed up Vaadin 8 compilation (GWT widgetsets, TestBench, Selenium, etc.).
- Two Maven invocations: a fast `install` without tests, then a full `verify` (including the `it` profile) to run integration and UI tests.
- Artifacts: error screenshots from UI tests and XML test results are uploaded so failures are easy to diagnose in the GitHub UI.
Vaadin 8 projects, especially those using custom components and widgetsets (as in vaadincreate-components and vaadincreate-ui), tend to have heavy Maven builds that download many artifacts and compile GWT code.
The workflow uses actions/cache@v4 to cache the local Maven repository:
```yaml
- name: Cache Maven repository
  uses: actions/cache@v4
  with:
    path: ~/.m2/repository
    key: ${{ runner.os }}-maven-${{ hashFiles('**/pom.xml') }}
    restore-keys: |
      ${{ runner.os }}-maven-
```

Why this matters in Vaadin 8:
- GWT compilation and Vaadin widgetset builds pull in many transitive dependencies; caching prevents re‑downloading them on every run.
- Vaadin TestBench and Selenium drivers are fairly large dependencies; caching keeps CI times manageable.
- The cache key includes a hash of all `pom.xml` files, so dependencies are automatically re‑resolved when you add or update Vaadin/TestBench versions.
When you replicate this pattern in another Vaadin 8 project, keep:
- The cache path pointing at your local Maven repo (`~/.m2/repository`).
- The key including a hash of all POMs so it updates when the dependency tree changes.
The root POM (vaadincreate-root) defines three modules:
- `vaadincreate-backend` – backend services and data model.
- `vaadincreate-components` – reusable Vaadin 8 add‑on components.
- `vaadincreate-ui` – the actual WAR with the Vaadin 8 UI.
The build step runs a full multi‑module install:

```yaml
- name: Build with Maven
  run: mvn --batch-mode --update-snapshots install -DskipTests
```

This ensures:
- All modules compile with the configured Java versions (11 for components, 21 for UI).
- The Vaadin Maven plugin (in each module) can run widgetset and theme compilation as part of the build.
- The `-DskipTests` flag keeps this first pass fast, focusing on compilation and packaging.
In your own project, you can keep this phase as a no‑tests smoke build and reserve the more expensive UI tests for the next step.
The second Maven invocation runs the full test suite, including TestBench UI tests, via the `it` profile:

```yaml
- name: Run tests
  run: mvn --batch-mode verify -Pit -DghActions=true
```

Important details for Vaadin 8 and TestBench:
- The `verify` phase runs:
  - Unit tests via Surefire in each module.
  - Integration/UI tests via Failsafe and TestBench where configured.
- The `it` profile enables integration tests that may rely on Jetty or TestBench; in this project it starts Jetty for the UI and runs browser‑based tests.
- The custom property `-DghActions=true` can be checked in tests or configuration to tweak behavior for CI (e.g., headless mode, a different base URL, stricter timeouts).
The Vaadin 8 UI POM includes TestBench and Selenium dependencies:

```xml
<dependency>
    <groupId>com.vaadin</groupId>
    <artifactId>vaadin-testbench</artifactId>
    <version>${testbench.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.seleniumhq.selenium</groupId>
    <artifactId>selenium-java</artifactId>
    <version>${selenium.version}</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>io.github.bonigarcia</groupId>
    <artifactId>webdrivermanager</artifactId>
    <version>${webdrivermanager.version}</version>
    <scope>test</scope>
</dependency>
```

Combined with the Install Chrome step in the workflow, this allows TestBench tests to run on GitHub‑hosted runners without bundling your own browser/driver binaries.

When adapting this pattern:
- Make sure your TestBench tests can be configured for headless Chrome (for example via WebDriverManager and ChromeOptions).
- Use profiles or system properties (like `-Pit -DghActions=true`) to separate local and CI behavior when needed.
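As a sketch of that CI-flag idea, a small helper can derive Chrome launch arguments from the `ghActions` system property. The `CiChromeOptions` class and the exact flag names below are illustrative assumptions, not code from this project:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: derives Chrome launch arguments from the -DghActions
// system property, so the same TestBench tests run headless in CI and with a
// visible browser locally. Flag names are illustrative, not from this project.
public class CiChromeOptions {

    public static List<String> chromeArguments() {
        List<String> args = new ArrayList<>();
        // Boolean.getBoolean reads the system property set via -DghActions=true
        if (Boolean.getBoolean("ghActions")) {
            args.add("--headless=new");
            args.add("--disable-gpu");
            args.add("--window-size=1280,800");
        }
        return args;
    }

    public static void main(String[] unused) {
        System.setProperty("ghActions", "true");
        System.out.println(chromeArguments());
    }
}
```

In a real TestBench test you would pass these arguments to `ChromeOptions.addArguments(...)` before creating the driver, keeping the CI/local switch in one place.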
Vaadin 8 TestBench tests in this project are configured to write error screenshots into:
vaadincreate-ui/error-screenshots
The workflow always uploads this directory as an artifact, even if tests fail:
```yaml
- name: Upload error screenshots
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: error-screenshots
    path: vaadincreate-ui/error-screenshots
```

Why this is important for Vaadin 8 UI testing:
- Visual regressions and layout issues are often easiest to diagnose from screenshots.
- TestBench's screenshot support integrates well with visual acceptance tests that compare against reference screenshots stored in the repo (see `vaadincreate-ui/reference-screenshots` and `vaadincreate-components/reference-screenshots`).
- When a visual test fails, you get the failing screenshot as a downloadable artifact in the GitHub Actions run, which can be compared with the reference images.

Best practices:
- Keep a clear folder convention, e.g. `error-screenshots/` for failures and `reference-screenshots/` committed to the repo.
- Ensure your TestBench tests store screenshots with descriptive filenames so they are easy to match to specific tests.
- Consider storing diff images (if your test harness generates them) alongside the error screenshots and uploading those as artifacts as well.
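One way to get descriptive filenames is to build them from the test class, test method, and a timestamp. This is a sketch; the helper and naming scheme below are assumptions, not this project's actual convention:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Hypothetical naming helper: composes an error-screenshot filename from the
// test class, test method, and a timestamp, so uploaded artifacts are easy to
// match to specific failing tests. The scheme is illustrative only.
public class ScreenshotNames {

    private static final DateTimeFormatter STAMP =
            DateTimeFormatter.ofPattern("yyyyMMdd-HHmmss");

    public static String errorScreenshotName(String testClass, String testMethod,
                                             LocalDateTime time) {
        return testClass + "_" + testMethod + "_" + STAMP.format(time) + ".png";
    }

    public static void main(String[] args) {
        // Example with a made-up test name
        System.out.println(errorScreenshotName(
                "BookListViewIT", "updatesGridOnSave", LocalDateTime.now()));
    }
}
```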
The workflow uses EnricoMi/publish-unit-test-result-action to collect XML test results from all modules and surface them in the GitHub UI:
```yaml
- name: Publish Test Results
  uses: EnricoMi/publish-unit-test-result-action@v2
  if: always()
  with:
    files: |
      vaadincreate-backend/target/surefire-reports/*.xml
      vaadincreate-components/target/surefire-reports/*.xml
      vaadincreate-components/target/failsafe-reports/TEST-*.xml
      vaadincreate-ui/target/surefire-reports/*.xml
      vaadincreate-ui/target/failsafe-reports/TEST-*.xml
```

This is especially helpful when mixing:
- Unit tests for backend and components.
- Integration/TestBench tests for the Vaadin UI (via Failsafe).

GitHub will show a consolidated test summary per workflow run and link failing tests directly to their XML entries.

When adapting this setup:
- Point `files:` to all Surefire and Failsafe XML locations in your modules.
- Keep `if: always()` so results and screenshots are uploaded even on failing runs.
To use a similar validation pipeline:

- Create a GitHub Actions workflow with steps to:
  - Check out the repository.
  - Set up a JDK compatible with your Vaadin 8 modules.
  - Install Chrome (or another browser) for TestBench.
  - Cache `~/.m2/repository` with a key based on `pom.xml` hashes.
- Run a Maven build (`mvn -B install -DskipTests`) to compile all modules and Vaadin 8 widgetsets/themes.
- Run the full test suite (`mvn -B verify` with appropriate profiles) to execute unit, integration, and TestBench tests.
- Upload error screenshots from your TestBench output directory as an artifact.
- Publish test results from all Surefire/Failsafe report directories using a test result action.
- Use environment variables and profiles (like `VAADIN_PRO_KEY`, `-Pit`, `-DghActions=true`) to keep CI‑specific configuration out of your production code.
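As a small illustration of that last point, CI-only secrets such as the Vaadin pro key can be read from the environment with a safe fallback, so local builds without the secret still work. The `CiEnvironment` helper below is a hypothetical sketch, not code from this project:

```java
import java.util.Optional;

// Hypothetical helper: reads the VAADIN_PRO_KEY environment variable (set from
// the PRO_KEY repository secret in the workflow) as an Optional, so a missing
// key degrades gracefully on local machines instead of failing the build.
public class CiEnvironment {

    public static Optional<String> proKey() {
        return Optional.ofNullable(System.getenv("VAADIN_PRO_KEY"))
                .filter(key -> !key.isBlank());
    }

    public static void main(String[] args) {
        System.out.println(proKey().isPresent()
                ? "Vaadin pro key available (CI run)"
                : "No pro key set (local run)");
    }
}
```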
With these pieces in place, every commit to your Vaadin 8 project is validated with fast feedback on compilation, backend logic, and UI behavior, including rich artifacts (screenshots and test reports) to debug issues quickly.
Are you seeing flaky tests in your GHA validation build? Then read this article next: How To Reduce Flaky Tests in Vaadin 8 TestBench