This tool lets you rapidly create prototype forms from a simple description using Generative AI.
The generated prototypes use GOV.UK Design System components and best practices.
You can test prototypes live, share them with others, and download them to run them locally. You can also edit prototypes to make iterative improvements.
This is an exploratory proof-of-concept developed by the AI & Innovation Lab within Capgemini's Digital Excellence team to showcase the potential for AI to transform user research.
See the user guides for more information about how the application works. These are also available when the application is running.
If you would like to contribute, please see the instructions in docs/CONTRIBUTING.md. Notable changes will be documented in docs/CHANGELOG.md.
If you want to contact the maintainers directly, please complete this form.
- The user describes the form they want in plain English.
- A GenAI LLM takes this and a JSON schema, and uses them to generate a JSON representation of the form's structure that adheres to the schema (see example).
- The JSON structure of the form is then used to generate Nunjucks template files of the prototype.
- The generated template files are then rendered live for the user to try out, or can be downloaded in a ZIP file to run locally or incorporate into an existing project.
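To make the pipeline concrete, here is a rough TypeScript sketch of the generation step. It is illustrative only, not the project's actual code: the real JSON schema lives in `data/`, the prompts are more detailed, and the function and template names below are invented.

```ts
import OpenAI from 'openai';
import nunjucks from 'nunjucks';

// Hypothetical, heavily simplified JSON schema; the real schema in data/ is far richer.
const formSchema = {
  type: 'object',
  properties: {
    title: { type: 'string' },
    questions: {
      type: 'array',
      items: {
        type: 'object',
        properties: {
          label: { type: 'string' },
          fieldType: { type: 'string' },
        },
        required: ['label', 'fieldType'],
        additionalProperties: false,
      },
    },
  },
  required: ['title', 'questions'],
  additionalProperties: false,
};

const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY,
  baseURL: process.env.OPENAI_BASE_URL,
});

// Ask the model for a form structure that must conform to the JSON schema.
async function generateFormStructure(description: string) {
  const response = await client.chat.completions.create({
    model: process.env.OPENAI_MODEL_ID ?? 'gpt-4.1-mini',
    messages: [
      { role: 'system', content: 'Generate a GOV.UK-style form structure as JSON.' },
      { role: 'user', content: description },
    ],
    response_format: {
      type: 'json_schema',
      json_schema: { name: 'form_structure', schema: formSchema, strict: true },
    },
  });
  return JSON.parse(response.choices[0].message.content ?? '{}');
}

// The JSON structure is then passed into Nunjucks templates to produce the prototype pages.
async function renderPrototype(description: string): Promise<string> {
  const form = await generateFormStructure(description);
  return nunjucks.render('prototype-page.njk', { form }); // hypothetical template name
}
```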
(Demo video: GPBP.2601.Demo.for.GitHub.webm)
The project uses Express.js v5 with Node.js v20. It's written in TypeScript. Tests and LLM evaluations use the Jest testing framework.
It connects to an OpenAI LLM; we have been using GPT-4.1-mini running in Azure. Alternatively, you can use a local LLM when running the application with Docker Compose. Note that testing so far has shown that local LLMs struggle to generate a valid JSON representation of the form structure.
It uses MongoDB to store data about users, prototypes, and workspaces in a NoSQL database.
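For illustration, a stored prototype might be modelled with Mongoose along the following lines. The real schemas live in `src/` and will differ; every field name here is hypothetical.

```ts
import mongoose, { Schema } from 'mongoose';

// Hypothetical shape of a stored prototype; field names are illustrative only.
const prototypeSchema = new Schema({
  workspaceId: { type: Schema.Types.ObjectId, ref: 'Workspace', required: true },
  createdBy: { type: Schema.Types.ObjectId, ref: 'User', required: true },
  description: String,               // the plain-English prompt from the user
  formStructure: Schema.Types.Mixed, // the generated JSON form structure
  createdAt: { type: Date, default: Date.now },
});

export const Prototype = mongoose.model('Prototype', prototypeSchema);
```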
To run the application you can either:
- Install MongoDB and Node.js manually, deploy an OpenAI LLM, and run the application locally.
- Use Docker Compose to run the application with MongoDB and optionally a local LLM.
In both cases you'll need to set up environment variables as described below.
You can access the running application at http://localhost:3001.
You can access the MongoDB database through MongoDB Compass or another MongoDB client using the `mongodb://admin:password123@localhost:27017/?directConnection=true` connection string.
- Install Docker Desktop, which is required to create a local MongoDB Atlas deployment.
- Install the MongoDB Atlas CLI.
- Run `atlas auth login` to authenticate with Atlas.
- Run `atlas deployments setup gpbp --type LOCAL --mdbVersion 8.0 --port 27017 --username admin --password password123` to create a local MongoDB v8 deployment.
- You should be able to see the MongoDB deployment container running in Docker Desktop.
- Install Node version manager (nvm).
- Install the latest version of Node.js v20 with `nvm install 20` and switch to it with `nvm use 20`.
- Check that the right version of Node.js is active with `node --version`.
- Copy the example environment file with `cp .env.example .env` and fill out your environment variables; see below for details. Use `mongodb://admin:password123@127.0.0.1:27017/gpbp?directConnection=true&authSource=admin` for the `MONGODB_URI` variable.
- Run `npm install --ignore-scripts` to install the dependencies safely.
- Run the application with `npm run start` and visit http://localhost:3001.
To run with the local LLM model:
- Install Docker Compose, likely as part of Docker Desktop.
- Enable Docker Model Runner as described in the Docker docs.
- Consider allocating more resources to Docker Desktop in the Docker Desktop settings.
- Consider pulling the model image beforehand with `docker model pull ai/gpt-oss`.
To run without the local LLM model, remove the `models` section under the `app` service and the top-level `models` section, both in `compose.yaml`.
To run the application with Docker Compose:
- Copy the example environment file with `cp .env.example .env` and fill out your environment variables; see below for details. Use `mongodb://admin:password123@mongodb/gpbp?directConnection=true&authSource=admin` for the `MONGODB_URI` variable.
- Run `docker compose up --build` to build and start the application and MongoDB containers.
- Visit http://localhost:3001.
The following environment variables are expected in .env, copied from .env.example:
- `APPLICATIONINSIGHTS_CONNECTION_STRING` - the connection string for Azure Application Insights; can be left empty to disable.
- `EMAIL_ADDRESS_ALLOWED_DOMAIN` - the domain to allow for email addresses, e.g. `example.com`. If set, only email addresses with this domain will be allowed.
- `EMAIL_ADDRESS_ALLOWED_DOMAIN_REVEAL` - either `true` or `false`. If set to `true`, the allowed domain will be revealed to users when they sign up. If set to `false`, the allowed domain will not be revealed. This has no effect if `EMAIL_ADDRESS_ALLOWED_DOMAIN` is not set.
- `LOG_USER_ID_IN_AZURE_APP_INSIGHTS` - either `true` or `false`. Whether to log the user ID in Azure Application Insights.
- `MONGODB_URI` - the connection string for MongoDB.
- `MONGO_INITDB_ROOT_USERNAME` - the root username for MongoDB, for Docker Compose setups.
- `MONGO_INITDB_ROOT_PASSWORD` - the root password for MongoDB, for Docker Compose setups.
- `NODE_ENV` - either `development` or `production`; default `production`. When in production, the OpenTelemetry instrumentation with Azure App Insights is enabled, the HSTS header is enabled, and the rate limiting headers are disabled.
- `OPENAI_API_KEY` - the API key to access the OpenAI API. This will be ignored when using the local LLM model in Docker Compose.
- `OPENAI_BASE_URL` - the base URL for the OpenAI API. This will be ignored when using the local LLM model in Docker Compose.
- `OPENAI_MODEL_ID` - the OpenAI model ID to query. This will be ignored when using the local LLM model in Docker Compose.
- `RATE_LIMITER_ENABLED` - either `true` or `false`. Whether to enable rate limiting.
- `RATE_LIMITER_MAX_REQUESTS` - the maximum number of requests allowed per user in the rate limit window.
- `RATE_LIMITER_WINDOW_MINUTES` - the time window in minutes for the rate limit.
- `SESSION_SECRET` - the secret used to sign session cookies.
- `SUGGESTIONS_ENABLED` - either `true` or `false`. Whether to suggest follow-up prompts to the user to modify their prototype.
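As a sketch of how these variables might be consumed (the project's actual configuration code may parse and validate them differently, and the defaults shown here are illustrative, not the application's real defaults):

```ts
// Minimal sketch: read and normalise some of the variables above from process.env.
const env = process.env;

const config = {
  nodeEnv: env.NODE_ENV ?? 'production',
  mongodbUri: env.MONGODB_URI ?? '',
  openAi: {
    apiKey: env.OPENAI_API_KEY ?? '',
    baseUrl: env.OPENAI_BASE_URL ?? '',
    modelId: env.OPENAI_MODEL_ID ?? '',
  },
  rateLimiter: {
    enabled: env.RATE_LIMITER_ENABLED === 'true',
    maxRequests: Number(env.RATE_LIMITER_MAX_REQUESTS ?? '100'),   // illustrative default
    windowMinutes: Number(env.RATE_LIMITER_WINDOW_MINUTES ?? '15'), // illustrative default
  },
  suggestionsEnabled: env.SUGGESTIONS_ENABLED === 'true',
  sessionSecret: env.SESSION_SECRET ?? '',
};

if (!config.sessionSecret) {
  throw new Error('SESSION_SECRET must be set');
}

export default config;
```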
The application uses an OpenAI LLM. The model's configuration, including the API key, endpoint, and model name, must be provided in the .env file.
Visit the Azure OpenAI documentation for more information on how to set up an OpenAI model in Azure. Alternatively, a model hosted elsewhere that is compatible with the OpenAI API can be used.
The project is structured as follows:
- `.cspell/` – Custom dictionary for spelling checks.
- `.github/` – GitHub configuration files (Actions workflows, templates, Dependabot configuration).
- `.vscode/` – Visual Studio Code workspace extensions and settings.
- `data/` – Example data, schemas, and project files for the ZIP download of the prototype.
- `docs/` – Project documentation and a user help guide.
- `evals/` – Automated tests for evaluating the LLM's output against expected form structures.
- `jest/` – Jest configuration and setup files for testing.
- `migrations/` – Mongoose database migration scripts.
- `public/` – Static assets not provided by third parties.
- `src/` – Application source code (Express routes, utilities, business logic, data models).
  - Tests are in a `__tests__` folder within each source folder.
- `views/` – Nunjucks templates for all pages and components.
The entry point for the application is server.ts, which sets up the Express server, middleware, and routes.
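For orientation, a heavily simplified sketch of that setup is shown below. It is not a copy of server.ts; the real middleware stack and route modules are more involved, and the route shown is illustrative.

```ts
import express from 'express';
import nunjucks from 'nunjucks';

const app = express();

// Render Nunjucks templates from the views/ folder.
nunjucks.configure('views', { express: app, autoescape: true });

app.use(express.urlencoded({ extended: true }));
app.use(express.static('public'));

// Illustrative route registration; the real routes live in src/.
app.get('/', (_req, res) => res.render('index.njk'));

app.listen(3001, () => console.log('Listening on http://localhost:3001'));
```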
Database migrations are managed using ts-migrate-mongoose. Migration scripts are located in the migrations/ folder.
To allow migrations to connect to the MongoDB database within Docker Compose, update the `MONGODB_URI` in your .env file to `mongodb://admin:password123@127.0.0.1:27017/gpbp?directConnection=true&authSource=admin`, then run `docker compose up mongodb` to start the MongoDB container only.
Run `npx migrate up` to run all pending migrations.
Migrations that have already been applied are tracked in the migrations collection in the MongoDB database.
Migrations cannot be rolled back; please back up your database before running migrations if necessary.
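For reference, ts-migrate-mongoose migration scripts export `up` and `down` functions. The sketch below is hypothetical: the collection and field names are invented, and since rollbacks are not supported here, `down` is left as a no-op.

```ts
import mongoose from 'mongoose';

// Hypothetical migration: ensure an illustrative 'archived' flag exists on every prototype.
export async function up(): Promise<void> {
  await mongoose.connection
    .collection('prototypes')
    .updateMany({ archived: { $exists: false } }, { $set: { archived: false } });
}

// Rollbacks are not supported in this project, so down is intentionally empty.
export async function down(): Promise<void> {
  // no-op
}
```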
A Dockerfile is provided to build the application into a Docker image. You may need to enable host networking in Docker Desktop settings to run this directly, so it can connect to the MongoDB container.
You can build and run the image with the following commands:
```bash
# Build the Docker image
docker build -t gov-prototype-by-prompt .

# Run the Docker container
docker run --network host --env-file .env gov-prototype-by-prompt
```

We use Dependabot to automate dependency updates. It creates pull requests to update dependencies in package.json and package-lock.json files, as well as in GitHub Actions.
The configuration for Dependabot is in the .github/dependabot.yml file.
This project is not affiliated with the UK Government.
At the time of writing, if your site or service is not part of GOV.UK then it must not [source]:
- identify itself as being part of GOV.UK
- use the crown or GOV.UK logotype in the header
- use the GDS Transport typeface
- suggest that it’s an official UK government website if it's not
No assessment has been made of this application's compliance with UK GDPR or other data protection or privacy laws. No guarantees are made regarding the security of the application.
The list of common passwords has been sourced from Have I Been Pwned and is used to prevent users from using common passwords.
This project is licensed under the terms of the MIT License.