
Commit e16f26e

Update script and README
1 parent 3fb380a commit e16f26e

2 files changed (+103 −59 lines)


functions-int-tests/README.md

Lines changed: 101 additions & 58 deletions
@@ -1,94 +1,137 @@
 # Testing functions bindings using E2E tests
 
-This readme explains the process of testing function bindings using E2E tests. Azure functions bindings are by default
-declarative and makes usage of this easy for developing applications. However, it is important to test the bindings to
+This readme explains the process of testing function bindings using E2E tests. Azure Functions bindings are declarative
+by default, which makes them easy to use when developing applications. However, it is important to test the bindings to
 validate the application behavior.
 
 This utilizes [TestContainers](https://www.testcontainers.org/) to run the bindings in a containerized environment.
 
-## Setting up the tests
+## Prerequisites
 
-To run the tests, you need to have Docker installed on your machine. Azure function base images to test the bindings are
-published on [DockerHub](https://hub.docker.com/_/microsoft-azure-functions) and on [Github](https://github.com/Azure/azure-functions-docker).
-For ease of running the tests the best option would be to use the base tools image corresponding to the platform. For example to
-run the tests for .NET 6.0, the image `4-dotnet6-core-tools` can be used.
-The test examples in this repo use Java 11 for tests. This can be easily changed to any other language by using TestContainers for
-the language of choice.
+- **Docker** installed and running
+- **Java 11+** (for running the test harness)
+- **Maven 3.8+**
+- **.NET SDK** (for building the extension)
+- **Azure CLI** (`az`) for obtaining access tokens
+- A Kusto (Azure Data Explorer) cluster and database, set up using the KQL script at [`samples/set-up/KQL-Setup.kql`](../samples/set-up/KQL-Setup.kql)
 
-The tests have the following high level steps, this assumes that the function app is already created and there are HTTP triggers configured:
+## Scripts
 
-* Create a compose file using the base image to test with. If additional components are needed they can be orchestrated using a docker-compose file.
-* In the following example, we use the java image and add RabbitMQ (for trigger tests) and Azurite (for saving function state) to the container.
+All automation scripts are in the [`scripts/`](../scripts/) directory:
+
+| Script | Description |
+|---|---|
+| `build-docker-image.sh` | Builds the Docker image with the Kusto extension (Linux) |
+| `BuildE2ETestImage.ps1` | Builds the Docker image with the Kusto extension (PowerShell) |
+| `run-e2e-tests.sh` / `.ps1` | Sets up the environment for performance tests |
+| `run-functional-tests-e2e.sh` / `.ps1` | Full pipeline: build image → generate settings → run tests |
+
+## How the tests work
+
+The tests use a custom Docker image built on top of the [Azure Functions base image](https://hub.docker.com/_/microsoft-azure-functions)
+(`node:4-node20-core-tools`). The image includes Maven, Java, Python and all runtimes needed to test across languages.
+
+### High-level flow
+
+1. **Build the Docker image** — compiles the Kusto extension, generates a `DockerFile` from
+   `samples/docker/Docker-template.dockerfile`, and builds the image with `docker build --no-cache`.
+2. **Generate `local.settings.json`** for each language sample with the Kusto connection string and correct
+   `FUNCTIONS_WORKER_RUNTIME` value.
+3. **Start containers** via docker-compose (function host + Azurite + optional RabbitMQ).
+4. **Copy sample function apps** into the container and run them with `func start`.
+5. **Execute tests** against the function HTTP endpoints from the host via forwarded ports.
+6. **Assert results** by querying Kusto to validate data written/read by the bindings.
+
+### Docker compose
+
+The compose file orchestrates the function host alongside supporting services:
 
 ```yaml
-version: '3'
 services:
-  baseimage:
-    image: mcr.microsoft.com/azure-functions/java:4-dotnet6-core-tools
-    hostname: func-az-kusto-base
-    ports:
-      - "7101:7101"
-  rabbitmq:
-    image: rabbitmq:3.11.9-management
-    hostname: rabbitmq
-    ports:
-      - "7000:15672"
-      - "7001:5672"
-  azurite:
-    image: mcr.microsoft.com/azure-storage/azurite
-    hostname: azurite
-    ports:
+  baseimage:
+    image: func-az-kusto-base:latest
+    hostname: func-az-kusto-base
+    ports:
+      - "7101:7101"
+  azurite:
+    image: mcr.microsoft.com/azure-storage/azurite
+    hostname: azurite
+    ports:
       - "10000:10000"
       - "10001:10001"
       - "10002:10002"
 ```
-* The compose environment is instantiated referencing the compose file and then started.
-```
-DockerComposeContainer<?> environment = new DockerComposeContainer<>(new File(path));
-environment.start();
-```
 
-* Copy the function app to the container or use a volume mount for mounting the function app
-```
-containerState.copyFileToContainer(MountableFile.forHostPath(pathToFunctionApp), String.format("/src/samples-%s/", functionAppName));
-```
+### Container initialisation
 
-* Run the function app in the container. Use the function core tools to run the app
-```
-containerState.execInContainer("func", "start", "--port", "7101", "--java", "--verbose");
-```
+Inside the container, two scripts handle setup:
 
-* Since the ports are forwarded from localhost, the tests can be run against the function app using the localhost url.
-```
-String url = String.format("http://localhost:%d/api/%s", port, functionName);
+- **`init-functions.sh`** — copies the Kusto extension DLL into the extension bundle and registers it in `extensions.json`.
+- **`start-functions.sh`** — starts the function app for a given language (`-l node -p 7101`). Maps `node` to `--javascript` for the core tools.
+
+## Running the E2E tests
+
+### Option 1: Full automated pipeline
+
+```bash
+# From the repository root
+scripts/run-functional-tests-e2e.sh <CLUSTER> <DATABASE>
 ```
 
-* The values inserted or retrieved can then be asserted against the expected values (by selecting data from Kusto and validating the results)
+This builds the image, generates `local.settings.json` for every language sample, and runs `mvn clean gatling:test`.
 
-## Running the tests
+The `CLUSTER` and `DATABASE` parameters are required and identify the Kusto cluster and database to test against.
+An access token is obtained automatically via `az account get-access-token`, or you can set the `ACCESS_TOKEN`
+environment variable beforehand.
 
-In this example, the tests are set up for Java and can be run through the Maven lifecycle. The tests can be run using the following command
+### Option 2: Step by step
 
-```
+```bash
+# 1. Build the Docker image
+scripts/build-docker-image.sh
+
+# 2. Run the Java test harness directly
+cd functions-int-tests
 mvn clean test
 ```
 
-## Running the tests for performance tests
+### Option 3: PowerShell
 
-This folder contains the performance tests for the bindings. The tests are run using [Gatling](https://gatling.io/).
-The tests are run using the following command. The setup is exactly as described for the E2E tests. The only difference is that the test
-runs use the Gatling framework for applying load to the function app and validating the results.
+```powershell
+# Full pipeline
+scripts/run-functional-tests-e2e.ps1 -Cluster <CLUSTER> -Database <DATABASE>
+
+# Build image only
+. scripts/BuildE2ETestImage.ps1
+BuildE2ETestImage -Acr <acr> -DockerPush $true
+```
+
+## Performance / stress tests
+
+The performance tests use [Gatling](https://gatling.io/) with the same containerized setup. They apply load to the
+function app and validate results under stress.
 
 ```bash
-mvn clean formatter:format gatling:test "-Dport=7105" "-Dlanguage=csharp" "-DrunDescription=.NETFunctions-StressTests" "-DrunTrigger=false"
+cd functions-int-tests
+mvn clean formatter:format gatling:test \
+  "-Dport=7105" \
+  "-Dlanguage=csharp" \
+  "-DrunDescription=.NETFunctions-StressTests" \
+  "-DrunTrigger=false"
 ```
 
 ## Building a custom image
 
-This folder contains steps to build a custom Docker image that can be catered to run against all language bindings. If a
-custom image is needed, the following steps can be followed. This assumes that you already have a container registry where the image can be pushed.
+To build and optionally push to a container registry:
 
 ```bash
-. .\BuildE2ETestImage.ps1
-BuildE2ETestImage -Acr <acr/container-registry> -DockerPush $true
-```
+# Linux
+scripts/build-docker-image.sh --acr myacr.azurecr.io --push
+
+# PowerShell
+. scripts/BuildE2ETestImage.ps1
+BuildE2ETestImage -Acr myacr.azurecr.io -DockerPush $true
+```
+
+The build script generates a `DockerFile` from `samples/docker/Docker-template.dockerfile`, builds with `--no-cache`,
+and tags the image as both `func-az-kusto-base:<date>` and `func-az-kusto-base:latest`.
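The settings-generation step (step 2 of the README's flow above) can be sketched in a few lines of Python. This is an illustrative sketch only: the `make_local_settings` helper, the `KustoConnectionString` setting name, and the connection-string format are assumptions, not taken verbatim from the repository's scripts.

```python
import json

# Map a sample's language to the Azure Functions worker runtime name.
WORKER_RUNTIMES = {"csharp": "dotnet", "java": "java", "node": "node", "python": "python"}

def make_local_settings(language: str, cluster: str, database: str, token: str) -> str:
    """Render a local.settings.json payload for one language sample (sketch)."""
    settings = {
        "IsEncrypted": False,
        "Values": {
            # Azurite provides local storage for function state.
            "AzureWebJobsStorage": "UseDevelopmentStorage=true",
            "FUNCTIONS_WORKER_RUNTIME": WORKER_RUNTIMES[language],
            # Token-based Kusto connection string; exact format is an assumption.
            "KustoConnectionString": f"Data Source={cluster};Database={database};Fed=True;UserToken={token}",
        },
    }
    return json.dumps(settings, indent=2)

print(make_local_settings("node", "https://mycluster.kusto.windows.net", "e2e", "<token>"))
```

In the real pipeline an equivalent file is written into each language sample before the containers start.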

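Step 5 of the flow runs tests from the host against forwarded ports. A minimal Python sketch of how a test might construct and call a function's localhost endpoint; `function_url`, `call_function`, and the sample function name are illustrative names, not from the repository.

```python
import urllib.request
from urllib.parse import urlunsplit

def function_url(port: int, function_name: str) -> str:
    # Ports are forwarded from the container, so tests target localhost.
    return urlunsplit(("http", f"localhost:{port}", f"/api/{function_name}", "", ""))

def call_function(port: int, function_name: str, timeout: float = 30.0) -> int:
    # Returns the HTTP status code; requires the function host to be running.
    with urllib.request.urlopen(function_url(port, function_name), timeout=timeout) as resp:
        return resp.status

print(function_url(7101, "GetProducts"))  # http://localhost:7101/api/GetProducts
```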
samples/set-up/KQL-Setup.kql

Lines changed: 2 additions & 1 deletion
@@ -13,7 +13,8 @@
 
 .clear table Products data
 
-.create table Products ingestion json mapping 'item_to_product_json' '[{"Column": "ProductID", "Properties": {"Path": "$.ItemID"}},{"Column": "Name", "Properties": {"Path": "$.ItemName"}},{"Column": "Cost", "Properties": {"Path": "$.ItemCost"}}]'.create table Products ingestion json mapping 'item_to_product_json' '[{"Column": "ProductID", "Properties": {"Path": "$.ItemID"}},{"Column": "Name", "Properties": {"Path": "$.ItemName"}},{"Column": "Cost", "Properties": {"Path": "$.ItemCost"}}]'
+.create table Products ingestion json mapping 'item_to_product_json' '[{"Column": "ProductID", "Properties": {"Path": "$.ItemID"}},{"Column": "Name", "Properties": {"Path": "$.ItemName"}},{"Column": "Cost", "Properties": {"Path": "$.ItemCost"}}]'
+
 
 .show streamingingestion statistics | order by StartTime desc | take 10
 
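The KQL change above de-duplicates the JSON ingestion mapping for the `Products` table (the old line contained the `.create` command pasted twice). As a hedged illustration, the same mapping payload can be reconstructed programmatically; `ingestion_mapping` is a hypothetical helper, not part of the repository.

```python
import json

def ingestion_mapping(columns_to_paths: dict) -> str:
    """Render a Kusto JSON ingestion mapping from column -> JSON-path pairs (sketch)."""
    return json.dumps(
        [{"Column": col, "Properties": {"Path": path}} for col, path in columns_to_paths.items()],
        separators=(",", ": "),
    )

mapping = ingestion_mapping({
    "ProductID": "$.ItemID",
    "Name": "$.ItemName",
    "Cost": "$.ItemCost",
})
# Embed the mapping in the management command, as in KQL-Setup.kql.
command = f".create table Products ingestion json mapping 'item_to_product_json' '{mapping}'"
print(command)
```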