This article describes using service containers in Azure Pipelines. If your pipeline requires the support of one or more services, you might need to create, connect to, and clean up the services per [job](phases.md). For example, your pipeline might run integration tests that require access to a newly created database and memory cache for each job in the pipeline.

A *service container* provides a simple and portable way to run a service. A service container is accessible only to the job that requires it.

Service containers let you automatically create, network, and manage the lifecycles of services that your pipelines depend on. Service containers work with any kind of job, but are most commonly used with [container jobs](container-phases.md).

>[!NOTE]
>Classic pipelines don't support service containers.
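
For orientation, the following minimal sketch shows the pattern the rest of this article builds on: a container resource declared under `resources.containers` and attached to a job as a service. The `my_cache` name, the `redis` image, and the script step are illustrative assumptions rather than part of this article's own examples.

```yaml
resources:
  containers:
  - container: my_cache    # hypothetical resource name
    image: redis           # hypothetical image; it provides its own ENTRYPOINT/CMD

pool:
  vmImage: 'ubuntu-latest'

services:
  my_cache: my_cache       # the service container starts before the job and is removed afterward

steps:
- script: echo "The my_cache service container runs for the duration of this job."
```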
## Conditions and limitations
- Service containers must define a `CMD` or `ENTRYPOINT`. The pipeline runs `docker run` with no arguments for the provided container.

## Ports

Jobs that run directly on the host require `ports` to access the service container. Specifying `ports` isn't required if your job is running in a container, because containers on the same Docker network automatically expose all ports to each other by default.

A port takes the form `<hostPort>:<containerPort>` or just `<containerPort>` with an optional `/<protocol>` at the end. For example, `6379/tcp` exposes `tcp` over port `6379`, bound to a random port on the host machine.
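
The following sketch shows both forms on a container resource mapped into a host job. The `my_cache` name, `redis` image, and port numbers are assumptions for illustration.

```yaml
resources:
  containers:
  - container: my_cache    # hypothetical resource name
    image: redis           # hypothetical image for illustration
    ports:
    - 6379/tcp             # <containerPort>/<protocol>: bound to a random port on the host
    - 6380:6379            # <hostPort>:<containerPort>: container port 6379 bound to host port 6380

pool:
  vmImage: 'ubuntu-latest'

services:
  my_cache: my_cache       # a host job needs the ports above to reach the service

steps:
- script: echo "The service's container port 6379 is reachable on host port 6380."
```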
## Volumes
>[!NOTE]
>Microsoft-hosted pools don't persist volumes between jobs, because the host machine is cleaned up after each job.
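
To make the volume mapping concrete, here's a hedged sketch of a `volumes` entry on a container resource, using the `<source>:<destinationPath>` form. The `db` name, `postgres` image, password value, and paths are illustrative assumptions (they mirror the example in the next section).

```yaml
resources:
  containers:
  - container: db                           # hypothetical resource name
    image: postgres                         # hypothetical image for illustration
    env:
      POSTGRES_PASSWORD: example            # hypothetical value; the postgres image needs a password to start
    volumes:
    - '/data/db:/var/lib/postgresql/data'   # <source>:<destinationPath>; the source is a host path or named volume

pool:
  vmImage: 'ubuntu-latest'

services:
  db: db

steps:
- script: echo "Data the db service writes is stored under /data/db on the agent host."
```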
## Multiple containers with services example

The following example pipeline has a Django Python web container connected to PostgreSQL and MySQL database containers.

- The PostgreSQL database is the primary database, and its container is named `db`.
- The `db` container uses volume `/data/db:/var/lib/postgresql/data`, and passes three database variables to the container via `env`.