This repository was archived by the owner on May 20, 2025. It is now read-only.

Commit 3f4e1c5

Authored by davemooreuws, HomelessDinosaur, and jyecusch

add docs for monorepos (#588)

Co-authored-by: Ryan Cartwright <[email protected]>
Co-authored-by: Jye Cusch <[email protected]>

1 parent 21f8c30 commit 3f4e1c5

File tree

3 files changed: +75 −0 lines changed

dictionary.txt

Lines changed: 1 addition & 0 deletions

```diff
@@ -196,6 +196,7 @@ todo
 todos
 transpiling
 ARN
+monorepos

 ^.+[-:_]\w+$
 [a-z]+([A-Z0-9]|[A-Z0-9]\w+)
```

src/pages/faq.mdx

Lines changed: 4 additions & 0 deletions

````diff
@@ -33,6 +33,10 @@ Add the environment variable to your `~/.zshrc` or `~/.bashrc` as:
 PULUMI_ACCESS_TOKEN=access_token
 ```

+## Does Nitric support monorepos?
+
+Yes, Nitric supports monorepos through the custom runtime feature, which allows you to change the build context of your Docker build. For more information, see [custom containers](/reference/custom-containers). Alternatively, you can move your `nitric.yaml` to the root of your repository.
+
 ## Will I be locked-in to Nitric?

 Nitric is designed with flexibility to avoid lock-in, including to Nitric. If the framework no longer serves you, you'll simply need to choose a new IaC and migrate your provisioning code. The Nitric framework and CLI are written in Go, and use the Pulumi Go Providers, so you may be able to avoid rewriting all of the provisioning code by lifting the provisioning code which Nitric has already built for you. If relevant, you'll also need to rebuild your CI pipelines to leverage the new IaC tooling you've chosen. Nitric doesn't have access to your data, so no data migration is needed.
````
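The FAQ answer added above mentions moving `nitric.yaml` to the repository root as an alternative to a custom build context. A minimal sketch of that layout, assuming a hypothetical `packages/guestbook/services` path for the service code (the glob style follows the `match` key used elsewhere in this commit):

```yaml
# Hypothetical: nitric.yaml placed at the monorepo root, so the default
# build context (the project root) already contains every package.
# The path `packages/guestbook/services` is an assumed example, not from the commit.
name: guestbook-app
services:
  - match: packages/guestbook/services/*.ts
```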

src/pages/reference/custom-containers.mdx

Lines changed: 70 additions & 0 deletions

````diff
@@ -176,3 +176,73 @@ ENTRYPOINT ["/bin/main"]
 ### Create an ignore file

 Custom dockerfile templates also support co-located dockerignore files. If your custom docker template is at path `./docker/node.dockerfile` you can create an ignore file at `./docker/node.dockerfile.dockerignore`.
+
+## Create a monorepo with custom runtimes
+
+Nitric supports monorepos via the custom runtime feature, which allows you to change the build context of your Docker build. To use a custom runtime in a monorepo, specify the `runtime` key per service definition as shown below.
+
+<Note>Available in Nitric CLI version 1.45.0 and above</Note>
+
+### Example for Turborepo
+
+[Turborepo](https://turbo.build/) is a monorepo tool for JavaScript and TypeScript that allows you to manage multiple packages in a single repository. In this example, we will use a custom runtime to build a service in a monorepo using a custom dockerfile.
+
+```yaml {{ tag: "root/backends/guestbook-app/nitric.yaml" }}
+name: guestbook-app
+services:
+  - match: services/*.ts
+    runtime: turbo
+    type: ''
+    start: npm run dev:services $SERVICE_PATH
+runtimes:
+  turbo:
+    dockerfile: ./turbo.dockerfile # the custom dockerfile
+    context: ../../ # the context of the docker build
+    args:
+      TURBO_SCOPE: 'guestbook-api'
+```
+
+```docker {{ tag: "root/backends/guestbook-app/turbo.dockerfile" }}
+FROM node:alpine AS builder
+ARG TURBO_SCOPE
+
+# Check https://github.com/nodejs/docker-node/tree/b4117f9333da4138b03a546ec926ef50a31506c3#nodealpine to understand why libc6-compat might be needed.
+RUN apk add --no-cache libc6-compat
+RUN apk update
+# Set working directory
+WORKDIR /app
+RUN yarn global add turbo
+
+# Copy from the root of the monorepo
+COPY . .
+RUN turbo prune --scope=${TURBO_SCOPE} --docker
+
+# Add lockfile and package.json's of isolated subworkspace
+FROM node:alpine AS installer
+ARG TURBO_SCOPE
+ARG HANDLER
+RUN apk add --no-cache libc6-compat
+RUN apk update
+WORKDIR /app
+RUN yarn global add typescript @vercel/ncc turbo
+
+# First install dependencies (as they change less often)
+COPY .gitignore .gitignore
+COPY --from=builder /app/out/json/ .
+COPY --from=builder /app/out/yarn.lock ./yarn.lock
+RUN yarn install --frozen-lockfile --production
+
+# Build the project and its dependencies
+COPY --from=builder /app/out/full/ .
+COPY turbo.json turbo.json
+
+RUN turbo run build --filter=${TURBO_SCOPE} -- ./${HANDLER} -m --v8-cache -o lib/
+
+FROM node:alpine AS runner
+ARG TURBO_SCOPE
+WORKDIR /app
+
+COPY --from=installer /app/backends/${TURBO_SCOPE}/lib .
+
+ENTRYPOINT ["node", "index.js"]
+```
````
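The diff above notes that custom dockerfile templates support co-located dockerignore files. For the Turborepo example, a hypothetical `turbo.dockerfile.dockerignore` next to the custom dockerfile could keep bulky, unneeded paths out of the (repository-root) build context; these entries are illustrative, not part of the commit:

```
# Hypothetical root/backends/guestbook-app/turbo.dockerfile.dockerignore
# Excluded from the ../../ build context before `COPY . .` runs.
.git
**/node_modules
**/dist
**/.turbo
```

Excluding `node_modules` is particularly useful here, since the builder stage reinstalls dependencies from the pruned lockfile anyway.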

0 commit comments