When calling `open-next build`, OpenNext **runs `next build`** to build the Next.js app.

#### Building the Next.js app

OpenNext runs the `build` script in your `package.json` file. Depending on the lock file found in the app, the corresponding package manager will be used: either `npm run build`, `yarn build`, or `pnpm build` will be run. For more on customizing the build command, see [overriding the build command](#custom-build-command).
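For illustration, the lock-file detection can be thought of as the following sketch (simplified, and not OpenNext's actual source):

```ts
// Illustrative sketch only (not OpenNext's actual implementation):
// choose the build command based on which lock file exists in the app directory.
import { existsSync } from "node:fs";
import { join } from "node:path";

function resolveBuildCommand(appDir: string): string {
  if (existsSync(join(appDir, "pnpm-lock.yaml"))) return "pnpm build";
  if (existsSync(join(appDir, "yarn.lock"))) return "yarn build";
  return "npm run build"; // default when package-lock.json (or no lock file) is found
}
```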
#### Custom build command

OpenNext runs the `build` script in your `package.json` by default. However, you can specify a custom build command if required.

```ts
// JS
import { build } from "open-next/build.js";

await build({
  buildCommand: "pnpm custom:build",
});
```
#### Minify server function

Enabling this option will minimize all `.js` and `.json` files in the server function bundle using the [node-minify](https://github.com/srod/node-minify) library. This can reduce the size of the server function bundle by about 40%, depending on the size of your app. To enable it, run:

```bash
# CLI
open-next build --minify
```

This feature is currently **experimental** and needs to be opted into. It can significantly decrease the server function's cold start time. Once it is thoroughly tested and its stability is confirmed, it will be enabled by default.
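If you are driving the build from JS, the equivalent option is presumably `minify` (an assumption mirroring the `--minify` CLI flag; verify against the `open-next` version you use):

```ts
// JS — assumes build() accepts a `minify` option mirroring the --minify CLI flag
import { build } from "open-next/build.js";

await build({
  minify: true,
});
```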
#### Debug mode

OpenNext can be executed in debug mode for troubleshooting, either by setting the `OPEN_NEXT_DEBUG` environment variable on the CLI or by passing `debug: true` to the JS API:

```bash
# CLI
OPEN_NEXT_DEBUG=true npx open-next@latest build
```

```ts
// JS
import { build } from "open-next/build.js";

await build({
  debug: true,
});
```
This does a few things:

1. Lambda handler functions in the build output will not be minified.
1. The event object will be logged on each request.

It is recommended to **turn off debug mode when building for production** because:

1. Un-minified function code is 2-3X larger than minified code. This will result in longer Lambda cold start times.
1. Logging the event object on each request can result in a lot of logs being written to AWS CloudWatch. This will result in increased AWS costs.
## Debugging

#### Function logs

To find the **server, image optimization, and warmer logs**, go to the AWS CloudWatch console in the **region you deployed to**.

If the server function is **deployed to Lambda@Edge**, the logs will appear in the **region you are physically close to**. For example, if you deployed your app to `us-east-1` and you are visiting the app from London, the logs are likely to be in `eu-west-2`.

#### Warmer function logs

The logs from the warmer function provide insights into the results of the warming process.

- `sent` — The number of times the warmer invoked the server function using the Lambda SDK. This value should correspond to the `CONCURRENCY` set in the warmer function.
- `success` — The number of SDK calls that returned a 200 status code, indicating successful invocations.
- `uniqueServersWarmed` — This helps track any instances that responded unusually quickly and served multiple warming requests. As all SDK calls are made concurrently using `await Promise.all()`, this metric is useful for monitoring the number of unique warmed instances (see the sketch below).
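To make these metrics concrete, here is a minimal, hypothetical sketch of such a warmer. It is not OpenNext's actual implementation, and the function name and response shape are assumptions; it only shows how the SDK calls are issued concurrently and how the three values can be derived.

```ts
// Hypothetical sketch (not OpenNext's warmer source): invoke the server function
// CONCURRENCY times in parallel and derive the metrics described above.
import { LambdaClient, InvokeCommand } from "@aws-sdk/client-lambda";

const lambda = new LambdaClient({});
const CONCURRENCY = 10; // matches the CONCURRENCY setting of the warmer function

async function warm(functionName: string) {
  // All invocations are issued concurrently, mirroring `await Promise.all()`.
  const results = await Promise.all(
    Array.from({ length: CONCURRENCY }, () =>
      lambda.send(
        new InvokeCommand({
          FunctionName: functionName,
          Payload: new TextEncoder().encode(JSON.stringify({ type: "warmer" })),
        }),
      ),
    ),
  );

  const sent = results.length;
  const success = results.filter((r) => r.StatusCode === 200).length;
  // Each invocation is assumed to return an identifier for the Lambda instance
  // that served it, so instances answering several requests count only once.
  const uniqueServersWarmed = new Set(
    results.map((r) => (r.Payload ? new TextDecoder().decode(r.Payload) : "")),
  ).size;

  console.log({ sent, success, uniqueServersWarmed });
}

// Example usage with a hypothetical function name:
// await warm("my-app-server-function");
```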
#### Opening an issue

To help diagnose issues, it's always helpful to provide a reproducible setup when opening an issue. One easy way to do this is to create a pull request (PR) and add a new page to the [benchmark app](#example) located in the `example` folder, which reproduces the issue. The PR will automatically deploy the app to AWS.