This repository was archived by the owner on May 20, 2025. It is now read-only.

Commit a44777b

add spend note
1 parent d6eee65 commit a44777b

File tree

1 file changed: +9 -0 lines changed


docs/guides/python/serverless-llama.mdx

Lines changed: 9 additions & 0 deletions
@@ -235,6 +235,15 @@ config:
       ephemeral-storage: 1024
 ```
 
+<Note>
+Nitric defaults aim to keep you within your free-tier limits. In this example,
+we recommend increasing memory and ephemeral values to allow the llama model
+to load correctly, therefore running this sample project will likely incur
+more costs than a Nitric guide using the defaults. You are responsible for
+staying within the limits of the free tier or any costs associated with
+deployment.
+</Note>
+
 Since we'll use Nitric's default Pulumi AWS Provider make sure you're setup to deploy using that provider. You can find more information on how to set up the AWS provider in the [Nitric AWS Provider documentation](/providers/pulumi/aws).
 
 <Note>
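
For readers landing on this commit without the surrounding guide: the `ephemeral-storage: 1024` context line above sits in the guide's Nitric stack file for AWS. Below is a minimal sketch of what that stack config might look like with the increased values the new note refers to; the file name, provider version, region, `memory`, and `timeout` figures are assumptions for illustration, not taken from this diff.

```yaml
# nitric.aws.yaml -- hypothetical stack file name; yours may differ
provider: nitric/aws@1.1.0 # assumed provider version
region: us-east-1 # assumed region
config:
  default:
    lambda:
      # Raised above the Nitric default so the llama model fits in memory
      # (6144 MB is an assumed figure, not taken from this commit)
      memory: 6144
      # Scratch disk for the model weights; this value appears in the diff above
      ephemeral-storage: 1024
      # Assumed longer timeout to cover slow cold starts while the model loads
      timeout: 900
```

As the added `<Note>` warns, sizing above the defaults can push a deployment past AWS free-tier limits, so these values trade cost for enough headroom to load the model.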
