This repository was archived by the owner on May 20, 2025. It is now read-only.

Commit 3349c4b

add spend note

Parent: d6eee65

1 file changed

docs/guides/python/serverless-llama.mdx

Lines changed: 10 additions & 0 deletions
@@ -235,6 +235,16 @@ config:
       ephemeral-storage: 1024
 ```
 
+<Note>
+Nitric defaults aim to keep you within your free-tier limits. In this example,
+we recommend increasing the memory and ephemeral storage values to allow the
+llama model to load correctly, so running this sample project will likely
+incur more costs than a Nitric guide using the defaults.
+
+You are responsible for staying within the limits of the free tier or any
+costs associated with deployment.
+</Note>
+
 Since we'll use Nitric's default Pulumi AWS Provider, make sure you're set up to deploy using that provider. You can find more information on how to set up the AWS provider in the [Nitric AWS Provider documentation](/providers/pulumi/aws).
 
 <Note>
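For context, the `ephemeral-storage` line in the hunk sits inside a Nitric stack file's `config` block. A minimal sketch of what such a file might look like is below; the file name, provider version, region, and the specific `memory` and `timeout` values are illustrative assumptions, not figures from the guide (only `ephemeral-storage: 1024` appears in the diff itself):

```yaml
# Sketch of a Nitric AWS stack file (e.g. nitric.dev.yaml); values are assumptions.
provider: nitric/aws@1.1.0
region: us-east-1
config:
  default:
    lambda:
      # Raised above Nitric's defaults so the llama model can load into memory;
      # settings this large will generally exceed the AWS Lambda free tier.
      memory: 6144
      timeout: 900
      # Extra ephemeral (/tmp) storage, in MB, for the model file.
      ephemeral-storage: 1024
```

Since Lambda bills roughly by memory size times execution time, raising these limits is what makes this sample likely to cost more than guides that keep the defaults.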
