Replies: 1 comment 1 reply
-
Hey @Pemcode, your question is a bit generic, but I have tried to answer your questions one by one.
No, it isn't. You can deploy it on your local machine as long as your machine can run it.
(To set up Kubernetes and the NVIDIA Cloud Native Stack locally, see the install guides in NVIDIA's cloud-native-stack repository.)
If you are only using your local machine, you typically do not need to modify the hosts file. Hope it helps.
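In case it helps anyone landing here, below is a minimal sketch of what a local, single-node setup could look like. It assumes the Cloud Native Stack Ansible playbooks use an inventory file named `hosts` with `[master]`/`[nodes]` groups and a `setup.sh` entry point, and that the RAG chart is pulled from the NGC Helm registry; the exact file names, playbook entry points, and chart version vary between releases, so treat these as placeholders and check the install guides and the NGC catalog page for your version.

```bash
# --- 1. Point the Cloud Native Stack inventory at the local machine only ---
# Assumed layout: an Ansible inventory file named "hosts" with [master]/[nodes]
# groups (names and location may differ between Cloud Native Stack releases).
cat > hosts <<'EOF'
[master]
localhost ansible_connection=local

[nodes]
# empty: single-node setup, the laptop is both control plane and worker
EOF

# --- 2. Run the Cloud Native Stack installation (hypothetical invocation) ---
# The repository ships a setup script / playbooks; the exact entry point
# depends on the release you cloned, so adjust accordingly.
bash setup.sh install

# --- 3. Deploy the RAG multimodal chatbot chart from NGC (sketch) ---
# Requires an NGC API key; '$oauthtoken' is the literal NGC username.
export NGC_API_KEY="<your-ngc-api-key>"
helm repo add nvidia-aiworkflows https://helm.ngc.nvidia.com/nvidia/aiworkflows \
  --username='$oauthtoken' --password="$NGC_API_KEY"
helm repo update
# Chart name taken from the catalog URL in the question; pick the version
# listed on NGC for your stack.
helm install rag-multimodal nvidia-aiworkflows/rag-app-multimodal-chatbot
```

Note that the single-node inventory above only removes the need for a multi-host hosts file; whether the chart's models actually fit on a 16GB-RAM laptop with an RTX 4070 is a separate consideration.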
-
Hello everyone!
First of all, thank you for your work! :)
I would like to try testing the solution available at this address: https://catalog.ngc.nvidia.com/orgs/nvidia/teams/aiworkflows/helm-charts/rag-app-multimodal-chatbot
However, I've been stuck for three days trying to install the Cloud Native Stack locally.
I want to test it on my laptop (16GB RAM, RTX 4070, 16-core AMD Ryzen 7), but I always get stuck at some point during the installation of Kubernetes, containerd, or something else.
Is it mandatory to be connected to an NVIDIA VMI?
Do you have a quick tutorial that walks through the steps in order?
Do I need a "hosts" file if I'm only using my local machine (running Ubuntu 22.04)?
Thank you in advance for your help.
Wishing the entire team a great day!