Demonstrates a prompt entered into a React frontend component, passed to a middleware Node.js/Fastify server that calls a Python script. The Python script constructs an AI client, uses the chat-completions API to send a canned prompt to Azure OpenAI, then returns what it gets from the LLM to the server.
The prompt sent to the LLM is canned; the frontend text itself is not forwarded. The reply from the LLM is captured and displayed in real time.
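The Python script might look like the following minimal sketch, assuming the `openai` 1.x Python SDK and the environment variables `AZURE_OPENAI_ENDPOINT`, `AZURE_OPENAI_API_KEY`, and `AZURE_OPENAI_DEPLOYMENT` (the variable names and the canned prompt are assumptions for illustration, not taken from the repo):

```python
import os

# Canned prompt: in this demo the frontend text is not forwarded verbatim.
CANNED_PROMPT = "Summarize the product catalog in one paragraph."

def build_messages(prompt: str) -> list[dict]:
    """Build the chat-completions message list around the canned prompt."""
    return [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": prompt},
    ]

def stream_reply() -> None:
    """Stream the LLM reply chunk-by-chunk so the server can relay it in real time."""
    # Imported lazily so the pure helper above works without the SDK installed.
    from openai import AzureOpenAI

    client = AzureOpenAI(
        azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
        api_key=os.environ["AZURE_OPENAI_API_KEY"],
        api_version="2024-02-01",
    )
    stream = client.chat.completions.create(
        model=os.environ["AZURE_OPENAI_DEPLOYMENT"],  # Azure deployment name
        messages=build_messages(CANNED_PROMPT),
        stream=True,
    )
    for chunk in stream:
        if chunk.choices and chunk.choices[0].delta.content:
            # Each delta goes straight to stdout, which the Fastify
            # server reads and forwards on to the React component.
            print(chunk.choices[0].delta.content, end="", flush=True)
```

Streaming (`stream=True`) is what makes the real-time display possible: the server relays each delta as it arrives instead of waiting for the full completion.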
Demonstrates the same flow ending in an error: a prompt entered into the React frontend component, then the middleware Node.js/Fastify server calling the Python script and returning an error. Within 12 hours after the screenshot above was taken, my Azure credits expired, so access to the LLM backend was revoked.
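When the Azure credits lapse, the service rejects the key with HTTP 401, which the `openai` 1.x SDK surfaces as `AuthenticationError`. A minimal sketch of how the script could turn that into structured JSON for the server to display (the `guard`/`report_error` helpers and the payload shape are illustrative, not from the repo):

```python
import json
import sys

def report_error(status: int, message: str) -> str:
    """Serialize an error payload for the Fastify server to relay to the frontend."""
    return json.dumps({"ok": False, "status": status, "error": message})

def guard(call) -> None:
    """Run the LLM call; on HTTP 401 (revoked key) emit a JSON error and exit."""
    # Imported lazily so report_error stays usable without the SDK installed.
    from openai import AuthenticationError  # raised by the 1.x SDK on HTTP 401

    try:
        call()
    except AuthenticationError as exc:
        # Expired credits revoke the key; emit a structured error on stderr
        # instead of a traceback, so the server can show a clean message.
        print(report_error(401, str(exc)), file=sys.stderr)
        sys.exit(1)
```

A non-zero exit plus a JSON payload on stderr gives the middleware one simple contract to check, whether the failure is auth, quota, or network.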
This is a 3-tiered web application: frontend, middleware & backend, all served under Kubernetes. It ships with a menu-driven script that performs each labeled action.
- The frontend is a React web application.
- The middleware is a Node.js layer for logic & data access.
- The backend is a PostgreSQL database.
- kubectl
- docker
- openssl (for self-signed certs)
```shell
git clone git@github.com:<YOUR-SSH-USER>/product-catalog
cd ./product-catalog
. ./menu.sh <<<3   # source the menu script and feed it option 3
```