This repository was archived by the owner on Dec 9, 2024. It is now read-only.
Add support to monitor the inference pipeline using Comet ML's LLMOps feature #32

iusztinpaul announced in Announcements
- Be careful to set the Comet credentials in the `.env` file. After that, you can run the inference pipeline (aka the financial bot) as usual.

NOTE: In debug/dev mode, the prompts are not logged.
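For reference, Comet ML picks up its credentials from environment variables. A minimal `.env` sketch might look like the following (these are Comet's standard variable names, but confirm the exact keys this repo's settings module expects):

```shell
# Hypothetical .env entries; Comet ML reads these standard environment variables.
COMET_API_KEY=<your-comet-api-key>        # found under your account settings on comet.com
COMET_WORKSPACE=<your-workspace-name>     # the Comet workspace that owns the project
COMET_PROJECT_NAME=<your-project-name>    # the project where prompts/traces are logged
```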