Replies: 1 comment
Self answer: #5076
I read some threads here mentioning that KL divergence is more appropriate than perplexity, so I decided to measure it myself. But when I tried to run `perplexity` with the `--kl-divergence` option, I got the error message below. I have no knowledge of KL divergence. Please teach me how to get or generate the file with the probabilities. What I want to do is measure the KL divergence for the various quant types of llama.cpp on the `meta-llama/Llama-2-7b-chat-hf` model.
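
A minimal sketch of the two-step workflow, based on the tool added in #5076: KL divergence compares the token probability distributions of two models, $D_{KL}(P \parallel Q) = \sum_i P(i)\log\frac{P(i)}{Q(i)}$, so you first save the logits of a reference (unquantized) model to a file, then compare each quantized model against that file. The model and data file names below are placeholders; confirm the exact flag names with `perplexity --help` on your build.

```sh
# Step 1: save the base logits from an unquantized (e.g. F16) model.
# Here --kl-divergence-base names the output file to write.
./perplexity -m llama-2-7b-chat.F16.gguf -f wiki.test.raw \
    --kl-divergence-base logits_f16.bin

# Step 2: re-run with each quantized model, reading the same file back
# and reporting KL divergence statistics instead of plain perplexity.
./perplexity -m llama-2-7b-chat.Q4_K_M.gguf -f wiki.test.raw \
    --kl-divergence-base logits_f16.bin --kl-divergence
```

If this reading is right, the error most likely means `--kl-divergence` was passed without a base logits file to compare against.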