does anyone know how to keep the formatting of prompts consistent across different topics? #3217
-
For example, I would like to generate Wikipedia-style articles. How do I prompt for consistency across all topics/titles? Below is my prompt, and I know the seed value, but I can never find the right seed to generate a Wikipedia-like article. Does anyone know a trick to get Wikipedia-like comprehensive and detailed articles, e.g. more than 2,400 words?
Replies: 1 comment
-
Everything else aside:

That can't work. `-c` sets a context size of 3,620 tokens, and both the prompt and any generated tokens need to fit in that (unless you're using `--keep`, but I wouldn't really recommend it, and even when it works you're not likely to get coherent output too far past double the context size). Since `-c` is 3620, `-n` needs to be a lower value. Your model is LLaMA 2, so it probably supports up to `-c 4096`. You can also possibly look into using RoPE tricks to be able to set `-c` to a higher value, but expecting 12k tokens' worth of coherent output from a 7B model is pretty optimistic.
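
As a rough sketch (not from the original post: the model path, prompt file, and seed below are placeholders, and you should confirm the exact flag names with `./main --help` on your build), a more realistic set of options for a LLaMA 2 7B model keeps `-c` at 4096 and budgets `-n` so that prompt plus output still fit inside it. Keep in mind that 2,400 words is already on the order of 3,000+ tokens, so a single 4,096-token pass leaves little headroom for a long prompt.

```sh
# Sketch only: paths, seed value, and prompt file are placeholders.
# -m : model to load
# -f : read the prompt from a file
# -c 4096 : LLaMA 2's native context window
# -n 1024 : generation budget; prompt tokens + generated tokens must fit within -c
# -s 42   : a fixed seed only makes a run repeatable, it won't make output more "Wikipedia-like"
./main -m ./models/llama-2-7b.Q4_K_M.gguf -f prompt.txt -c 4096 -n 1024 -s 42
```

If you do want to try the RoPE route mentioned above, recent builds expose flags like `--rope-freq-scale` for linear context scaling (e.g. `--rope-freq-scale 0.5` together with `-c 8192`), but whether a 7B model stays coherent that far out is a separate question.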