
Commit 052c968

fix: update tokenizer info (#20)

1 parent b3e8a5e commit 052c968

File tree

1 file changed: +2 -2 lines changed


repoqa.html

Lines changed: 2 additions & 2 deletions
@@ -220,8 +220,8 @@ <h2 id="task-snf" class="text-nowrap mt-5">
       <h3 class="text-nowrap mt-5">🏆 Benchmark @ 16K Code Context</h3>
       <p>
         🛠️ <b>Config:</b> The code in the prompt is fixed to 16K tokens (by
-        DeepSeekCoder tokenizer). Yet, the required context is a bit larger
-        than 16K so we extend 8K and 16K models using either
+        CodeLlama tokenizer). Yet, the required context is a bit larger than
+        16K so we extend 8K and 16K models using either
         <a
           href="https://www.reddit.com/r/LocalLLaMA/comments/14mrgpr/dynamically_scaled_rope_further_increases/"
         >Dynamic RoPE Scaling</a
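The config note above describes fixing the code context to a 16K-token budget as measured by a specific tokenizer. A minimal sketch of that kind of token-budget truncation follows; it uses whitespace splitting as a stand-in for the real CodeLlama tokenizer (which would come from e.g. HuggingFace `transformers`), and the function name is illustrative, not taken from the RepoQA code:

```python
def truncate_to_token_budget(text: str, budget: int) -> str:
    """Clip `text` to at most `budget` tokens.

    Whitespace splitting stands in for a real subword tokenizer;
    the actual benchmark would count tokens with the model's own
    tokenizer (here, CodeLlama's) instead.
    """
    tokens = text.split()
    return " ".join(tokens[:budget])


# Build an oversized code context, then clip it to the 16K-token budget.
code_context = "def add(a, b):\n    return a + b\n" * 4000
clipped = truncate_to_token_budget(code_context, 16_000)
assert len(clipped.split()) <= 16_000
```

Because subword tokenizers split text differently from whitespace, the real budget check must use the same tokenizer that the model sees, which is exactly why the diff corrects which tokenizer is named.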

0 commit comments
