
Commit 11b32f6
deploy: be66807
1 parent: ff30eac

File tree

2 files changed: +4 additions, -4 deletions


index.html

Lines changed: 2 additions & 2 deletions
@@ -198,9 +198,9 @@ <h2 id="-overview"><a class="header" href="#-overview">🎯 Overview</a></h2>
  <p>KTransformers is a research project focused on efficient inference and fine-tuning of large language models through CPU-GPU heterogeneous computing. The project has evolved into <strong>two core modules</strong>: <a href="https://github.com/kvcache-ai/ktransformers/tree/main/kt-kernel/">kt-kernel</a> and <a href="https://github.com/kvcache-ai/ktransformers/tree/main/kt-sft">kt-sft</a>.</p>
  <h2 id="-updates"><a class="header" href="#-updates">🔥 Updates</a></h2>
  <ul>
- <li><strong>Dec 24, 2025</strong>: Support Native MiniMax-M2.1 inference. (<a href="./doc/en/MiniMax-M2.1-Tutorial.html">Tutorial</a>)</li>
+ <li><strong>Dec 24, 2025</strong>: Support Native MiniMax-M2.1 inference. (<a href="./doc/en/kt-kernel/MiniMax-M2.1-Tutorial.html">Tutorial</a>)</li>
  <li><strong>Dec 22, 2025</strong>: Support RL-DPO fine-tuning with LLaMA-Factory. (<a href="./doc/en/SFT/DPO_tutorial.html">Tutorial</a>)</li>
- <li><strong>Dec 5, 2025</strong>: Support Native Kimi-K2-Thinking inference (<a href="./doc/en/Kimi-K2-Thinking-Native.html">Tutorial</a>)</li>
+ <li><strong>Dec 5, 2025</strong>: Support Native Kimi-K2-Thinking inference (<a href="./doc/en/kt-kernel/Kimi-K2-Thinking-Native.html">Tutorial</a>)</li>
  <li><strong>Nov 6, 2025</strong>: Support Kimi-K2-Thinking inference (<a href="./doc/en/Kimi-K2-Thinking.html">Tutorial</a>) and fine-tune (<a href="./doc/en/SFT_Installation_Guide_KimiK2.html">Tutorial</a>)</li>
  <li><strong>Nov 4, 2025</strong>: KTransformers Fine-Tuning × LLaMA-Factory Integration. (<a href="./doc/en/KTransformers-Fine-Tuning_User-Guide.html">Tutorial</a>)</li>
  <li><strong>Oct 27, 2025</strong>: Support Ascend NPU. (<a href="./doc/zh/DeepseekR1_V3_tutorial_zh_for_Ascend_NPU.html">Tutorial</a>)</li>

print.html

Lines changed: 2 additions & 2 deletions
@@ -197,9 +197,9 @@ <h2 id="-overview"><a class="header" href="#-overview">🎯 Overview</a></h2>
  <p>KTransformers is a research project focused on efficient inference and fine-tuning of large language models through CPU-GPU heterogeneous computing. The project has evolved into <strong>two core modules</strong>: <a href="https://github.com/kvcache-ai/ktransformers/tree/main/kt-kernel/">kt-kernel</a> and <a href="https://github.com/kvcache-ai/ktransformers/tree/main/kt-sft">kt-sft</a>.</p>
  <h2 id="-updates"><a class="header" href="#-updates">🔥 Updates</a></h2>
  <ul>
- <li><strong>Dec 24, 2025</strong>: Support Native MiniMax-M2.1 inference. (<a href="doc/en/MiniMax-M2.1-Tutorial.html">Tutorial</a>)</li>
+ <li><strong>Dec 24, 2025</strong>: Support Native MiniMax-M2.1 inference. (<a href="doc/en/kt-kernel/MiniMax-M2.1-Tutorial.html">Tutorial</a>)</li>
  <li><strong>Dec 22, 2025</strong>: Support RL-DPO fine-tuning with LLaMA-Factory. (<a href="doc/en/SFT/DPO_tutorial.html">Tutorial</a>)</li>
- <li><strong>Dec 5, 2025</strong>: Support Native Kimi-K2-Thinking inference (<a href="doc/en/Kimi-K2-Thinking-Native.html">Tutorial</a>)</li>
+ <li><strong>Dec 5, 2025</strong>: Support Native Kimi-K2-Thinking inference (<a href="doc/en/kt-kernel/Kimi-K2-Thinking-Native.html">Tutorial</a>)</li>
  <li><strong>Nov 6, 2025</strong>: Support Kimi-K2-Thinking inference (<a href="doc/en/Kimi-K2-Thinking.html">Tutorial</a>) and fine-tune (<a href="doc/en/SFT_Installation_Guide_KimiK2.html">Tutorial</a>)</li>
  <li><strong>Nov 4, 2025</strong>: KTransformers Fine-Tuning × LLaMA-Factory Integration. (<a href="doc/en/KTransformers-Fine-Tuning_User-Guide.html">Tutorial</a>)</li>
  <li><strong>Oct 27, 2025</strong>: Support Ascend NPU. (<a href="doc/zh/DeepseekR1_V3_tutorial_zh_for_Ascend_NPU.html">Tutorial</a>)</li>
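The substance of both hunks is the same edit: two tutorial pages (MiniMax-M2.1 and Kimi-K2-Thinking-Native) moved under doc/en/kt-kernel/, and every href pointing at them was updated, with or without a leading "./" depending on the file. A minimal sketch of that rewrite as a standalone script; the path mapping comes from the diff above, but the script itself is illustrative and not part of the ktransformers repo:

```python
import re

# Tutorial pages relocated under doc/en/kt-kernel/ in this deploy
# (old path -> new path, taken from the diff above).
MOVED = {
    "doc/en/MiniMax-M2.1-Tutorial.html": "doc/en/kt-kernel/MiniMax-M2.1-Tutorial.html",
    "doc/en/Kimi-K2-Thinking-Native.html": "doc/en/kt-kernel/Kimi-K2-Thinking-Native.html",
}

def rewrite_links(html: str) -> str:
    """Rewrite href attributes that point at the relocated tutorials.

    Preserves an optional leading './' so the same function works for
    both index.html (uses './doc/...') and print.html (uses 'doc/...').
    Other links, e.g. doc/en/SFT/DPO_tutorial.html, are left untouched.
    """
    for old, new in MOVED.items():
        pattern = r'href="(\./)?' + re.escape(old) + r'"'
        html = re.sub(
            pattern,
            lambda m, new=new: f'href="{m.group(1) or ""}{new}"',
            html,
        )
    return html

if __name__ == "__main__":
    line = '<a href="./doc/en/MiniMax-M2.1-Tutorial.html">Tutorial</a>'
    print(rewrite_links(line))
```

The `lambda m, new=new: ...` default-argument trick pins each mapping's replacement value inside the loop; a plain closure over `new` would apply only the last mapping to every pattern.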
