
Commit 3c07be6

fix: typos
1 parent 4f468a6 commit 3c07be6

File tree

1 file changed: +2 −2 lines changed


src/cli/recommendedModels.ts

Lines changed: 2 additions & 2 deletions
@@ -62,7 +62,7 @@ export const recommendedModels: ModelRecommendation[] = [{
     abilities: ["chat", "complete", "functionCalling", "reasoning"],
     description: "Qwen model was created by Alibaba and is using chain of though (CoT) to reason across a wide variety of topics.\n" +
         "It's optimized for an assistant-like chat use cases, with native support for function calling.\n" +
-        "This model is censored, but its responses quality on many topics is extremely high compared to its size.\n" +
+        "This model is censored, but its responses quality on many topics is very high compared to its small size.\n" +
         "This is the 0.6B billion parameters version of the model and is suitable for very simple tasks and can run on very resource-constraint hardware.\n",

     fileOptions: [
@@ -150,7 +150,7 @@ export const recommendedModels: ModelRecommendation[] = [{
         "It's optimized for an assistant-like chat use cases, with native support for function calling.\n" +
         "This version of the model utilizes a Mixture of Experts architecture, with only 3B active parameters, thus making it very fast.\n" +
         "Mixtures of Experts (MoE) is a technique where different models, each skilled in solving a particular kind of problem, work together to the improve the overall performance on complex tasks.\n" +
-        "This model is censored, but its responses quality on many topics is extremely high.\n" +
+        "This model is censored, but its responses quality on many topics is high compared to its high generation speed.\n" +
         "This is the 30 billion parameters Mixtures of Experts (MoE) version of the model.\n" +
         "Its performance is comparable and even surpasses DeepSeek V3 and GPT-4o.",
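For context, the hunks above edit entries of the `recommendedModels` array. Only the `abilities`, `description`, and `fileOptions` fields are visible in the diff; the sketch below of the `ModelRecommendation` shape is an assumption for illustration, not the project's actual type definition, and the `name` field and `fileOptions` element type are hypothetical.

```typescript
// Hypothetical sketch of the ModelRecommendation shape implied by the diff.
// Field names beyond abilities/description/fileOptions are assumptions.
type ModelAbility = "chat" | "complete" | "functionCalling" | "reasoning";

interface ModelRecommendation {
    name: string;              // assumed field, not visible in the diff
    abilities: ModelAbility[];
    description: string;
    fileOptions: string[];     // assumed to hold model file URIs
}

// Sample entry mirroring the edited Qwen recommendation (abridged).
const qwenRecommendation: ModelRecommendation = {
    name: "Qwen 3 0.6B", // assumed label
    abilities: ["chat", "complete", "functionCalling", "reasoning"],
    description:
        "This model is censored, but its responses quality on many topics " +
        "is very high compared to its small size.\n",
    fileOptions: []
};
```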
