
Commit 7994a30

chore: trunk upgrade and language fixes

1 parent: cd6418c

File tree: 4 files changed (+18 additions, -12 deletions)


.trunk/trunk.yaml

Lines changed: 4 additions & 4 deletions

@@ -18,11 +18,11 @@ runtimes:
 lint:
   enabled:
-    - renovate@39.109.0
+    - renovate@39.128.0
 
-
-
-
+
+
+
     - git-diff-check

modus/app-manifest.mdx

Lines changed: 1 addition & 1 deletion

@@ -106,7 +106,7 @@ Each connection has a `type` property, which controls how it's used and which
 additional properties are available. The following table lists the available
 connection types:
 
-| Type | Purpose | Functions Namespaces |
+| Type | Purpose | Function Classes |
 | :----------- | :------------------------------- | :-------------------------- |
 | `http`       | Connect to an HTTP(S) web server | `http`, `graphql`, `models` |
 | `postgresql` | Connect to a PostgreSQL database | `postgresql`                |
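For context on the table being edited here, each connection type is declared under `connections` in the app manifest. A minimal sketch, assuming a `modus.json` manifest; the connection names, URL, and the exact property names (`baseUrl`, `connString`, the `{{...}}` secret placeholders) are illustrative assumptions, not taken from this commit:

```json
{
  "connections": {
    "my-api": {
      "type": "http",
      "baseUrl": "https://api.example.com/"
    },
    "my-database": {
      "type": "postgresql",
      "connString": "postgresql://{{PG_USER}}:{{PG_PASSWORD}}@db.example.com:5432/app"
    }
  }
}
```

Under this sketch, functions in the `http`, `graphql`, and `models` groups would target `my-api`, while `postgresql` functions would target `my-database`, matching the type-to-functions mapping in the table.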

modus/deepseek-model.mdx

Lines changed: 7 additions & 6 deletions

@@ -6,17 +6,18 @@ mode: "wide"
 ---
 
 DeepSeek is an AI lab that has developed and released a series of open source
-LLMs that are notable for both their performance and cost-efficiency. By using a
-Mixture-of-Experts (MoE) system that utilizes only 37 billion of the models' 671
-billion parameters for any task, the DeepSeek-R1 model is able to achieve best
-in class performance at a fraction of cost of inference on other comparable
-models. In this guide we review how to leverage the DeepSeek models using Modus.
+large language models (LLM) that are notable for both their performance and
+cost-efficiency. By using a Mixture-of-Experts (MoE) system that utilizes only
+37 billion of the models' 671 billion parameters for any task, the DeepSeek-R1
+model is able to achieve best in class performance at a fraction of cost of
+inference on other comparable models. In this guide we review how to leverage
+the DeepSeek models using Modus.
 
 ## Options for using DeepSeek with Modus
 
 There are two options for invoking DeepSeek models in your Modus app:
 
-1. [Use the distilled DeepSeek model hosted by Hypermode](#using-the-distilled-deepseek-model-hosted-by-Hypermode)
+1. [Use the distilled DeepSeek model hosted by Hypermode](#using-the-distilled-deepseek-model-hosted-by-hypermode)
    Hypermode hosts and makes available the distilled DeepSeek model which can be
    used by Modus apps developed locally and deployed to Hypermode
 2. [Use the DeepSeek API with your Modus app](#using-the-deepseek-api-with-modus)
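The Mixture-of-Experts figures quoted in the reworded paragraph (37 billion active of 671 billion total parameters) imply that DeepSeek-R1 activates only a small fraction of its weights per token. A quick check of that ratio:

```python
# Active-parameter fraction for DeepSeek-R1's Mixture-of-Experts design,
# using the figures quoted in the doc: 37B active of 671B total parameters.
total_params = 671e9
active_params = 37e9

active_fraction = active_params / total_params
print(f"{active_fraction:.1%} of parameters active per token")  # prints "5.5% of parameters active per token"
```

This ~5.5% active fraction is what underlies the "fraction of cost of inference" claim: per-token compute scales with active parameters, not total parameters.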

styles/config/vocabularies/general/accept.txt

Lines changed: 6 additions & 1 deletion

@@ -36,6 +36,7 @@ inferencing
 LLM
 [Mm]odus
 namespace
+namespaces
 nnClassify
 npm
 NQuads
@@ -56,4 +57,8 @@ upsert
 URL|url
 urql
 UUID
-[Dd]eserialize
+[Dd]eserialize
+upsertBatch
+computeDistance
+getNamespaces
+timeLog

0 commit comments
