@@ -39,23 +39,61 @@ Running `npx get-dtos` without any arguments will display the available options:
  --verbose            Display verbose logging
  --ignore-ssl-errors  Ignore SSL Errors

-### Reusable DTOs and Reusable Clients in any language
+## Reusable DTOs and Reusable Clients in any language

A benefit of [Add ServiceStack Reference](https://docs.servicestack.net/add-servicestack-reference) is that only an
API's DTOs need to be generated, which can then be used to call any remote instance running that API. E.g. DTOs generated
for our deployed AI Server instance at [openai.servicestack.net](https://openai.servicestack.net) can be used to call
any self-hosted AI Server instance; likewise, the same generic client can also be used to call any other ServiceStack API.

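+For instance, here's a minimal sketch of that reusability, where only the client's base URL changes (it assumes the TypeScript DTOs generated below; `https://ai.example.org` is a hypothetical self-hosted AI Server instance):
+
+```ts
+import { JsonServiceClient } from "@servicestack/client"
+import { OpenAiChatCompletion } from "./dtos" // same generated DTOs for every instance
+
+// Generic clients differ only in the base URL they point at
+const hosted = new JsonServiceClient("https://openai.servicestack.net")
+const selfHosted = new JsonServiceClient("https://ai.example.org") // hypothetical self-hosted instance
+
+// The same typed Request DTO can be sent to either instance
+const request = new OpenAiChatCompletion({
+    model: "mixtral:8x22b",
+    messages: [{ role: "user", content: "Hello" }],
+    max_tokens: 20
+})
+
+const hostedResponse = await hosted.postToUrl("/v1/chat/completions", request)
+const selfHostedResponse = await selfHosted.postToUrl("/v1/chat/completions", request)
+```
+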
+### Typed OpenAI Chat & Ollama APIs in 11 Languages
+
+A good example of its versatility is the [Typed OpenAI Chat & Ollama APIs](/posts/typed-openai-chat-ollama-apis) post,
+in which AI Server's Typed DTOs can be used to call **any OpenAI Chat compatible API** in any of its 11 supported languages.
+
### TypeScript Example

-For example you can get the TypeScript DTOs for the just released [AI Server](/posts/ai-server) with:
+For example you can get the TypeScript DTOs for the just released [AI Server](/posts/ai-server) by:

-::: sh
+1. Installing the `@servicestack/client` npm package:
+
+::: copy
+npm install @servicestack/client
+:::
+
+2. Downloading AI Server's TypeScript DTOs:
+
+::: copy
`npx get-dtos typescript https://openai.servicestack.net`
:::

Which, just like the `x` tool, will add the TypeScript DTOs to the `dtos.ts` file.

+### Calling Ollama from TypeScript
+
+Call Ollama by sending an `OpenAiChatCompletion` Request DTO with the `JsonServiceClient`:
+
+```ts
+import { JsonServiceClient } from "@servicestack/client"
+import { OpenAiChatCompletion } from "./dtos"
+
+// Base URL of the OpenAI Chat compatible endpoint to call,
+// e.g. a local Ollama instance
+const baseUrl = "http://localhost:11434"
+const client = new JsonServiceClient(baseUrl)
+
+const response = await client.postToUrl("/v1/chat/completions",
+    new OpenAiChatCompletion({
+        model: "mixtral:8x22b",
+        messages: [
+            { role: "user", content: "What's the capital of France?" }
+        ],
+        max_tokens: 50
+    })
+)
+
+const answer = response.choices[0].message.content
+```
+
+### Update TypeScript DTOs
+
And later update all TypeScript ServiceStack References in the current directory with:

::: sh