- A Mistral AI API Key (see the [Mistral documentation](https://docs.mistral.ai/#api-access) for more details on API access)
# Installation
## Gradle
```groovy
repositories {
maven { url 'https://jitpack.io' }
}

dependencies {
    // ...
}
```
## Maven
```xml
<repositories>
<repository>
        <!-- ... -->
    </repository>
</repositories>
```
# Usage
The MistralClient class contains all the methods needed to interact with the Mistral AI API. The following examples show
how to use the client to list available models and create chat completions.
## List Available Models
```java
// You can also put the API key in an environment variable called MISTRAL_API_KEY
// and remove the apiKey parameter given to the MistralClient constructor
String apiKey = "API_KEY_HERE";
// ...
```
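The environment-variable fallback described in the comment above can be sketched in plain Java. Note that `ApiKeyResolver` and `resolveApiKey` are illustrative names of ours, not part of the library:

```java
// Illustrative helper (not part of the library's API): prefer the
// MISTRAL_API_KEY environment variable, falling back to an explicit key.
public class ApiKeyResolver {
    static String resolveApiKey(String explicitKey) {
        String fromEnv = System.getenv("MISTRAL_API_KEY");
        return (fromEnv != null && !fromEnv.isEmpty()) ? fromEnv : explicitKey;
    }

    public static void main(String[] args) {
        // With MISTRAL_API_KEY unset, this prints the fallback value.
        System.out.println(resolveApiKey("API_KEY_HERE"));
    }
}
```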
## Chat Completions
The chat completion in this example code is blocking and will wait until the whole response is generated.
See [Streaming Chat Completions](#streaming-chat-completions) for a way to stream chunks as they are being generated.
```java
// ...
```

Example output:
```
...
```
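To make the blocking-versus-streaming distinction concrete, here is a self-contained toy illustration (none of these names are the library's API): the blocking variant returns only once every chunk has been assembled, while the streaming variant hands each chunk to a callback as soon as it is available.

```java
import java.util.List;
import java.util.function.Consumer;

// Toy model (not the library's API) of blocking vs. streaming completions.
public class BlockingVsStreaming {
    static final List<String> CHUNKS = List.of("Hello", ", ", "world", "!");

    // Blocking: the caller waits until the full response text exists.
    static String completeBlocking() {
        StringBuilder sb = new StringBuilder();
        for (String chunk : CHUNKS) sb.append(chunk);
        return sb.toString();
    }

    // Streaming: the caller sees each chunk as it is produced.
    static void completeStreaming(Consumer<String> onChunk) {
        for (String chunk : CHUNKS) onChunk.accept(chunk);
    }

    public static void main(String[] args) {
        System.out.println(completeBlocking());
        completeStreaming(System.out::print);
        System.out.println();
    }
}
```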
# Roadmap
- [ ] Add support for Function Calls
- [ ] Handle rate limiting using some sort of queue system
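One plausible shape for that queue system, purely as a sketch of ours and not anything the library ships: funnel every request through a single worker thread that enforces a minimum delay between request starts.

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.function.Supplier;

// Hypothetical sketch of queue-based rate limiting: a single worker thread
// runs queued requests in order, sleeping as needed so that consecutive
// requests start at least minIntervalMillis apart.
public class RateLimitedQueue {
    private final ExecutorService worker = Executors.newSingleThreadExecutor();
    private final long minIntervalMillis;
    private long lastStart = 0; // only touched by the single worker thread

    public RateLimitedQueue(long minIntervalMillis) {
        this.minIntervalMillis = minIntervalMillis;
    }

    public <T> CompletableFuture<T> submit(Supplier<T> request) {
        return CompletableFuture.supplyAsync(() -> {
            long wait = lastStart + minIntervalMillis - System.currentTimeMillis();
            if (wait > 0) {
                try {
                    Thread.sleep(wait);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }
            lastStart = System.currentTimeMillis();
            return request.get();
        }, worker);
    }

    public void shutdown() {
        worker.shutdown();
    }
}
```

A real implementation would likely also need to react to 429 responses from the API rather than rely on a fixed interval, which is why the roadmap leaves the exact design open.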