<details>

<summary>

<h3>Developer Docs</h3>

</summary>
1. Generate a quick response
```python
from pytgpt.phind import PHIND
bot = PHIND()
resp = bot.chat('<Your prompt>')
print(resp)
# Output : How can I assist you today?
```
2. Get back whole response
```python
from pytgpt.phind import PHIND
bot = PHIND()
resp = bot.ask('<Your prompt>')
print(resp)
# Output
"""
{'id': 'chatcmpl-gp6cwu2e5ez3ltoyti4z', 'object': 'chat.completion.chunk', 'created': 1731257890, 'model': 'phind-instant-llama-3_1-08-31-2024-checkpoint-1500', 'choices': [{'index': 0, 'delta': {'content': "Hello! I'm an AI assistant created by Phind to help with programming tasks. How can I assist you today?"}, 'finish_reason': None}]}
"""
```
Just add the parameter `stream` with value `true`.
```
{'id': 'chatcmpl-icz6a4m1nbbclw9hhgol', 'object': 'chat.completion.chunk', 'created': 1731258032, 'model': 'phind-instant-llama-3_1-08-31-2024-checkpoint-1500', 'choices': [{'index': 0, 'delta': {'content': "Hello! I'm an AI assistant created by Phind to help with coding and technical tasks. How"}, 'finish_reason': None}]}

{'id': 'chatcmpl-icz6a4m1nbbclw9hhgol', 'object': 'chat.completion.chunk', 'created': 1731258032, 'model': 'phind-instant-llama-3_1-08-31-2024-checkpoint-1500', 'choices': [{'index': 0, 'delta': {'content': "Hello! I'm an AI assistant created by Phind to help with coding and technical tasks. How can I assist you today?"}, 'finish_reason': None}]}
```
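The chunks above are produced by iterating over a streaming call. A minimal sketch of that loop follows; the `FakeBot` class below is a stand-in so the snippet runs offline, not the real provider, but with pytgpt you would call a provider's `chat` with `stream=True` in the same way:

```python
# Stand-in bot so this sketch runs without network access;
# a real pytgpt provider yields chunks when stream=True is passed.
class FakeBot:
    def chat(self, prompt, stream=False):
        chunks = ["Hello! ", "How can I ", "assist you today?"]
        if stream:
            return iter(chunks)   # one chunk at a time
        return "".join(chunks)    # whole reply at once

bot = FakeBot()
for chunk in bot.chat("<Your prompt>", stream=True):
    print(chunk, end="", flush=True)
print()
```

Printing each chunk as it arrives (rather than waiting for the full reply) is what gives the incremental, typewriter-style output.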
**Version 0.7.0** introduces an asynchronous implementation for almost all providers, except a few such as *perplexity*, which rely on other libraries that lack such an implementation.
To make it easier, you just have to prefix `Async` to the common synchronous class name. For instance `OPENGPT` will be accessed as `AsyncOPENGPT`:
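As a sketch of that naming convention, the classes below are self-contained stand-ins (not the real pytgpt providers) showing how the `Async`-prefixed class mirrors the synchronous one with an awaitable interface:

```python
import asyncio

# Stand-in classes illustrating the Async-prefix convention;
# with pytgpt you would import the real classes from the provider
# module, e.g. the async counterpart of PHIND is AsyncPHIND.
class PHIND:
    def chat(self, prompt: str) -> str:
        return f"reply to {prompt!r}"

class AsyncPHIND:
    async def chat(self, prompt: str) -> str:
        # Same interface as the sync class, but awaitable.
        return f"reply to {prompt!r}"

async def main() -> str:
    bot = AsyncPHIND()
    return await bot.chat("hello")

print(asyncio.run(main()))  # → reply to 'hello'
```

Keeping the method names identical between the sync and async classes means only the import and the `await` change when switching styles.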