Commit c7a194d

Prithivi Da authored and committed

intial commit

1 parent fdd60e6 commit c7a194d

File tree

3 files changed: +22 −9 lines changed


README.md

Lines changed: 12 additions & 0 deletions

````diff
@@ -380,6 +380,7 @@ Below are the benchmarks, and yes the numbers are reproducable. Detailed tables
 
 ### I have a question
 
+- Check the FAQs; if your question isn't addressed there:
 - Please head to the discussions tab and ask your question.
 
 ### Scope, Caveats and Limitations
@@ -409,4 +410,15 @@ Below are the benchmarks, and yes the numbers are reproducable. Detailed tables
 
 - Will be added shortly
 
+### FAQs
+
+- If you get a "connection refused" exception when running local LLMs with the local-machine CLI for synthetic data, make sure the Docker daemon is running. If everything goes smoothly you should see something like the output below.
+
+```python
+2024-11-24 09:43:35,708 - INFO - Local LLM server is not running.
+2024-11-24 09:43:35,708 - INFO - 1st time setup might take a few minutes. Please be patient...
+2024-11-24 09:43:40,685 - INFO - Attempting to start LLM server via Docker...
+a2ddd96346fea535759c47681fdd7164618e2aa27f0565444e508b82a26a01f8
+```
+
 </details>
````
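The FAQ entry added to the README above tells users to check that the Docker daemon is running when they hit a "connection refused" error. As a minimal illustration (not part of this commit), a simple port probe can distinguish "nothing is listening yet" from other failures; the helper name, the host, and the default Ollama port 11434 are assumptions:

```python
import socket

def is_server_up(host: str = "localhost", port: int = 11434, timeout: float = 1.0) -> bool:
    """Return True if something is listening on host:port.

    A refused connection here usually means the local LLM server (and/or
    the Docker container that should expose it) is not running yet.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # ConnectionRefusedError, timeout, unreachable host, ...
        return False
```

A probe like this could run before the CLI attempts any synthetic-data calls, turning a late stack trace into an early, actionable message.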

src/route0x/route_builder/route_builder.py

Lines changed: 0 additions & 1 deletion

```diff
@@ -1401,4 +1401,3 @@ def build_routes(self):
         self.logger.info("Model training and evaluation completed.")
         self.logger.info("Thank you for using route0x! May all your queries find their way.")
 
-
```
src/route0x/route_builder/unified_llm_caller.py

Lines changed: 10 additions & 8 deletions

```diff
@@ -23,17 +23,15 @@ def __init__(self, provider, model, ollama_server_url='http://localhost:11434',
             openai.api_key = self.api_token
 
         elif self.provider == 'claude':
-            self.client = Anthropic(api_key=self.api_token)
+            # self.client = Anthropic(api_key=self.api_token)
+            raise NotImplementedError("Anthropic support is not yet implemented.")
 
         elif self.provider == 'google':
-            pass
+            raise NotImplementedError("Google support is not yet implemented.")
 
         elif self.provider == 'ollama':
             self.client = Client(host=ollama_server_url)
-            ollama_system_prompt = f"""FROM {model}
-            SYSTEM {system_prompt}"""
-            ollama.create(model=model, modelfile=ollama_system_prompt)
-
+
         else:
             raise ValueError(f"Unsupported provider: {self.provider}")
@@ -77,10 +75,14 @@ def _generate_claude(self, prompt):
 
     def _generate_ollama(self, prompt):
         response = self.client.chat(model=self.model, messages=[
+            {
+                'role': 'system',
+                'content': self.system_prompt
+            },
             {
                 'role': 'user',
-                'content': prompt,
+                'content': prompt
             },
-        ])
+        ])
 
         return response["message"]["content"]
```
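The change to `_generate_ollama` above moves the system prompt out of a one-time Modelfile registration (`ollama.create`) and into every chat request, so no custom model needs to be created on the Ollama server. A small sketch of that message-building pattern; the helper name is hypothetical, not from the commit:

```python
def build_chat_messages(system_prompt: str, user_prompt: str) -> list[dict]:
    """Build the system+user message list that the patched _generate_ollama
    sends on every call, replacing the removed Modelfile/ollama.create setup."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

# With the real client this would be used roughly as follows (assumption:
# requires the ollama package and a running Ollama server):
#   from ollama import Client
#   client = Client(host="http://localhost:11434")
#   response = client.chat(model=model_name,
#                          messages=build_chat_messages(sys_p, usr_p))
#   text = response["message"]["content"]
```

Sending the system prompt per request keeps the server stateless with respect to route0x and avoids leaving stale custom models behind when the system prompt changes.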
