Modern LLMs are already strong at **producing SQL queries without finetuning**.
We therefore recommend that most users:
1. **Run inference** directly on the full benchmark:
   - Use [`llmsql.inference_transformers`](./llmsql/inference/inference_transformers.py) to generate SQL predictions with your model via `transformers`. For vLLM-based inference, use [`llmsql.inference_vllm`](./llmsql/inference/inference_vllm.py). Both accept either a Hugging Face model id, e.g. `Qwen/Qwen2.5-1.5B-Instruct`, or a model instance passed directly, e.g. `inference_transformers(model_or_model_name_or_path=model, ...)`.
   - Evaluate results against the benchmark with the [`llmsql.LLMSQLEvaluator`](./llmsql/evaluation/evaluator.py) class.
2. **Optional finetuning**:
   - For research or domain adaptation, we provide a finetuning script for Hugging Face models. Run `llmsql finetune --help` or see the [Finetune Readme](./llmsql/finetune/README.md) for details.
> [!TIP]
> You can find additional manuals in the README files of each folder ([Inference Readme](./llmsql/inference/README.md), [Evaluation Readme](./llmsql/evaluation/README.md), [Finetune Readme](./llmsql/finetune/README.md)).
> [!TIP]
> vLLM-based inference requires the `vllm` optional dependency group: `pip install llmsql[vllm]`
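For intuition, benchmark evaluation of SQL predictions typically boils down to executing the predicted and reference queries and comparing their result sets. The sketch below is **not** the `LLMSQLEvaluator` implementation — the table, queries, and helper name are invented for illustration — but it shows the execution-match idea in a self-contained way:

```python
import sqlite3

def execution_match(db: sqlite3.Connection, predicted_sql: str, gold_sql: str) -> bool:
    """Return True if both queries produce the same result set, ignoring row order."""
    try:
        pred = db.execute(predicted_sql).fetchall()
    except sqlite3.Error:
        return False  # a query that fails to execute counts as a miss
    gold = db.execute(gold_sql).fetchall()
    return sorted(map(repr, pred)) == sorted(map(repr, gold))

# Toy in-memory database for illustration only
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER, name TEXT, age INTEGER)")
db.executemany("INSERT INTO users VALUES (?, ?, ?)",
               [(1, "Ann", 30), (2, "Bob", 25)])

# Two different queries that happen to return the same rows
print(execution_match(db,
                      "SELECT name FROM users WHERE age > 26",
                      "SELECT name FROM users WHERE age >= 30"))  # True
```

Comparing sorted result sets (rather than raw SQL strings) is what lets semantically equivalent but syntactically different predictions count as correct.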
---
## Repository Structure
```bash
pip3 install llmsql
```
### 1. Run Inference
```python
from llmsql import inference_transformers

# Generate SQL predictions with your model
results = inference_transformers(
    model_or_model_name_or_path="Qwen/Qwen2.5-1.5B-Instruct",  # or any Hugging Face causal LM
)
```