It begins by constructing a fine-grained knowledge graph from the source text, then identifies knowledge gaps in LLMs using the expected calibration error (ECE) metric, prioritizing the generation of QA pairs that target high-value, long-tail knowledge.
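To make the calibration-gap idea concrete, here is a minimal sketch of the standard binned ECE computation: predictions are grouped into confidence bins, and ECE is the weighted average gap between each bin's mean confidence and its accuracy. GraphGen's exact implementation may differ; the function name and binning scheme here are illustrative assumptions.

```python
import numpy as np

def expected_calibration_error(confidences, correct, n_bins=10):
    """Binned ECE: the weighted average |confidence - accuracy| gap.

    confidences: model confidence in [0, 1] for each answer.
    correct: 1 if the answer was right, else 0.
    """
    confidences = np.asarray(confidences, dtype=float)
    correct = np.asarray(correct, dtype=float)
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (confidences > lo) & (confidences <= hi)
        if not mask.any():
            continue
        avg_conf = confidences[mask].mean()   # mean confidence in this bin
        accuracy = correct[mask].mean()       # empirical accuracy in this bin
        ece += mask.mean() * abs(avg_conf - accuracy)  # weight by bin size
    return ece
```

A well-calibrated model (e.g. 50% confidence, 50% accuracy) yields an ECE near 0, while a model that is fully confident yet always wrong yields an ECE near 1; knowledge with a large gap is what gets prioritized for QA generation.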
Furthermore, GraphGen incorporates multi-hop neighborhood sampling to capture complex relational information and employs style-controlled generation to diversify the resulting QA data.
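The multi-hop neighborhood sampling step can be sketched as a hop-by-hop random expansion around a seed entity, collecting the subgraph edges that ground a multi-hop QA pair. This is an illustrative sketch on an adjacency-list graph, not GraphGen's actual sampler; `sample_multi_hop` and its parameters are assumed names.

```python
import random

def sample_multi_hop(graph, seed, hops=2, k=3):
    """Expand a seed node's neighborhood for `hops` rounds,
    sampling at most `k` unvisited neighbors per node, and
    return the list of traversed edges (the sampled subgraph)."""
    frontier, visited, edges = {seed}, {seed}, []
    for _ in range(hops):
        next_frontier = set()
        for node in frontier:
            neighbors = [n for n in graph.get(node, []) if n not in visited]
            for nbr in random.sample(neighbors, min(k, len(neighbors))):
                edges.append((node, nbr))
                visited.add(nbr)
                next_frontier.add(nbr)
        frontier = next_frontier
    return edges
```

The returned edge list captures relations up to `hops` steps away from the seed, which is what lets generated QA pairs exercise complex, multi-relation knowledge rather than single triples.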
After data generation, you can use [LLaMA-Factory](https://github.com/hiyouga/LLaMA-Factory) to fine-tune your model on the synthesized QA data.
</details>
## ⚙️ Support List
We support various LLM inference servers, API servers, inference clients, input file formats, data modalities, output data formats, and output data types:
| Inference Server | API Server | Inference Client | Input File Format | Data Modality | Output Data Format | Output Data Type |
| --- | --- | --- | --- | --- | --- | --- |
Experience GraphGen through [Web](https://g-app-center-120612-6433-jpdvmvp.openxlab.space) or [Backup Web Entrance](https://openxlab.org.cn/apps/detail/chenzihonga/GraphGen)
For any questions, please check the [FAQ](https://github.com/open-sciencelab/GraphGen).
Pick the desired format and run the matching script: