Commit 03fdea3

feat: add HIF format support for hypergraph export and import
1 parent 4f18493 commit 03fdea3

5 files changed (+451 −42 lines)

README.md

Lines changed: 6 additions & 0 deletions

```python
hg.save("my_hypergraph.hgdb")

hg2 = HypergraphDB(storage_file="my_hypergraph.hgdb")
print(hg2.all_v)  # Output: {1, 2, 4, 5, 6, 7, 8, 9, 10}
print(hg2.all_e)  # Output: {(4, 5, 7, 9), (9, 10), (1, 2, 7), (1, 2), (2, 6, 9), (1, 4, 6), (2, 5, 6)}

# Or save in HIF format
hg.save_as_hif("my_hypergraph.hif.json")

# Load the hypergraph from a HIF file
hg.load_from_hif("my_hypergraph.hif.json")
```
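The diff does not show what `save_as_hif` actually writes, only that it produces a HIF JSON file. The sketch below builds a minimal HIF-style document by hand (the `incidences` / `edge` / `node` field names follow the public HIF specification; treating them as Hypergraph-DB's exact output is an assumption) and recovers hyperedge membership from it using only the standard library:

```python
import json

# Minimal HIF-style document: the HIF spec expresses hyperedges as
# flat (edge, node) incidence pairs, so an edge of any arity is just
# a group of rows sharing the same "edge" id.
hif_doc = {
    "network-type": "undirected",
    "incidences": [
        {"edge": 0, "node": 1},
        {"edge": 0, "node": 2},
        {"edge": 1, "node": 1},
        {"edge": 1, "node": 2},
        {"edge": 1, "node": 7},
    ],
}

text = json.dumps(hif_doc, indent=2)

# Recover each hyperedge's member set from the incidence list
members: dict = {}
for inc in json.loads(text)["incidences"]:
    members.setdefault(inc["edge"], set()).add(inc["node"])

assert members == {0: {1, 2}, 1: {1, 2, 7}}
```

The incidence-pair encoding is what keeps the format tool-neutral: no nested edge tuples, just rows that any JSON consumer can group by edge id.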

#### **7. 🎨 Interactive Visualization**

docs/api/index.md

Lines changed: 29 additions & 5 deletions

### Persistence Operations

| Method                    | Description                                        |
| ------------------------- | -------------------------------------------------- |
| `save(filepath)`          | Save hypergraph to file                            |
| `load(filepath)`          | Load hypergraph from file                          |
| `to_hif()`                | Export to HIF (Hypergraph Interchange Format) JSON |
| `save_as_hif(filepath)`   | Save hypergraph as HIF format JSON file            |
| `from_hif(hif_data)`      | Load hypergraph from HIF format data               |
| `load_from_hif(filepath)` | Load hypergraph from HIF format JSON file          |
| `copy()`                  | Create a deep copy of the hypergraph               |

### HIF Format Import/Export

Hypergraph-DB supports HIF (Hypergraph Interchange Format) for standardized hypergraph data exchange.

#### Export to HIF Format

```python
# Export and save to a file
hg.to_hif("my_hypergraph.hif.json")

# Or use the save_as_hif method
hg.save_as_hif("my_hypergraph.hif.json")
```

#### Import from HIF Format

```python
hg.load_from_hif("my_hypergraph.hif.json")
```
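The companion table documents `from_hif(hif_data)` as taking in-memory HIF data, either already parsed or as a JSON string. A stdlib-only sketch of the dict-or-string handling such a loader needs (`normalize_hif` is a hypothetical helper for illustration, not Hypergraph-DB's actual code):

```python
import json
from typing import Union


def normalize_hif(hif_data: Union[str, dict]) -> dict:
    """Accept HIF data as a JSON string or an already-parsed dict."""
    if isinstance(hif_data, str):
        return json.loads(hif_data)
    return hif_data


doc = {"incidences": [{"edge": "e1", "node": "A"}, {"edge": "e1", "node": "B"}]}
# Both entry points end up with the same parsed structure
assert normalize_hif(json.dumps(doc)) == normalize_hif(doc) == doc
```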
## Error Handling

The API includes comprehensive error handling:
docs/api/index.zh.md

Lines changed: 61 additions & 37 deletions

### Basic Operations

| Method               | Description        | Example                                    |
| -------------------- | ------------------ | ------------------------------------------ |
| `add_v(id, data)`    | Add a vertex       | `hg.add_v("A", {"name": "Alice"})`         |
| `add_e(tuple, data)` | Add a hyperedge    | `hg.add_e(("A", "B"), {"type": "friend"})` |
| `remove_v(id)`       | Remove a vertex    | `hg.remove_v("A")`                         |
| `remove_e(tuple)`    | Remove a hyperedge | `hg.remove_e(("A", "B"))`                  |
| `v(id)`              | Get vertex data    | `data = hg.v("A")`                         |
| `e(tuple)`           | Get hyperedge data | `data = hg.e(("A", "B"))`                  |

### Query Operations

| Method              | Description                      | Example                                |
| ------------------- | -------------------------------- | -------------------------------------- |
| `has_v(id)`         | Check whether a vertex exists    | `hg.has_v("A")`                        |
| `has_e(tuple)`      | Check whether a hyperedge exists | `hg.has_e(("A", "B"))`                 |
| `degree_v(id)`      | Vertex degree                    | `deg = hg.degree_v("A")`               |
| `degree_e(tuple)`   | Hyperedge size                   | `size = hg.degree_e(("A", "B"))`       |
| `nbr_v(id)`         | Neighboring vertices of a vertex | `neighbors = hg.nbr_v("A")`            |
| `nbr_e_of_v(id)`    | Hyperedges incident to a vertex  | `edges = hg.nbr_e_of_v("A")`           |
| `nbr_v_of_e(tuple)` | Vertices of a hyperedge          | `vertices = hg.nbr_v_of_e(("A", "B"))` |

### Global Properties

| Property | Description          | Example               |
| -------- | -------------------- | --------------------- |
| `all_v`  | All vertices         | `vertices = hg.all_v` |
| `all_e`  | All hyperedges       | `edges = hg.all_e`    |
| `num_v`  | Number of vertices   | `count = hg.num_v`    |
| `num_e`  | Number of hyperedges | `count = hg.num_e`    |

### Persistence Operations

| Method                    | Description                 | Example                                                  |
| ------------------------- | --------------------------- | -------------------------------------------------------- |
| `save(path)`              | Save to file                | `hg.save("graph.hgdb")`                                  |
| `load(path)`              | Load from file              | `hg.load("graph.hgdb")`                                  |
| `to_hif(filepath=None)`   | Export to HIF format        | `hif_data = hg.to_hif()` / `hg.to_hif("graph.hif.json")` |
| `save_as_hif(filepath)`   | Save as a HIF-format file   | `hg.save_as_hif("graph.hif.json")`                       |
| `from_hif(hif_data)`      | Load from HIF-format data   | `hg.from_hif(hif_data)` / `hg.from_hif(json_string)`     |
| `load_from_hif(filepath)` | Load from a HIF-format file | `hg.load_from_hif("graph.hif.json")`                     |

### Visualization

| Method                     | Description         | Example              |
| -------------------------- | ------------------- | -------------------- |
| `draw(port, open_browser)` | Start visualization | `hg.draw(port=8080)` |

[Full visualization API →](visualization.zh.md)
### HIF Format Import/Export

Hypergraph-DB supports HIF (Hypergraph Interchange Format) for standardized hypergraph data exchange.

#### Export to HIF Format

```python
# Export and save to a file
hg.to_hif("my_hypergraph.hif.json")

# Or use the save_as_hif method
hg.save_as_hif("my_hypergraph.hif.json")
```

#### Import from HIF Format

```python
hg.load_from_hif("my_hypergraph.hif.json")
```
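The export path has to flatten each hyperedge tuple into HIF incidence pairs. A stdlib-only sketch of that transformation (`to_hif_sketch` is a hypothetical helper for illustration, not the library's actual implementation; the `nodes`/`edges`/`incidences` keys follow the public HIF spec):

```python
import json


def to_hif_sketch(vertices: dict, edges: dict) -> str:
    """Serialize {vertex_id: attrs} and {edge_tuple: attrs} maps
    as a HIF-style JSON document with a flat incidence list."""
    return json.dumps({
        "nodes": [{"node": v, "attrs": attrs} for v, attrs in vertices.items()],
        "edges": [{"edge": i, "attrs": attrs} for i, attrs in enumerate(edges.values())],
        "incidences": [
            # one row per (edge, member-vertex) pair
            {"edge": i, "node": v}
            for i, members in enumerate(edges)
            for v in members
        ],
    })


doc = json.loads(to_hif_sketch(
    {"A": {}, "B": {}, "C": {}},
    {("A", "B", "C"): {"type": "team"}},
))
assert len(doc["incidences"]) == 3  # one 3-ary hyperedge → 3 incidence rows
```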
## Error Handling

### Common Exceptions

```python
try:
    # Try to add a vertex
    hg.add_v("user1", {"name": "张三"})

    # Try to add a hyperedge (the vertices must already exist)
    hg.add_e(("user1", "user999"), {"type": "朋友"})

except AssertionError as e:
    print(f"Assertion error: {e}")

except KeyError as e:
    print(f"Key error: {e}")

except Exception as e:
    print(f"Other error: {e}")
```
```python
from typing import Set

from hyperdb import HypergraphDB


class AnalyticsHypergraphDB(HypergraphDB):
    """Hypergraph database extended with analytics helpers."""

    def clustering_coefficient(self, vertex_id: str) -> float:
        """Compute the clustering coefficient of a vertex."""
        neighbors = self.nbr_v(vertex_id)
        if len(neighbors) < 2:
            return 0.0

        # Count connections among the neighbors
        connections = 0
        total_possible = len(neighbors) * (len(neighbors) - 1) // 2

        for edge in self.all_e:
            edge_vertices = self.nbr_v_of_e(edge)
            if len(edge_vertices.intersection(neighbors)) >= 2:
                connections += 1

        return connections / total_possible if total_possible > 0 else 0.0

    def k_core_decomposition(self, k: int) -> Set[str]:
        """k-core decomposition: find all vertices with degree at least k."""
        return {v for v in self.all_v if self.degree_v(v) >= k}
```
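The coefficient above counts the hyperedges that contain at least two of the vertex's neighbors and normalizes by the number of neighbor pairs. The same computation over plain Python sets, independent of `hyperdb`, so it can be run standalone:

```python
def clustering_coefficient(neighbors: set, all_edges: list) -> float:
    """Hyperedges covering >= 2 of the given neighbors, divided by
    the number of neighbor pairs (0.0 if fewer than two neighbors)."""
    if len(neighbors) < 2:
        return 0.0
    total_possible = len(neighbors) * (len(neighbors) - 1) // 2
    connections = sum(1 for e in all_edges if len(set(e) & neighbors) >= 2)
    return connections / total_possible


# Neighbors {"B", "C"} form one pair, covered by the hyperedge ("A", "B", "C");
# ("C", "D") touches only one neighbor and does not count.
assert clustering_coefficient({"B", "C"}, [("A", "B", "C"), ("C", "D")]) == 1.0
```

Note this normalization can exceed 1.0 when several hyperedges cover the same neighbor pair, a known quirk of the simple counting scheme the class uses.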
