---
title: "DeePMD-kit v3 Official Release: Multi-Backend Support, DPA-2 Large Model, and Plugin Mechanism"
date: 2024-11-23
categories:
- DeePMD-kit
mathjax: true
---

## 1. Multi-backend framework: Powered by TensorFlow, PyTorch, and JAX

<center><img src="https://dp-public.oss-cn-beijing.aliyuncs.com/community/dpv3/dp1.png" width="100%" /></center>

DeePMD-kit v3 implements a flexible and pluggable backend framework, providing a consistent training and inference experience across multiple backends. Version 3.0.0 includes the following backends:

- **TensorFlow Backend**: Static graph for efficient computation.
- **PyTorch Backend**: Dynamic graph, simplifying model extension and development.
- **DP Backend**: Reference backend implemented with NumPy and the Array API.
- **JAX Backend**: Static graph + JIT, based on the DP backend and the Array API.

| Function | TensorFlow | PyTorch | JAX | DP |
|------------------------------|------------|---------|------|------|
| Local frame descriptor | | | | |
| se_e2_a descriptor | | | | |
| se_e2_r descriptor | | | | |
| se_e3 descriptor | | | | |
| se_e3_tebd descriptor | | | | |
| DPA-1 descriptor | | | | |
| DPA-2 descriptor | | | | |
| Hybrid descriptor | | | | |
| Fit energy | | | | |
| Fit dipole | | | | |
| Fit polar | | | | |
| Fit DOS | | | | |
| Fit properties | | | | |
| ZBL | | | | |
| DPLR | | | | |
| DPRc | | | | |
| Spin | | | | |
| Ladder calculation | | | | |
| Model training | | | | |
| Model compression | | | | |
| Python inference | | | | |
| C++ inference | | | | |

The main features of the multi-backend framework include:
- Models can be trained using the same training data and input scripts across different backends, allowing users to switch backends based on efficiency or convenience requirements.

```bash
# Train, freeze, and compress a model using the TensorFlow backend
dp --tf train input.json
dp --tf freeze
dp --tf compress

# Train, freeze, and compress a model using the PyTorch backend
dp --pt train input.json
dp --pt freeze
dp --pt compress
```

- Use `dp convert-backend` to convert models between different backends. It supports backend-specific file extensions (e.g., TensorFlow uses `.pb`, PyTorch uses `.pth`).

```bash
# Convert a TensorFlow model to a PyTorch model
dp convert-backend frozen_model.pb frozen_model.pth

# Convert a PyTorch model to a TensorFlow model
dp convert-backend frozen_model.pth frozen_model.pb

# Convert a PyTorch model to a JAX model
dp convert-backend frozen_model.pth frozen_model.savedmodel

# Convert a PyTorch model to a backend-independent DP format
dp convert-backend frozen_model.pth frozen_model.dp
```

- Inference with different backends can be performed using the `dp test` interface, Python/C++/C interfaces, or third-party packages such as `dpdata`, ASE, LAMMPS, AMBER, Gromacs, i-PI, CP2K, OpenMM, ABACUS, and others.

```
# In a LAMMPS input file:

# Run LAMMPS using a TensorFlow backend model
pair_style deepmd frozen_model.pb

# Run LAMMPS using a PyTorch backend model
pair_style deepmd frozen_model.pth

# Run LAMMPS using a JAX backend model
pair_style deepmd frozen_model.savedmodel

# Calculate the model deviation among two or more models;
# the deviation is written to md.out every 100 steps
pair_style deepmd frozen_model.pb frozen_model.pth frozen_model.savedmodel out_file md.out out_freq 100
```
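
The same models can also be evaluated from the command line with `dp test`; a minimal sketch (the model file names and the test data path below are placeholders):

```bash
# Evaluate frozen models from different backends against the same test data
dp --tf test -m frozen_model.pb -s ./test_data
dp --pt test -m frozen_model.pth -s ./test_data
```
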
- Adding new backends to DeePMD-kit has become faster and more streamlined.

DP-GEN has also released a new version, **v0.13.0**, which supports DeePMD-kit's multi-backend functionality through the `train_backend` parameter (which can be set to `tensorflow` or `pytorch`).

## 2. DPA-2 Model: A General Large-Atom Model for Molecular and Material Simulations

The DPA-2 model provides a robust architecture for large atomic models ([link](https://www.aissquare.com/openlam)), enabling highly accurate representations of diverse chemical systems for high-quality simulations. In version 3.0.0, DPA-2 supports single-task and multi-task training on the PyTorch backend and inference on the JAX backend.

The DPA-2 descriptor consists of two modules, **repinit** and **repformer**, as shown in the figure below.

<center><img src="https://dp-public.oss-cn-beijing.aliyuncs.com/community/dpv3/dp2.png" width="100%" /></center>

The PyTorch backend supports the training strategies required for large atomic models, including:

- **Parallel Training**: Train large atomic models across multiple GPUs to improve efficiency.

```bash
# Launch data-parallel training across 4 GPUs with torchrun
torchrun --nproc_per_node=4 --no-python dp --pt train input.json
```

- **Multi-task Training**: Share a descriptor across diverse datasets computed with different DFT methods to train large atomic models.
- **Fine-tuning**: Train pre-trained large atomic models on smaller, task-specific datasets. The PyTorch backend supports the `--finetune` argument of the `dp --pt train` command line, as sketched after this list.

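A minimal sketch of the fine-tuning step (the pre-trained checkpoint name below is a placeholder):

```bash
# Fine-tune a pre-trained model on a smaller, task-specific dataset;
# pretrained_model.pt stands in for the pre-trained checkpoint
dp --pt train input.json --finetune pretrained_model.pt
```
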
## 3. Plugin Mechanism: Connecting DeePMD-kit with External Models

Version 3.0.0 significantly enhances the plugin mechanism, enabling potential energy models built with TensorFlow, PyTorch, or JAX to be developed or integrated as plugins, so that external models can reuse DeePMD-kit's training modules, loss functions, and the various interfaces described above.

Here is an example of a plugin package: [deepmd-gnn](https://github.com/njzjz/deepmd-gnn). The **deepmd-gnn** plugin supports training MACE and NequIP models within DeePMD-kit using the familiar `dp` commands.

```bash
# After installing deepmd-gnn, a MACE model can be trained, frozen,
# and tested with the usual DeePMD-kit workflow
dp --pt train mace.json
dp --pt freeze
dp --pt test -m frozen_model.pth -s ../data/
```

<center><img src="https://dp-public.oss-cn-beijing.aliyuncs.com/community/dpv3/dp3.png" width="100%" /></center>

## 4. More features

**New Descriptor `se_e3_tebd`**:
- A three-body embedding descriptor with type embedding.

**Property Fitting**:
- Fit arbitrary properties of a system.

**New Training Parameters**:
- **max_ckpt_keep**: Specifies the maximum number of checkpoints to keep.
- **change_bias_after_training**: Allows adjustment of biases after training.
- **stat_file**: Provides a file for recording statistical information during training.

**New Command-Line Interfaces**:
- **`dp change-bias`**: Modify model biases after training (see the sketch below).
- **`dp show`**: Display model details or parameters.

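A rough sketch of how these commands might be invoked; the exact arguments shown here are assumptions, and `dp show -h` / `dp change-bias -h` list the options actually supported:

```bash
# Sketch only: print selected attributes of a trained PyTorch-backend model
dp --pt show model.ckpt.pt type-map
# Sketch only: recompute the output bias of a trained model against a new data set
dp --pt change-bias model.ckpt.pt -s ./new_data
```
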
**Enhanced JSON Input File Support in VSCode**:
- View parameter documentation directly in JSON input files.
- Check parameters for correctness within VSCode.

**Support for Latest LAMMPS Version**:
- Compatible with **stable_29Aug2024_update1**.

## 5. Contributors

The PyTorch backend of DeePMD-kit was initially developed in the **deepmd-pytorch** project and was later fully migrated into the **deepmd-kit** project.

Contributors to the **deepmd-pytorch** project:
Chenqqian Zhang, Chun Cai, Duo Zhang, Guolin Ke, Han Wang, Hangrui Bi, Jinzhe Zeng, Junhan Chang, Xiangyu Zhang, Shaochen Shi, Yifan Li, Yiming Du, Zhaohan Ding, Xuejian Qin, Xinzijian Liu

Contributors to the **deepmd-kit** project (after branching out for v3 development):
Anyang Peng, Chenqqian Zhang, Chenxing Luo, Chun Cai, Duo Zhang, Han Wang, Jia-Xin Zhu, Jinzhe Zeng, Pinghui Mo, Ruosong Shi, Sensen He, Sigbjørn Løland Bore, Xiangyu Zhang, Yan Wang, Yifan Li, Yiming Du, Yong-Bin Zhuang, Yunpei Liu, Zhangmancang Xu, Zhe Deng, Zhengtao Huang, Zhenyu Wang

We would also like to thank everyone who participated in testing and bug reporting over the past eight months.

Version Release Notes & Offline Package Download:
[https://github.com/deepmodeling/deepmd-kit/releases/tag/v3.0.0](https://github.com/deepmodeling/deepmd-kit/releases/tag/v3.0.0)

Documentation:
[https://docs.deepmodeling.com/projects/deepmd/en/v3.0.0/](https://docs.deepmodeling.com/projects/deepmd/en/v3.0.0/)

## 6. Join the Team of **Jinzhe Zeng**!

The author of this article and one of the main contributors to DeePMD-kit, **Jinzhe Zeng**, earned a bachelor's degree in Chemistry from East China Normal University in July 2019 and is expected to receive his Ph.D. in Chemistry from Rutgers University in January 2025. He will then join the **School of Artificial Intelligence and Data Science** at the University of Science and Technology of China (based in Suzhou) as a tenure-track associate professor and establish his own research group.

The group welcomes outstanding undergraduate students to apply for graduate positions; one of its research directions will be the continued development of DeePMD-kit.

**Contact**: [[email protected]](mailto:[email protected])
