Closed

Changes from 250 commits (345 commits in total)
024372e
Merge branch 'deepmodeling:master' into master
felix5572 May 19, 2021
f06e4f9
fix bug in docs display
felix5572 May 19, 2021
71b0996
update docs
felix5572 May 19, 2021
ad80c18
Merge pull request #4 from felix5572/master
amcadmus May 19, 2021
efb36b7
modify gdi.py
felix5572 May 20, 2021
b7130de
update dpti gdi airflow
felix5572 May 21, 2021
fef222f
update airflow-workflow
felix5572 May 21, 2021
7c0a5d5
update dpti
felix5572 May 22, 2021
cfa7d71
update gdi workflow
felix5572 May 22, 2021
eb846f2
fix bug in gdi workflow
felix5572 May 22, 2021
b8420ae
fix unittest
felix5572 May 22, 2021
5a9263d
update gdi
felix5572 May 22, 2021
7ec9cf7
update gdi -airflow
felix5572 May 22, 2021
e057cc5
update gdi
felix5572 May 22, 2021
f7cf1df
update gdi
felix5572 May 22, 2021
45cbec7
update gdi.py
felix5572 May 22, 2021
9b31c27
update gdi workflow
felix5572 May 22, 2021
b5aa53b
update gdi.py
felix5572 May 22, 2021
1bbed62
update gdi.py for airflow
felix5572 May 22, 2021
5075455
will not print debug information
felix5572 May 22, 2021
066cc50
Merge remote-tracking branch 'origin/master'
felix5572 May 22, 2021
3e1aabc
reset changes of format
felix5572 May 22, 2021
ab7583d
clear more debug print
felix5572 May 22, 2021
9693f59
update gdi.py for dargs
felix5572 May 23, 2021
86e167d
update setup.py for dargs
felix5572 May 23, 2021
6384958
update gdi.py for dargs
felix5572 May 23, 2021
6fd1225
update airflow-gdi; use separate airflow-graph's node for MD tasks
felix5572 May 23, 2021
fa9384d
update for airflow gdi.py; use two seperate node to run MD simulations
felix5572 May 23, 2021
f2c2d92
change tasks to task_list; add airflow retries
felix5572 May 24, 2021
4ad0c62
fix bug of task_list
felix5572 May 25, 2021
ca64639
gdi.py airflow main will fail when sub dag fail
felix5572 May 25, 2021
afbd308
Merge pull request #5 from felix5572/master
amcadmus May 25, 2021
bc75ba3
Create LICENSE
njzjz May 28, 2021
78cfc8f
Merge pull request #6 from njzjz/patch-1
amcadmus May 29, 2021
9f44d6b
better print informaiton for hti
felix5572 Jun 25, 2021
02bd08b
support set pv and pv_err value manually for hti_ice.py
felix5572 Jun 25, 2021
e6e9f5d
support set pv and pv_err value manually for hti_water.py
felix5572 Jun 25, 2021
a72843b
clean dpti ti.py numberial integration not used now
felix5572 Jun 25, 2021
1882b85
fix module import
felix5572 Jun 25, 2021
846a89b
update field for hti_water.py
felix5572 Jun 25, 2021
42854a5
add funciton exec_args for function execution; convenient for unittest
felix5572 Jun 25, 2021
8d6aecc
Merge remote-tracking branch 'origin/master'
felix5572 Jun 25, 2021
65ee5e8
use with open
felix5572 Jun 25, 2021
e7b0671
add unittest for hti_ice.py and hti_water.py
felix5572 Jun 25, 2021
292f8a9
fix unittest
felix5572 Jun 25, 2021
8a2ff8c
update ti.py; change thermo result type from list to np.ndarray
felix5572 Jun 28, 2021
443e670
Merge remote-tracking branch 'origin/master'
felix5572 Jun 28, 2021
d34d16a
better ti.py output
felix5572 Jun 29, 2021
c055420
update gdi.py to support new dpdispatcher
felix5572 Jun 30, 2021
2718508
fix bug when reading files
felix5572 Jun 30, 2021
bee7c83
update ti.py dump json format
felix5572 Jun 30, 2021
1162ccb
update ti_water.py json dump
felix5572 Jun 30, 2021
2318c09
update field in hti.py compatible with old
felix5572 Jul 1, 2021
99ed3dc
update equi.py args check
felix5572 Jul 1, 2021
61bd829
add function create_dict_not_empty_key and get_first_matched_key_from…
felix5572 Jul 1, 2021
9e4b5a3
clean duplicated field in benchmark_equi/npt/jdata.json
felix5572 Jul 1, 2021
29d7b7a
update git ignore
felix5572 Jul 1, 2021
7467a85
update compatibility; support old fields
felix5572 Jul 1, 2021
fde752d
update setup.py fix install
felix5572 Jul 1, 2021
251ae3a
update to support fields in old json relating to water
felix5572 Jul 1, 2021
cc465b0
support old jsons relating to water
felix5572 Jul 1, 2021
55a2c29
support old json format relating to water
felix5572 Jul 1, 2021
221445d
update field is_water
felix5572 Jul 1, 2021
2a7b0e1
update variable name in ti.py
felix5572 Jul 1, 2021
edee593
compatible with old ti json file
felix5572 Jul 1, 2021
fd73f12
update field compatible with old ti json field
felix5572 Jul 1, 2021
ae3434c
add default value for key if_meam
felix5572 Jul 1, 2021
94fce85
Merge remote-tracking branch 'origin/master'
felix5572 Jul 1, 2021
7808bd8
update ti_water.py for better unittest
felix5572 Jul 1, 2021
3341a14
fix bug; will add element_num
felix5572 Jul 1, 2021
ab8ff64
update key compatibility
felix5572 Jul 1, 2021
8240239
fix package import
felix5572 Jul 1, 2021
71e19ae
add unittest for ti_water.py; support field in old jsons
felix5572 Jul 1, 2021
82be973
update unittest for equi water calculation
felix5572 Jul 1, 2021
a72d34f
use mock conf file for hti_ice.py
felix5572 Jul 2, 2021
b061de4
update dpdispatcher in workflow
felix5572 Jul 2, 2021
77d0898
Merge pull request #7 from felix5572/master
amcadmus Jul 2, 2021
b206e6c
add github action auto release to pypi
felix5572 Jul 7, 2021
6d41dd4
update setup.py
felix5572 Jul 7, 2021
8ec85ac
Merge pull request #8 from felix5572/master
amcadmus Jul 8, 2021
5648868
write info of hti_ice compute to result.json
mh-guo Jul 10, 2021
1073723
prevent resources leaking
mh-guo Jul 11, 2021
8ba66cb
Merge pull request #9 from mh-guo/master
amcadmus Jul 11, 2021
831b8b5
update mock file
felix5572 Jul 14, 2021
c2fa300
update mock file in unittest
felix5572 Jul 14, 2021
835d0f4
Merge branch 'deepmodeling:master' into master
felix5572 Jul 14, 2021
d8266ab
transfer "dump.equi" from remote to local
Feiyang472 Jul 14, 2021
59601cc
Merge pull request #10 from felix5572/master
amcadmus Jul 17, 2021
1ebd52b
Merge pull request #11 from Feiyang472/patch-1
amcadmus Jul 17, 2021
a066c98
Create DpFreeEnergyWater.py
mh-guo Jul 30, 2021
6e7a6b9
Merge pull request #12 from mh-guo/master
amcadmus Jul 31, 2021
7484f50
devel fix small bugs
felix5572 Oct 11, 2021
334c364
update conda CICD
felix5572 Oct 12, 2021
b3218e3
fix unittest
felix5572 Oct 12, 2021
5777439
fix conda CICD
felix5572 Oct 12, 2021
84a66c3
Fix wrong column index of enenthalpy.
shishaochen Oct 27, 2021
487d28d
Fix Json parsing error.
shishaochen Oct 27, 2021
f11764b
Fix `if_meam` missing in Json config.
shishaochen Oct 27, 2021
1b61936
Merge pull request #14 from shishaochen/fix
wanghan-iapcm Oct 28, 2021
9db9f0b
Merge branch 'deepmodeling:master' into master
felix5572 Oct 28, 2021
a2e50c0
Allow to use relative path of input LAMMPS data.
shishaochen Nov 4, 2021
7021421
Merge pull request #15 from shishaochen/fix
wanghan-iapcm Nov 4, 2021
8fb7b2f
Fix out.lmp not downloaded.
shishaochen Nov 8, 2021
ba5c1e7
Allow relative model path in TI stage.
shishaochen Nov 9, 2021
7522ac0
Dump absolute path of equi_conf & model to ti_settings.
shishaochen Nov 9, 2021
3c5c86b
Merge pull request #16 from shishaochen/fix
wanghan-iapcm Nov 10, 2021
f53a986
Merge branch 'deepmodeling:master' into master
felix5572 Feb 16, 2022
edff7de
update airflow workflow
felix5572 Feb 16, 2022
23623ee
Merge pull request #20 from felix5572/master
wanghan-iapcm Feb 16, 2022
d8d3189
update gdi examples
felix5572 Jun 9, 2022
7ae81d5
update fields in machine.json
felix5572 Jun 9, 2022
5cd091b
Merge pull request #24 from felix5572/master
amcadmus Jul 4, 2022
8265c72
Update old_equi.py (#28)
Chengqian-Zhang Jul 11, 2022
ea6452c
update pair_coeff (#30)
Chengqian-Zhang Jul 13, 2022
c362aab
fix codecov (#31)
njzjz Nov 16, 2022
0b7cdc3
Update ti.py | enhancing data precision (#32)
Vibsteamer Jul 5, 2023
7f61b60
add command line interface (#33)
Yi-FanLi Dec 12, 2023
43aa25d
add pre-commit configs (#35)
njzjz Dec 14, 2023
92af5d3
lib/water.py: correct making bond list of water (#36)
Yi-FanLi Dec 15, 2023
3443bc8
add document (#34)
Yi-FanLi Dec 16, 2023
2cb1053
raise warning for large stat_bsize value (#37)
Yi-FanLi Dec 20, 2023
2f0c6c4
Support dump_freq keyword (#38)
Yi-FanLi Dec 26, 2023
df15ddc
use HTI's result.json file in TI's CLI (#39)
Yi-FanLi Dec 26, 2023
224b191
fix hti's output of Helmholtz free energy for water
Yi-FanLi Dec 29, 2023
4d98a04
fix hti's output of Helmholtz free energy for water (#40)
Yi-FanLi Dec 29, 2023
590fcfc
Read HTI's target temperature and use it as To in TI (#42)
Yi-FanLi Jan 2, 2024
abf3fec
fix merge conflict
Yi-FanLi Jan 2, 2024
551ba03
Fix the bug for the Langevin thermostat in HTI (#43)
Yi-FanLi Jan 4, 2024
78fe73c
HTI: fix three-step lambda=1 error (#44)
Yi-FanLi Jan 5, 2024
20c9188
support using P as control variable in Gibbs free energy (#41)
Yi-FanLi Jan 9, 2024
2a30479
Merge branch 'deepmodeling:devel' into devel
Yi-FanLi Jan 9, 2024
f2388dd
Support running jobs through CLI (#45)
Yi-FanLi Jan 15, 2024
79811b9
Merge branch 'devel' of github.com:Yi-FanLi/dpti into devel
Yi-FanLi Jan 15, 2024
881ccec
fix gdi calculation y ndarray (#53)
felix5572 Feb 22, 2024
910e359
add more instructions, software usage and commands about dpti (#52)
felix5572 Feb 22, 2024
60502c6
Add examples for Sn high pressure and each calculation module (#51)
felix5572 Feb 22, 2024
db55d2c
update docker support; add docs and command instructions (#50)
felix5572 Feb 22, 2024
3ec59d5
Add mti (Mass Thermodynamic Integration) (#48)
Yi-FanLi Apr 3, 2024
4156c17
CI: Setup Dependabot for GitHub Actions (#56)
njzjz-bot Apr 3, 2024
691c9bc
CI: Bump codecov/codecov-action from v3 to v4 (#55)
njzjz-bot Apr 3, 2024
b66711b
Support custom command for the run_task functions (#61)
Yi-FanLi Apr 6, 2024
9d3a857
support template forcefiled for mti (#60)
Yi-FanLi Apr 8, 2024
9787d4e
CI: Mirror the repository to Gitee (#63)
njzjz-bot Apr 11, 2024
9138f70
change TI MD final written frame file to final.lmp (#64)
Liu-RX Apr 13, 2024
cc882fb
feat(build): Add Git archives version files (#65)
njzjz-bot Apr 13, 2024
e4f79c4
fix: remove ref-names from .git_archival.txt (#66)
njzjz-bot Jul 16, 2024
af26a76
docs: pin sphinx-argparse to < 0.5.0 (#67)
njzjz-bot Jul 16, 2024
a07e9bc
correct error estimation: divide natoms in stead of np.sqrt(natoms)
Yi-FanLi Jul 19, 2024
8db82a2
correct error estimation: divide natoms in stead of np.sqrt(natoms) (…
Yi-FanLi Jul 19, 2024
dbf3b46
Merge branch 'deepmodeling:devel' into devel
Yi-FanLi Jul 19, 2024
8c29f7d
Merge branch 'devel' of github.com:Yi-FanLi/dpti into devel
Yi-FanLi Jul 22, 2024
74c11c5
Bump actions/setup-python from 2 to 5 (#57)
dependabot[bot] Aug 4, 2024
c042df5
Bump actions/checkout from 2 to 4 (#58)
dependabot[bot] Aug 4, 2024
c81ae48
[pre-commit.ci] pre-commit autoupdate (#62)
pre-commit-ci[bot] Aug 4, 2024
7e892a6
Merge branch 'deepmodeling:devel' into devel
Yi-FanLi Aug 5, 2024
3c3a370
Improve error message when switching is not correctly set (#72)
Yi-FanLi Aug 5, 2024
e22ef9b
Support custom variables for the LAMMPS input files (#70)
Yi-FanLi Aug 6, 2024
f99465e
use P as control variable in Gibbs free energy for hti (#74)
Yi-FanLi Aug 6, 2024
14a981f
Ti change e0 default value to None (#75)
GEODONG Aug 6, 2024
621bb56
Merge branch 'deepmodeling:devel' into devel
Yi-FanLi Aug 6, 2024
c3ecb61
Improve the help message for the one-step switch HTI case of parser_r…
Yi-FanLi Aug 6, 2024
db59da6
use lammps thermo float to specify format for 'ke pe etotal enthalpy…
felix5572 Aug 9, 2024
1200e91
use lammps thermo float to specify format for 'ke pe etotal enthalpy…
felix5572 Aug 9, 2024
026ab27
first implementation for dpti on prefect framework
felix5572 Aug 9, 2024
b2ec5e4
first implementation for dpti on prefect framework
felix5572 Aug 9, 2024
4f35af1
add prefect dpdispatcher support
felix5572 Aug 9, 2024
3c9fbd9
add prefect dpdispatcher support
felix5572 Aug 9, 2024
d0076f9
install default example files
felix5572 Aug 9, 2024
ca540d1
install default example files
felix5572 Aug 9, 2024
e6a8cab
fix UT
felix5572 Aug 9, 2024
5d8bbc2
fix UT
felix5572 Aug 9, 2024
42ae24b
add default prefect usage instruction
felix5572 Aug 9, 2024
0b9a1d0
add default prefect usage instruction
felix5572 Aug 9, 2024
b568940
Merge branch 'devel' of https://mirror.ghproxy.com/https://github.com…
felix5572 Aug 9, 2024
d046593
Merge branch 'devel' of https://mirror.ghproxy.com/https://github.com…
felix5572 Aug 9, 2024
19c81d5
Merge branch 'deepmodeling:devel' into devel
Yi-FanLi Aug 14, 2024
3360eb3
ti: read the pressure at the starting point on a p path (#77)
Yi-FanLi Aug 14, 2024
dedc736
[pre-commit.ci] pre-commit autoupdate
pre-commit-ci[bot] Nov 4, 2024
5cf792a
docs: replace sphinx-rtd-theme with sphinx-book-theme (#79)
njzjz-bot Nov 10, 2024
67ebf8f
Merge branch 'deepmodeling:devel' into devel
Yi-FanLi Nov 10, 2024
1a2334d
pass element_num to sparam in hti_liq
Yi-FanLi Nov 10, 2024
8beb7cc
pass element_num to sparam in hti_liq (#81)
Yi-FanLi Nov 10, 2024
ebd0a88
Merge pull request #73 from deepmodeling/pre-commit-ci-update-config
Yi-FanLi Nov 11, 2024
68710aa
Bump codecov/codecov-action from 4 to 5 (#83)
dependabot[bot] Nov 20, 2024
67449f7
add an example json file for mti
Yi-FanLi Dec 30, 2024
c95a0dc
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Dec 30, 2024
04c22f7
Merge pull request #85 from Yi-FanLi/mti_example
Yi-FanLi Dec 30, 2024
bb98509
use temp instead of temp_list[ii] in _gen_lammps_input to avoid error…
Yi-FanLi Dec 30, 2024
2609384
add an example json file for mti with p path
Yi-FanLi Dec 30, 2024
565cf0c
Merge pull request #86 from Yi-FanLi/mti_p_path
Yi-FanLi Dec 30, 2024
76fe53a
Merge branch 'deepmodeling:devel' into devel
felix5572 Jan 22, 2025
3bc24bd
Merge branch 'deepmodeling:devel' into devel
felix5572 Jan 22, 2025
898c4fc
Whole TI calculation prefect workflow
felix5572 Jan 27, 2025
de6ae21
Whole TI calculation prefect workflow
felix5572 Jan 27, 2025
8530fce
Split workflow code into multiple files
felix5572 Jan 27, 2025
8f2cec4
Split workflow code into multiple files
felix5572 Jan 27, 2025
594b0a1
extract out workflow
felix5572 Jan 27, 2025
564017d
extract out workflow
felix5572 Jan 27, 2025
95b1403
add confest.py for pytest
felix5572 Jan 27, 2025
3747fd5
add confest.py for pytest
felix5572 Jan 27, 2025
623dc23
add UT for job_executor and file_handler
felix5572 Jan 27, 2025
ad50f5e
add UT for job_executor and file_handler
felix5572 Jan 27, 2025
dd021a9
fix dumped in.json for hti_liq (#87)
Yi-FanLi Mar 9, 2025
8087253
parse multiple epsilon values for lj/cut/soft (#88)
Yi-FanLi Mar 10, 2025
40932f8
rewrite dpti workflow management and implement for prefect framework
felix5572 Mar 15, 2025
f061a9b
rewrite dpti workflow management and implement for prefect framework
felix5572 Mar 15, 2025
aefca89
clean old implementation
felix5572 Mar 15, 2025
1d84c8b
clean old implementation
felix5572 Mar 15, 2025
63864e3
small updates and bug fix for workflow implementation convenience
felix5572 Mar 15, 2025
238fd16
small updates and bug fix for workflow implementation convenience
felix5572 Mar 15, 2025
e76e956
clean unused code and notes
felix5572 Mar 15, 2025
1b29855
clean unused code and notes
felix5572 Mar 15, 2025
e844bdc
Merge branch 'deepmodeling:devel' into devel
felix5572 Mar 15, 2025
c76b61f
Merge branch 'deepmodeling:devel' into devel
felix5572 Mar 15, 2025
4c440a6
Merge branch 'devel' of https://github.com/felix5572/dpti into devel
felix5572 Mar 15, 2025
a74205a
Merge branch 'devel' of https://github.com/felix5572/dpti into devel
felix5572 Mar 15, 2025
70a4990
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 15, 2025
347d0ac
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 15, 2025
c408440
add example Sn high pressure
felix5572 Mar 15, 2025
8a0e679
add example Sn high pressure
felix5572 Mar 15, 2025
d8ba6fa
equi: write average position in out.lmp when if_dump_avg_posi=True (#90)
Yi-FanLi Mar 15, 2025
a25c159
add water ice example
felix5572 Mar 15, 2025
bec938a
add water ice example
felix5572 Mar 15, 2025
f090051
result base class implementation
felix5572 Mar 15, 2025
e293467
result base class implementation
felix5572 Mar 15, 2025
e60f63a
init result analyze rebulid free energy surface and compare ti result
felix5572 Mar 15, 2025
f505d72
init result analyze rebulid free energy surface and compare ti result
felix5572 Mar 15, 2025
52f9070
for IDE cursor use
felix5572 Mar 15, 2025
0cee3bb
for IDE cursor use
felix5572 Mar 15, 2025
d1a9c5c
use python abs import
felix5572 Mar 15, 2025
ebf99d8
use python abs import
felix5572 Mar 15, 2025
398adcf
init pytest unittest
felix5572 Mar 15, 2025
ac7cb1d
init pytest unittest
felix5572 Mar 15, 2025
fb429ed
add pytest unittest cli note
felix5572 Mar 15, 2025
b831d34
add pytest unittest cli note
felix5572 Mar 15, 2025
b644ded
Merge branch 'deepmodeling:devel' into devel
felix5572 Mar 15, 2025
8e3a1c6
Merge branch 'deepmodeling:devel' into devel
felix5572 Mar 15, 2025
9ba1d35
resolve conflicts
felix5572 Mar 15, 2025
ade6c26
resolve conflicts
felix5572 Mar 15, 2025
1255a30
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 15, 2025
e572496
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 15, 2025
0dac766
update Sn and water example conf file
felix5572 Mar 15, 2025
34b86be
update Sn and water example conf file
felix5572 Mar 15, 2025
e694a19
Merge branch 'devel' of https://github.com/felix5572/dpti into devel
felix5572 Mar 26, 2025
1327d04
add ai tools for dpti
felix5572 Mar 26, 2025
a883b79
Merge branch 'devel' of https://github.com/felix5572/dpti into devel
felix5572 Mar 26, 2025
483544d
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Mar 26, 2025
6b85bd3
Merge branch 'devel' of https://github.com/felix5572/dpti into devel
felix5572 Mar 26, 2025
d374484
devel fix bugs
felix5572 May 23, 2025
c05c9fd
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] May 23, 2025
61 changes: 61 additions & 0 deletions .cursor/rules/dpti-default-ai-rules.mdc
@@ -0,0 +1,61 @@
---
description: General Rules
globs: *py,*json,*yaml,*ipynb,*sh,*md,*txt,*.lammps,*tex,*toml,*yml
alwaysApply: false
---

<!-- - You can @ files here
- You can use markdown but dont have to -->
[base.py](mdc:dpti/workflows/simulations/base.py) [equi.py](mdc:dpti/equi.py) [equi_sim.py](mdc:dpti/workflows/simulations/equi_sim.py) [ti.py](mdc:dpti/ti.py) [mti.py](mdc:dpti/mti.py) [ti_sim.py](mdc:dpti/workflows/simulations/ti_sim.py) [hti.py](mdc:dpti/hti.py) [hti_liq.py](mdc:dpti/hti_liq.py) [hti_ice.py](mdc:dpti/hti_ice.py) [hti_water.py](mdc:dpti/hti_water.py) [hti_sim.py](mdc:dpti/workflows/simulations/hti_sim.py) [einstein.py](mdc:dpti/einstein.py) [di.py](mdc:dpti/workflows/service/di.py) [file_handler.py](mdc:dpti/workflows/service/file_handler.py) [job_executor.py](mdc:dpti/workflows/service/job_executor.py) [ti_line_flow.py](mdc:dpti/workflows/flows/ti_line_flow.py)


# Role
You are a senior engineer specializing in scientific computing and molecular dynamics simulation, with more than 10 years of experience developing Python scientific computing applications, and you are familiar with LAMMPS, DeepMD-kit, Prefect, and related development tools and technology stacks. Your task is to help users design and develop molecular dynamics free energy calculation applications that are easy to use and easy to maintain. Always follow best practices in scientific computing and adhere to the principles of reproducibility and computational efficiency.
# Goal
Your goal is to help users design and develop molecular dynamics free energy calculation applications in a way that is easy for them to understand, ensuring that the results are accurate and reliable, the performance is high, and the applications are convenient to use.
# Requirements
When understanding user requirements, designing calculation workflows, writing code, solving problems, and iterating on the project, always follow these principles:
Project initialization
Read the project's README.md carefully to understand the basic theory, calculation workflow, and technical implementation of the free energy calculations
If there is no README.md, create one covering the theoretical background, input/output formats, and dependency environment
# Understanding requirements
Understand the user's specific computational needs, including system type, temperature and pressure conditions, and required accuracy
Choose a suitable thermodynamic integration method and avoid unnecessary computation
# Workflow design
Use Prefect to design reproducible, traceable calculation workflows
Ensure calculation tasks run correctly in different computing environments
# Coding
Technology choices:
Python scientific computing libraries (numpy, scipy) for data processing and analysis
LAMMPS for molecular dynamics simulation
The Prefect framework for workflow management
DeepMD-kit for machine-learning potential calculations
# Code structure:
Modular design, separating calculation, analysis, and visualization
Object-oriented programming to improve code reuse
Follow Python scientific computing coding conventions
# Computational safety:
Validate input parameters
Save intermediate results to support checkpoint restarts
Verify the physical plausibility of results
# Performance optimization:
Optimize the parallel computation strategy
Reduce unnecessary I/O
Use in-memory caching judiciously
# Testing and documentation:
Write unit tests to verify computational correctness
Provide detailed API documentation and usage examples
Document the theory and methods behind the calculations
# Problem solving
Fully understand the principles of molecular dynamics simulation and free energy calculation
Diagnose root causes from anomalies in calculation results
Ensure code changes do not affect the physical correctness of the calculations
# Iteration
Collect user feedback to improve the calculation workflow and interface design
Keep documentation up to date, including new features and optimization suggestions
Continuously improve computational performance and stability
# Methodology
1. Unix programming style and open-source design style; apply design patterns where appropriate for the project or module
2. Follow the "don't repeat yourself" principle
3. Give software architecture mature consideration
4. When answering questions, also share reasoning about design patterns and point users to related best practices (if any)
163 changes: 163 additions & 0 deletions dpti/aitools/codebase/collect_py_files_context.py
@@ -0,0 +1,163 @@
# %%

import argparse
import json
from pathlib import Path
from typing import Dict

🛠️ Refactor suggestion

Update deprecated type annotations.

The typing.Dict import is deprecated in favor of the built-in dict type (Python 3.9+).

-from typing import Dict
+from typing import Dict  # Remove this line entirely

Committable suggestion skipped: line range outside the PR's diff.

🧰 Tools
🪛 Ruff (0.11.9)

6-6: typing.Dict is deprecated, use dict instead

(UP035)

🤖 Prompt for AI Agents
In dpti/aitools/codebase/collect_py_files_context.py at line 6, replace the
import of the deprecated typing.Dict with the built-in dict type. Remove the
line importing Dict from typing and update any type annotations using Dict to
use the built-in dict instead.
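As background for the suggestion above: since Python 3.9 (PEP 585) the built-in `dict` is subscriptable, so no `typing` import is needed. A minimal sketch of the modernized signature — the body here is condensed for illustration, not the full function from this PR:

```python
from pathlib import Path


def collect_python_files(directory: str) -> dict[str, str]:
    """Map each .py file under `directory` to its text content.

    Uses the built-in generic `dict[str, str]` (PEP 585) instead of
    the deprecated `typing.Dict[str, str]`.
    """
    return {
        str(p): p.read_text(encoding="utf-8")
        for p in Path(directory).rglob("*.py")
        if "__pycache__" not in str(p)
    }
```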



def collect_python_files(directory: str) -> Dict[str, str]:

🛠️ Refactor suggestion

Replace deprecated Dict with built-in dict type.

Static analysis correctly identifies that typing.Dict is deprecated. Update all type annotations to use the built-in dict type.

-def collect_python_files(directory: str) -> Dict[str, str]:
+def collect_python_files(directory: str) -> dict[str, str]:

-def group_by_directory(
-    files: Dict[str, str], max_files_per_group: int = 20
-) -> Dict[str, Dict[str, str]]:
+def group_by_directory(
+    files: dict[str, str], max_files_per_group: int = 20
+) -> dict[str, dict[str, str]]:

-def save_grouped_json(groups: Dict[str, Dict[str, str]], output_base: str) -> None:
+def save_grouped_json(groups: dict[str, dict[str, str]], output_base: str) -> None:

Also applies to: 46-47, 96-96

🧰 Tools
🪛 Ruff (0.11.9)

9-9: Use dict instead of Dict for type annotation

Replace with dict

(UP006)

🤖 Prompt for AI Agents
In dpti/aitools/codebase/collect_py_files_context.py at lines 9, 46-47, and 96,
replace all occurrences of the deprecated typing.Dict type annotation with the
built-in dict type. Update the function signatures and any variable annotations
accordingly to use dict instead of Dict for better compatibility and to adhere
to current Python typing standards.

"""
Collect paths and contents of all Python files in the specified directory.

Args:
directory: Directory path to scan

Returns
-------
Dictionary containing file paths and their contents
"""
python_files = {}
base_dir = Path(directory).name

# Use Path object for directory traversal
for file_path in Path(directory).rglob("*.py"):
# Skip __pycache__ directory
if "__pycache__" in str(file_path):
continue

try:
# Read file content
with open(file_path, encoding="utf-8") as f:
content = f.read()

# Convert to relative path and include base directory
relative_path = str(file_path.relative_to(directory))
full_path = f"{base_dir}/{relative_path}"
python_files[full_path] = content

except Exception as e:
print(f"Error reading file {file_path}: {e!s}")

return python_files


def group_by_directory(
files: Dict[str, str], max_files_per_group: int = 20
) -> Dict[str, Dict[str, str]]:
"""
Group files by their directory structure.

Args:
files: Dictionary of file paths and contents
max_files_per_group: Maximum number of files per group

Returns
-------
Dictionary of grouped files
"""
# First group by directory
dir_groups = {}
for file_path, content in files.items():
dir_name = str(Path(file_path).parent)
if dir_name not in dir_groups:
dir_groups[dir_name] = {}
dir_groups[dir_name][file_path] = content

# Merge small groups and split large groups
final_groups = {}
current_group = {}
current_group_size = 0
group_index = 1

for dir_name, dir_files in dir_groups.items():
# If adding this directory's files would exceed the limit
if (
current_group_size + len(dir_files) > max_files_per_group
and current_group_size > 0
):
# Save current group and start a new one
final_groups[f"group_{group_index}"] = current_group
current_group = {}
current_group_size = 0
group_index += 1

# Add files to current group
current_group.update(dir_files)
current_group_size += len(dir_files)

# Don't forget to save the last group
if current_group:
final_groups[f"group_{group_index}"] = current_group

return final_groups
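The greedy packing implemented above can be restated as a short self-contained sketch (same function name, body trimmed) and exercised on a toy input; the file names below are hypothetical:

```python
from pathlib import Path


def group_by_directory(files: dict[str, str], max_files_per_group: int = 20):
    # Bucket files by parent directory, preserving insertion order.
    dir_groups: dict[str, dict[str, str]] = {}
    for path, content in files.items():
        dir_groups.setdefault(str(Path(path).parent), {})[path] = content

    # Greedily pack whole directories into groups of at most
    # max_files_per_group files; an oversized directory still lands
    # in a single group, matching the original logic.
    final, current, size, idx = {}, {}, 0, 1
    for dir_files in dir_groups.values():
        if size + len(dir_files) > max_files_per_group and size > 0:
            final[f"group_{idx}"] = current
            current, size, idx = {}, 0, idx + 1
        current.update(dir_files)
        size += len(dir_files)
    if current:
        final[f"group_{idx}"] = current
    return final


files = {
    "pkg/a/x.py": "", "pkg/a/y.py": "",
    "pkg/b/u.py": "", "pkg/b/v.py": "",
    "pkg/c/w.py": "",
}
groups = group_by_directory(files, max_files_per_group=3)
# "pkg/a" fills group_1 (2 files); adding "pkg/b" would exceed 3,
# so it opens group_2, where "pkg/c" still fits (2 + 1 = 3 files).
```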


def save_grouped_json(groups: Dict[str, Dict[str, str]], output_base: str) -> None:
"""
Save each group to a separate JSON file.

Args:
groups: Grouped files dictionary
output_base: Base name for output files
"""
output_base = Path(output_base)
base_name = output_base.stem
parent_dir = output_base.parent

Comment on lines +104 to +107

🛠️ Refactor suggestion

Add validation for output directory writability.

The function doesn't verify that the output directory exists or is writable before attempting to save files.

     output_base = Path(output_base)
     base_name = output_base.stem
     parent_dir = output_base.parent
+    
+    # Ensure output directory exists and is writable
+    try:
+        parent_dir.mkdir(parents=True, exist_ok=True)
+    except Exception as e:
+        print(f"Error creating output directory {parent_dir}: {e}")
+        return
🤖 Prompt for AI Agents
In dpti/aitools/codebase/collect_py_files_context.py around lines 104 to 107,
add validation to check if the output directory exists and is writable before
proceeding. Use appropriate methods to verify the directory's existence and
permissions, and handle cases where the directory is missing or not writable by
raising an error or creating the directory as needed.

for group_name, group_files in groups.items():
output_file = parent_dir / f"{base_name}_{group_name}.json"
wrapper = {
"project_files": {
"description": f"Python source files collection - {group_name}",
"base_directory": Path().absolute().name,
"files": group_files,
}
}

with open(output_file, "w", encoding="utf-8") as f:
json.dump(wrapper, f, ensure_ascii=False, indent=2)
print(f"Saved {len(group_files)} files to {output_file}")


def main():
parser = argparse.ArgumentParser(
description="Collect Python files content into JSON for context"
)
parser.add_argument(
"-d",
"--directory",
default="../../../dpti/",
help="Directory to scan (default: current directory)",
)
Comment on lines +127 to +132

🛠️ Refactor suggestion

Avoid hardcoded relative paths in default arguments

The default directory path "../../../dpti/" is brittle and may break depending on where the script is executed from. Consider using a more robust approach.

 parser.add_argument(
     "-d",
     "--directory",
-    default="../../../dpti/",
+    default=".",
     help="Directory to scan (default: current directory)",
 )

parser.add_argument(
"-o",
"--output",
default="build/python_files_context.json",
help="Output JSON file base name (default: python_files_context.json)",
)
parser.add_argument(
"-n",
"--num-files",
type=int,
default=20,
help="Maximum number of files per group (default: 20)",
)

args = parser.parse_args()

# Collect Python files
python_files = collect_python_files(args.directory)

# Group files
groups = group_by_directory(python_files, args.num_files)

# Save grouped files
save_grouped_json(groups, args.output)

print(f"\nTotal files collected: {len(python_files)}")
print(f"Split into {len(groups)} groups")
Comment on lines +148 to +159

🛠️ Refactor suggestion

Add validation and error handling for input directory

The code doesn't check if the specified directory exists before attempting to scan it. Consider adding directory validation.

 args = parser.parse_args()

+# Validate input directory
+input_dir = Path(args.directory)
+if not input_dir.exists() or not input_dir.is_dir():
+    print(f"Error: Directory '{args.directory}' does not exist or is not a directory")
+    exit(1)
+
 # Collect Python files
 python_files = collect_python_files(args.directory)

+# Check if any files were found
+if not python_files:
+    print(f"Warning: No Python files found in {args.directory}")
+
 # Group files
 groups = group_by_directory(python_files, args.num_files)

 # Save grouped files
 save_grouped_json(groups, args.output)


if __name__ == "__main__":
main()
5 changes: 5 additions & 0 deletions dpti/hti_liq.py
@@ -503,6 +503,9 @@ def compute_task(
if "copies" in jdata:
natoms *= np.prod(jdata["copies"])
fe, fe_err, thermo_info = post_tasks(job, natoms)
print(
f"hti_liq.compute_task: ideal gas fe = {fe:.6f}eV, err = {fe_err[0]:.6f}eV, {fe_err[1]:.6f}eV per atom"
)
_print_thermo_info(thermo_info)

info = thermo_info.copy()
Expand Down Expand Up @@ -555,6 +558,8 @@ def compute_task(
info["pv_err"] = pv_err
# info['de'] = de
# info['de_err'] = de_err
info["e0"] = fe
info["e0_err"] = fe_err
info["e1"] = e1
info["e1_err"] = e1_err
with open(os.path.join(job, "result.json"), "w") as result:
7 changes: 4 additions & 3 deletions dpti/ti.py
@@ -108,10 +108,10 @@ def _gen_lammps_input(
ret += "compute allmsd all msd\n"
if ens == "nvt":
ret += "thermo_style custom step ke pe etotal enthalpy temp press vol c_allmsd[*]\n"
ret += "thermo_modify format 4*8 %20.6f\n"
ret += "thermo_modify format float %20.6f\n"
elif "npt" in ens:
ret += "thermo_style custom step ke pe etotal enthalpy temp press vol c_allmsd[*]\n"
ret += "thermo_modify format 4*8 %20.6f\n"
ret += "thermo_modify format float %20.6f\n"
else:
raise RuntimeError(f"unknow ensemble {ens}\n")
ret += "dump 1 all custom ${DUMP_FREQ} traj.dump id type x y z\n"
@@ -669,7 +669,7 @@ def post_tasks(
"all_fe": all_fe.tolist(),
"all_fe_stat_err": all_fe_err.tolist(),
"all_fe_inte_err": all_fe_sys_err.tolist(),
"all_fe_tot_err": np.linalg.norm([all_fe_err[ii], all_fe_sys_err[ii]]).tolist(),
# "all_fe_tot_err": np.linalg.norm([all_fe_err[ii], all_fe_sys_err[ii]]).tolist(),
"all_fe_tot_err": np.linalg.norm([all_fe_err, all_fe_sys_err]).tolist(),
}

# data = [all_temps.tolist(), all_press.tolist(),
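The changed "all_fe_tot_err" line combines the statistical error and the integration error in quadrature via np.linalg.norm; note that the new form collapses the whole arrays into a single scalar norm, whereas the commented-out line computed one total per index ii. For two scalar error sources the combination reduces to the Euclidean norm, sketched here with only the standard library:

```python
import math

def total_error(stat_err, inte_err):
    # Quadrature sum of two independent error sources;
    # for scalars this equals np.linalg.norm([stat_err, inte_err]).
    return math.hypot(stat_err, inte_err)

print(total_error(3.0, 4.0))  # 5.0
```

Quadrature addition is the standard way to combine independent uncertainties, so whether the per-index or whole-array norm is intended is worth confirming with the PR author.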
3 changes: 2 additions & 1 deletion dpti/ti_water.py
@@ -160,7 +160,7 @@ def handle_compute(args):
elif path == "p":
args.To = jdata_hti_in["pres"]
if args.inte_method == "inte":
ti.post_tasks(
ti_info = ti.post_tasks(
job,
jdata,
args.Eo,
@@ -170,6 +170,7 @@
scheme=args.scheme,
shift=args.shift,
)
return ti_info
elif args.inte_method == "mbar":
ti.post_tasks_mbar(job, jdata, args.Eo, natoms=nmols)
else:
18 changes: 18 additions & 0 deletions dpti/workflows/__init__.py
@@ -0,0 +1,18 @@
pass
# import importlib

# def _import_module(module_name):
# try:
# return importlib.import_module(f'dpti.workflows.{module_name}')
# except ImportError:
# return importlib.import_module(f'.{module_name}', package='dpti.workflows')

# MODULES_TO_IMPORT = [
# 'job_executor',
# # 'another_module',
# # 'yet_another_module',
# ]

# # dynamically import sub modules
# for module_name in MODULES_TO_IMPORT:
# globals()[module_name] = _import_module(module_name)
Comment on lines +1 to +18
⚠️ Potential issue

Commented-out module import system needs cleanup

The file contains a commented-out dynamic module import system with only a pass statement active.

Either:

  1. Uncomment and implement the dynamic module import system if it's intended to be used
  2. Remove the commented code if it's not needed

The current state creates confusion about whether this functionality is intended to be used or not. If this is a work in progress, add a TODO comment explaining the plan.

-pass
-# import importlib
+import importlib
 
-# def _import_module(module_name):
-#     try:
-#         return importlib.import_module(f'dpti.workflows.{module_name}')
-#     except ImportError:
-#         return importlib.import_module(f'.{module_name}', package='dpti.workflows')
+def _import_module(module_name):
+    try:
+        return importlib.import_module(f'dpti.workflows.{module_name}')
+    except ImportError:
+        return importlib.import_module(f'.{module_name}', package='dpti.workflows')
 
-# MODULES_TO_IMPORT = [
-#     'job_executor',
-#     # 'another_module',
-#     # 'yet_another_module',
-# ]
+MODULES_TO_IMPORT = [
+    'job_executor',
+    # Add other modules as they become available
+]
 
-# # dynamically import sub modules
-# for module_name in MODULES_TO_IMPORT:
-#     globals()[module_name] = _import_module(module_name)
+# dynamically import sub modules
+for module_name in MODULES_TO_IMPORT:
+    globals()[module_name] = _import_module(module_name)
📝 Committable suggestion


Suggested change

Before:

pass
# import importlib

# def _import_module(module_name):
#     try:
#         return importlib.import_module(f'dpti.workflows.{module_name}')
#     except ImportError:
#         return importlib.import_module(f'.{module_name}', package='dpti.workflows')

# MODULES_TO_IMPORT = [
#     'job_executor',
#     # 'another_module',
#     # 'yet_another_module',
# ]

# # dynamically import sub modules
# for module_name in MODULES_TO_IMPORT:
#     globals()[module_name] = _import_module(module_name)

After:

import importlib

def _import_module(module_name):
    try:
        return importlib.import_module(f'dpti.workflows.{module_name}')
    except ImportError:
        return importlib.import_module(f'.{module_name}', package='dpti.workflows')

MODULES_TO_IMPORT = [
    'job_executor',
    # Add other modules as they become available
]

# dynamically import sub modules
for module_name in MODULES_TO_IMPORT:
    globals()[module_name] = _import_module(module_name)
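The absolute-then-relative import fallback proposed for dpti/workflows/__init__.py can be exercised against any installed package. The helper below is a standalone sketch of the same pattern; load_submodule and the json.decoder target are illustrative choices, not part of the PR:

```python
import importlib

def load_submodule(package, name):
    # Try the absolute module path first, then fall back to a
    # relative import within the package.
    try:
        return importlib.import_module(f"{package}.{name}")
    except ImportError:
        return importlib.import_module(f".{name}", package=package)

mod = load_submodule("json", "decoder")
print(mod.__name__)  # json.decoder
```

Binding the result into globals(), as the suggestion does, is what makes `dpti.workflows.job_executor` importable as an attribute of the package; an explicit `from . import job_executor` would achieve the same effect with less indirection.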
